The IT industry has recently seen some interesting activity from the global hyperscale cloud providers around their cloud sovereignty ambitions, alongside growing regulatory scrutiny of basic compliance requirements such as the European Union’s (EU) General Data Protection Regulation (GDPR).

Firstly, AWS made a public pledge called the “AWS Digital Sovereignty Pledge”, consisting of a commitment to provide “the most advanced set of sovereignty controls and features available in the cloud”. After Google’s cooperation with T-Systems and the “Delos” offer from Microsoft, SAP, and Arvato, AWS now follows suit. These initiatives reinforce the growing potential of sovereign cloud services in a world increasingly dominated by questions of cloud choice and control, and complex compliance requirements.

So, what does a pledge mean? The dictionary defines it as a “solemn promise” – which reasonably raises the question: isn’t this an admission that there is little sovereignty in the offering today? Otherwise, why would it be a pledge? A pledge is forward-looking, a promise of something that has not yet been delivered. Also, shouldn’t an announcement like this be backed up with a roadmap? Where is the guarantee that the items in this pledge will be fulfilled? Instead, AWS mentions what the pledge will generally cover: control over the location of your data, verifiable control over data access, the ability to encrypt everything everywhere, and the resilience of its cloud. The pledge sounds excellent, but does it meet the minimum standards of most data sovereignty requirements worldwide? It appears, from the general language, that none of it addresses the critical concerns around hyperscale usage – jurisdictional control, legal rights of access to the data, and compliance with sovereign data requirements that demand protection from the US CLOUD Act or Section 702 of the US Foreign Intelligence Surveillance Act (FISA).

Secondly, Microsoft has run aground in Germany, with Office 365 reportedly not complying with GDPR. GDPR is now more than four years old, and most companies have rushed to comply with it rather than risk being penalised by the EU. With Germany’s federal and state data protection authorities (DSK) raising concerns about the compatibility of Office 365 with data protection laws in Germany and the wider EU, it makes you wonder how many other companies may also be falling short in their obligations to protect EU customers’ data.

Also, how many other regulatory requirements (such as data sovereignty requirements) that global public cloud providers believe they comply with could yet be scrutinised by the regulators? This news is certainly food for thought. Microsoft has denied the DSK’s finding and issued a statement asking for further clarification of its view. IT executives should take this news as a noteworthy case study to inform their cloud choices, as regulatory requirements concerning data sovereignty are far more complex and niche to comply with than GDPR.

All these issues and more are putting US and global hyperscale cloud providers in a precarious position when operating a sovereign cloud or other regulated cloud solution in jurisdictions such as the EU, where they must adhere to the EU’s GDPR as well as US legislation. Indeed, it puts the EU in a precarious position too, given that AWS, Microsoft, and Google accounted for 72% of European cloud market spend in Q2 2022.

The EU wants a fair market and a protected European cloud without compromising cloud functionality. However, continued customer spending on US hyperscale clouds, plus investment in the region of $4bn by the US hyperscale organisations in their own expansion, means that no European cloud company can seriously challenge this market today. The EU certainly has a quandary: on the one hand, strictly enforcing sovereignty would mean no foreign clouds could be used, which would severely damage the EU cloud market; on the other, how does it legislate for a meaningful level of sovereignty without excluding foreign providers that remain under some level of external jurisdictional control? It seems that for the foreseeable future there will be no neat answer to this quandary. The most prudent approach to compliance appears to be a national, purpose-built sovereign cloud, using external clouds only where your data classification meets the needs of unregulated or non-sovereign environments – this is being cloud smart!

European cloud providers tend to be more specialised in their services, with nearly all providing managed services, something not found directly in the major US hyperscale cloud provider offerings. I believe this is a good thing. VMware has consistently stated that the future of a well-run cloud-smart IT strategy is multi-cloud and hybrid cloud and that being cloud-smart means we cannot ignore hyperscale offerings. We need them, especially as there are significant innovations and market-leading scalability in these clouds.

This is where VMware’s strategy is unique: VMware encourages multi-cloud and helps organisations maintain a cloud strategy that avoids lock-in and maintains quality and security while monitoring performance. The VMware Sovereign Cloud initiative gives national and local cloud provider partners the capability to build purpose-built sovereign clouds that deliver on locally specific requirements for data sovereignty, including data residency and jurisdictional control, data access and integrity, data security and compliance, data independence and mobility, and data innovation and analytics.

A common misunderstanding when considering a global hyperscale cloud provider for workloads requiring data sovereignty is that compliance follows automatically because the portfolio, data, and applications will be limited to what can run in a single region. This still doesn’t make it sovereign – it is simply a farce. To be clear, physical location (or data residency), while necessary for data sovereignty, does not on its own constitute data sovereignty under almost all, if not all, data sovereignty regimes around the globe.

Data sovereignty requirements are unique to each jurisdiction, but all of them demand much more than simple data residency. For example, they all also require jurisdictional control – which cannot be assumed to be met by a data-resident cloud, particularly one from a US or global cloud provider subject to the CLOUD Act and FISA. It’s therefore essential to recognise that VMware sovereign cloud providers are independent third-party partners across the globe who also manage extensive portfolios of cloud capabilities. Built on VMware solutions and ecosystem vendors, they have the tools and the competitive advantage (under the current regulatory climate) to provide the highest levels of comfort in complying with data sovereignty requirements and other regulations such as GDPR.


So, what is the answer here? VMware’s position has not changed: the usage of “trusted” hyperscale clouds denotes a level of trust whereby data placed in a hyperscale cloud is not top secret or restricted, can be protected (using encryption, bring your own key, confidential computing, or privacy-enhancing compute (PEC)), and is effectively public – i.e., only low-risk data should be placed in any hyperscale cloud, whether trusted or native. Whilst the hyperscale clouds continue their battles to achieve sovereign status in Europe, customers across the globe should not wait for a magical one-size-fits-all solution, or ever trust that their due diligence on regulatory requirements can be delegated to any vendor. Instead, consider a strategy that utilises the best of all multi-cloud solutions and establishes cloud choices based on data classification, data operations, and risk.
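To make the “protect it yourself” point concrete, here is a minimal sketch of the bring-your-own-key idea: data is encrypted client-side with a key that never leaves your control, so the provider only ever stores ciphertext. This is illustrative only – it assumes Python and the open-source cryptography package, and is not any particular provider’s offering:

```python
# Illustrative BYOK sketch: encrypt locally with a key you control, so the
# cloud provider only ever stores ciphertext it cannot read on its own.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """AES-256-GCM; the 12-byte nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_after_download(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

# The key is generated and held outside the cloud (e.g., in an on-premises HSM).
key = AESGCM.generate_key(bit_length=256)
blob = encrypt_for_upload(b"customer record", key)
assert decrypt_after_download(blob, key) == b"customer record"
```

Note that for sovereignty the decisive question is not the cipher but who holds the key, and under which jurisdiction.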

[VMware diagram: risk increases as workloads move from sovereign to non-sovereign clouds]

As the diagram shows, there is increased risk associated with non-sovereign cloud solutions, as jurisdictional control is negated in a trusted or hyperscale public cloud. Once you have conducted a thorough data classification exercise, the volume of data actually suitable for non-sovereign services may be lower than you expect. Remember that a sovereign cloud provider delivers services suited to your vertical – government, public sector, financial services, and many others – along with managed services to help you with your cloud adoption strategy. Some also offer innovative solutions for secure data exchange to let you monetise your data, a critical component in the growing data market. In addition, VMware Sovereign Cloud providers may be best suited to support you in managing locally tailored privacy, classification, and risk analysis, ensuring compliance with the most stringent of standards. Because data spans both personal and non-personal data (think industrial and IoT), a classification exercise will help you understand your risks, protect your data in line with regulatory requirements, and prepare for the new data classification standards that are surely still to come.
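As a thought experiment, a classification-driven placement policy can be as simple as a lookup from data class to the set of environments allowed to host it. The tiers and rules below are hypothetical examples, not a regulatory standard:

```python
# Hypothetical policy mapping data classification tiers to permitted cloud
# environments; every tier and rule here is an illustrative assumption.
PLACEMENT_POLICY: dict[str, set[str]] = {
    "public":     {"hyperscale", "trusted", "sovereign"},
    "internal":   {"trusted", "sovereign"},
    "regulated":  {"sovereign"},
    "top_secret": set(),  # stays on-premises / air-gapped
}

def permitted_clouds(classification: str) -> set[str]:
    """Return the environments allowed to host data of this class."""
    if classification not in PLACEMENT_POLICY:
        raise ValueError(f"unclassified data: {classification!r}")
    return PLACEMENT_POLICY[classification]

print(permitted_clouds("regulated"))  # {'sovereign'}
```

The point is the workflow rather than the code: classification comes first, and the cloud choice falls out of it.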
 
As data markets evolve and data exchange for supply chains and monetisation becomes a critical component of how we do business, it is essential that the right strategy is set from day zero, and that the limitations of a cloud choice do not compromise the principles of sovereignty you have committed to. Additionally, ensure that the cloud provider you select has the right technology capabilities, security infrastructure, and data governance processes to protect your data, meet compliance standards, and provide a secure platform for your business.

Find your closest VMware Sovereign Cloud provider today


By Olaf de Senerpont Domis, senior editor at DataStax

Premji Invest is an evergreen fund formed to support the Azim Premji Foundation, which was founded by Azim Premji, the former chairman of IT services consultancy Wipro. Premji Invest deploys a “crossover format” (investing in both private and public companies) across the technology, healthcare, consumer, and fintech landscapes; it has backed market leaders like Outreach, Sysdig, Heyday, Anaplan, Coupa, Moderna, Carta, Flipkart, Looker (acquired by Google), and DataStax. Premji Invest-US managing partner Sandesh Patnam established the firm’s US presence in Menlo Park, California, in 2014.

We recently spoke with Sandesh to learn more about the firm’s investment strategy and his excitement about the database market.

What’s Premji Invest’s investment strategy?

We deploy a direct crossover investment strategy with a roughly equal split between mid- to late-stage growth equity and public equities. Our evergreen structure informs and supports our long-duration investment approach: we think in 5- to 15-year horizons. While many investors view an IPO as a potential point of exit, we see the opposite. We’ve often participated meaningfully in the IPO events of standout members of our private portfolio and continued our partnership well beyond going public. We have a team in India and a team in the US; I set up the US team about eight and a half years ago. We’re active investors and look to partner with founders and management teams that are on a mission to create enduring companies.

What qualities do you look for in investments?

We want to invest in companies that thrive in the public markets. On the flip side, our public portfolio in many ways reflects the convictions of our private practice. We have deep private and public practices that operate under one roof, and it’s through this continuity that we understand the durability of a business model, pricing, quarterly cadence, value creation, and the rigor of a team. All of this is easier said than done, but these metrics are a clear proxy for quality.

We also want to see significant product-market fit. I usually use the term “wild market fit.” A lot of companies can spend a lot of dollars and get a “push-based” model, but that can generate false positives. We want to see significant market “pull.” That requires some level of codification of the go-to-market strategy and process. A lot of companies have heroic sellers or unique customers — but still fail after lots of misspent dollars.

Why invest in the database market?

We’ve all heard that software is going to eat the world. But more importantly, I’d say that AI [artificial intelligence] and ML [machine learning] are going to eat software. A lot of companies build software that amounts to fairly basic workflow tools. For software to be actionable, data must be at the center. If you think of the way the world is headed with AI and ML, how is that going to get more intelligent? What is the basis of ML? The most important questions are: can you organize your data, can you learn from it, and can you piece together information in real time to take real action on that data?

There’s a second element that we look for: With the speed and volume at which we are accreting data, can it be stored in an efficient way? If so, your AI and ML can get better over time, so all software should be predictive in some way.

Why is the database market more interesting today than ever before?

The need for real-time information. Ecommerce, ad spend, and real-time events that happen once need to be moderated. If you don’t have the right instrumentation and analytics, provided on a real-time basis, you’re going to fall behind. 

Think about the companies that have taken share and disrupted industries; think of all the things Amazon has done and Netflix has done — all the things that the tech challengers have done to existing businesses. They all stem from the fact that they were able to instrument their business much more so than others.

At Amazon, the office of the CFO is, in many ways, more like the office of the CRO [chief revenue officer]. Amazon can instrument its business at the SKU level. Think about a toothpaste with a particular flavor that’s sold in the Northeast. Amazon can tell you what the profit contribution is, what the buying cycle is, who the potential buyer is, and what the supply chain logistics look like. Break that down at the SKU level, and it enables the promotion of certain products in certain geographies, specific contribution-margin decisions, and near-perfect promotions.

When you think about how that’s even possible, then you start understanding the power of AWS. But you also understand that underneath AWS is the power of a very dynamic, highly distributed database. The relevance of the right data model and the right database and the impact it has on businesses is much more pronounced today. You can say that data is destiny, but I’d add that the right database is destiny — it really impacts the business model in a very profound way.

Learn more about DataStax here.

About Olaf de Senerpont Domis:


Olaf is senior editor at DataStax. He has driven awareness via content marketing roles at Google and Apigee. Prior to that, he spent two decades as a journalist, reporting on the financial services industry and technology M&A.



Since the premiere of the wildly popular 1993 dinosaur cloning film Jurassic Park, the sciences featured in the film – genetic engineering and genomics – have advanced at breathtaking rates. When the film was released, the Human Genome Project was already working on sequencing the entire human genome for the first time. The project was completed in 2003, after 13 years and at a cost of $1 billion. Today, a human genome can be sequenced in less than a day and at a cost of less than $1,000.

One leading genomics research organization, the Wellcome Sanger Institute in England, is on a mission to improve the health of all humans by developing a comprehensive understanding of the 23 pairs of chromosomes in the human genome. The Institute relies on cutting-edge technology to operate at incredible speed and scale, reading and analyzing an average of 40 trillion DNA base pairs a day.

Alongside advances in DNA sequencing techniques and computational biology, high-performance computing (HPC) is at the heart of the advances in genomic research. Powerful HPC helps researchers process large-scale sequencing data to solve complex computing problems and perform intensive computing operations across massive resources.
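The pattern behind that HPC work is divide and conquer: split the sequencing data, process the pieces in parallel, and merge the results. The toy sketch below shows that shape on a single machine’s CPU cores; real pipelines (which this article does not detail) run the same idea across thousands of cluster nodes:

```python
# Minimal sketch of the HPC idea: fan a large sequencing workload out across
# CPU cores and merge the partial results. Production genomics pipelines do
# this across whole clusters; the divide-and-conquer shape is the same.
from collections import Counter
from multiprocessing import Pool

def count_bases(chunk: str) -> Counter:
    """Tally A/C/G/T occurrences in one slice of sequence data."""
    return Counter(chunk)

if __name__ == "__main__":
    sequence = "ACGT" * 1_000_000          # stand-in for a real read set
    size = len(sequence) // 8              # split into 8 equal chunks
    chunks = [sequence[i:i + size] for i in range(0, len(sequence), size)]
    with Pool() as pool:
        total = sum(pool.map(count_bases, chunks), Counter())
    print(total)                           # merged counts across all workers
```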

Genomics at Scale

Genomics is the study of an organism’s genes or genome. From curing cancer and combatting COVID-19 to better understanding human, parasite, and microbe evolution and cellular growth, the science of genomics is booming. The global genomics market is projected to grow from $27.81 billion in 2021 to $94.65 billion by 2028, according to Fortune Business Insights. Enabling this growth is an HPC environment that contributes daily to a greater understanding of our biology, helping to accelerate the production of vaccines and other approaches to health around the world.

Using HPC resources and the computational and statistical techniques known as bioinformatics, genomics researchers analyze enormous amounts of DNA sequence data to find variations and mutations that affect health, disease, and drug response. Searching through the approximately 3 billion base pairs and roughly 23,000 genes in a human genome, for example, requires massive amounts of compute, storage, and networking resources.

After sequencing, billions of data points must be analyzed to look for things like mutations and variations in viruses. Computational biologists use pattern-matching algorithms, mathematical models, image processing, and other techniques to obtain meaning from this genomic data.
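As a toy illustration of that pattern-matching step – not the Institute’s actual pipeline – here is a search for every occurrence of a short DNA motif in a sequence. Production tools (aligners such as BWA or minimap2) do this with indexed data structures across billions of base pairs:

```python
# Toy illustration of pattern matching on genomic data: locate every
# (possibly overlapping) occurrence of a short DNA motif in a sequence.
def find_motif(sequence: str, motif: str) -> list[int]:
    """Return 0-based start positions of each match, including overlaps."""
    positions, start = [], sequence.find(motif)
    while start != -1:
        positions.append(start)
        start = sequence.find(motif, start + 1)
    return positions

genome_fragment = "ATGCGATACGCTTGAGATACG"
print(find_motif(genome_fragment, "GATACG"))  # [4, 15]
```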

A Genomic Powerhouse

At the Sanger Institute, scientific research is happening at the intersection of genomics and HPC informatics. Scientists at the Institute tackle some of the most difficult challenges in genomic research to fuel scientific discoveries and push the boundaries of our understanding of human biology and pathogens. Among many other projects, the Institute’s Tree of Life program explores the diversity of complex organisms found in the UK through sequencing and cellular technologies. Scientists are also creating a reference map of the different types of human cells.

Science on the scale of that conducted at the Sanger Institute requires access to massive amounts of data processing power. The Institute’s Informatics Support Group (ISG) helps meet this need by providing high performance computing environments for Sanger’s scientific research teams. The ISG team provides support, architecture design and development services for the Sanger Institute’s traditional HPC environment and an expansive OpenStack private cloud compute infrastructure, among other HPC resources.

Responding to a Global Health Crisis

During the COVID-19 pandemic, the Institute began working closely with public health agencies in the UK and academic partners to sequence and analyze the SARS-CoV-2 virus as it evolved and spread. The work has been used to inform public health measures and help save lives.

As of September 2022, over 2.2 million coronavirus genomes had been sequenced at Wellcome Sanger, and each is immediately made available to researchers around the world for analysis. Of particular interest are mutations affecting the virus’s spike protein, which the virus uses to bind to and enter human cells and which is the target of current vaccines. Scientists combine this genomic data with other information to ascertain which mutations may affect the virus’s ability to transmit, cause disease, or evade the immune response.

Society’s greater understanding of genomics, and the informatics that goes with it, has accelerated the development of vaccines and our ability to respond to disease in a way that’s never been possible before. Along the way, the world is witnessing firsthand the amazing power of genomic science.

Read more about genomics, informatics, and HPC in this white paper and case study of the Wellcome Sanger Institute.

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.
