Artificial intelligence (AI) in 2023 feels a bit like déjà vu to me. Back in 2001, as I was just entering the venture industry, I remember the typical VC reaction to a start-up pitch was, “Can’t Microsoft replicate your product with 20 people and a few months of effort, given the resources they have?” Today, any time a new company pitches a product that uses AI to do ‘X,’ the VC industry asks, “Can’t ChatGPT do that?”

Twenty-two years later, Microsoft is at the table once again. This time they’re making a $13 billion bet by partnering with OpenAI and bringing to market new products like Security Copilot to make sense of the threat landscape using the recently launched text-generating GPT-4 (more on that below). But just as Microsoft did not inhibit the success of thousands of software start-ups in the early 2000s, I do not expect Microsoft or any vendor to own this new AI-enabled market. 

However, the market explosion and hype around AI across the business and investment spectrum over the past few months have led people to ask: what are we to make of it all? And more specifically, how do CIOs, CSOs, and cybersecurity teams learn to deal with technology that may pose serious security and privacy risks?

The good, the bad, and the scary

Here, I look at the good, the bad, and the scary of this recent Microsoft announcement. What’s incredible about ChatGPT and its offspring is that they bring an accessible level of functionality to the masses. The technology is versatile, easy to use, and usually produces solid results.

Traditionally, organizations have needed sophisticated, trained analysts to sort through, analyze, and act on their security data. That work required knowledge of the particular query languages and configurations relevant to each product, like Splunk, Elastic, Palo Alto/Demisto, and QRadar. It was a difficult task, and the available talent pool was never enough.

That difficulty in SIEM (Security Information and Event Management) and SOAR (Security Orchestration, Automation, and Response) still exists today. SIEM helps enterprises collect and analyze security-related data from servers, applications, and network devices. The data is analyzed to identify potential security threats, alert security teams to suspicious activity, and provide insights into a company’s security defenses. SIEM systems typically use advanced analytics to identify patterns, anomalies, and other indicators of potential threats.

SOAR builds on SIEM capabilities by automating security workflows and helping businesses respond more quickly and efficiently to security incidents. SOAR platforms can integrate with various security products, including enterprise firewalls, intrusion detection systems, and vulnerability scanners. SIEM/SOAR is where you orchestrate the actions of an incident response plan, and those actions drive the remediation process. Managing the processes and products involved in remediation is difficult.
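To make that division of labor concrete, here is a minimal sketch in Python of the two halves: a SIEM-style rule that flags a brute-force login pattern in event data, and a SOAR-style playbook action triggered by the detection. The event fields, the threshold, and the block_ip action are illustrative assumptions, not any vendor’s actual API.

```python
from collections import Counter

# Toy authentication events of the kind a SIEM ingests from servers and devices.
events = [
    {"src_ip": "203.0.113.7", "action": "login_failed"},
    {"src_ip": "203.0.113.7", "action": "login_failed"},
    {"src_ip": "203.0.113.7", "action": "login_failed"},
    {"src_ip": "198.51.100.2", "action": "login_ok"},
]

def detect_bruteforce(events, threshold=3):
    """SIEM-style rule: flag source IPs with too many failed logins."""
    failures = Counter(e["src_ip"] for e in events if e["action"] == "login_failed")
    return [ip for ip, count in failures.items() if count >= threshold]

def block_ip(ip):
    """SOAR-style playbook step; a real platform would call a firewall API here."""
    print(f"[playbook] blocking {ip} at the firewall and opening an incident ticket")

for ip in detect_bruteforce(events):
    block_ip(ip)  # detection (SIEM) hands off to automated response (SOAR)
```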

Now, Microsoft is putting a stake in the ground with its generative AI Security Copilot tool. With Security Copilot, the tech company is looking to boost the capability of its data security products for deep integrated analysis and responses. By integrating GPT-4 into Security Copilot, Microsoft hopes to work with companies to

more easily identify malicious activity;

summarize and make sense of threat intelligence;

gather data on various attack incidents by prioritizing the type and level of incidents; and

recommend to clients how to remove and remediate diverse threats in real-time.

And guess what? Theoretically, GPT APIs and other such tools should make it much easier to sort through all that incident data, and they should also make automated response and orchestration much simpler.
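As a sketch of what that might look like in practice, the snippet below asks a GPT model to summarize and prioritize a raw incident log. It assumes the official OpenAI Python client (v1.x interface) and an OPENAI_API_KEY in the environment; the log lines, prompt, and model choice are illustrative, not Security Copilot’s internals.

```python
from openai import OpenAI  # assumes the official openai package, v1.x interface

client = OpenAI()  # reads OPENAI_API_KEY from the environment

incident_log = """
2023-03-28T02:14Z firewall: denied 4,312 inbound connections from 203.0.113.0/24
2023-03-28T02:16Z auth: 57 failed ssh logins for user 'admin' on host db-01
2023-03-28T02:19Z edr: suspicious process 'svch0st.exe' spawned on host db-01
"""

response = client.chat.completions.create(
    model="gpt-4",  # illustrative model choice
    messages=[
        {
            "role": "system",
            "content": "You are a security analyst. Summarize the incidents, "
                       "rank them by severity, and recommend remediation steps.",
        },
        {"role": "user", "content": incident_log},
    ],
)

print(response.choices[0].message.content)
```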

Overall, the emergence of GPT-4 may be a step towards the industry’s dream of “Moneyball for cyber,” allowing for a more robust defensive posture by leveraging the experience and wisdom of the crowds. And it will allow for a stronger defense of smaller organizations that do not have sufficient resources and expertise today.

It’s all about trust

However, there are still significant obstacles to overcome regarding adoption and trust. First and foremost, many organizations remain reluctant to share their incident data with others, even de-identified, as doing so could lead to leaked information, bad press, and brand damage. Sharing has been talked about for years but, for these very reasons, is rarely done in a systematic, technology-delivered manner. The best sharing practice followed today is industry CISOs talking among a tight peer group when something significant occurs. Given that reluctance to share in any meaningful way, I suspect the industry will take a long time to put its data in this or any third-party platform for fear that it exposes them in some way.

Another hurdle is overcoming hesitancy about privacy and security concerns. Microsoft claims that integrating data into its systems will maintain privacy and security, and that Security Copilot will not train on nor learn from its customers’ incident or vulnerability data. However, without full transparency, the market will have lingering doubts. Users may fear that attackers could use the same GPT-based platform to develop attacks targeting the vulnerabilities in their systems that it has become aware of, no matter what the enterprise license agreement (ELA) states to the contrary. Wouldn’t an attacker love to ask, “Write an exploit that allows me to navigate the defenses at Corporation X?”

There is also a question about how the system can learn from the newest attacks if it is not training on the data from customer organizations. The system would be more powerful if it did learn in the wild from customer incident and vulnerability data.

Even without specific details learned from any one customer, and assuming full transparency on security and privacy is guaranteed, the wide aperture of knowledge that can be obtained from other public and non-public sources raises a question: won’t this AI-based system become an adversary’s favorite exploit development tool?

Given all of this, there are potential risks and rewards involved in using ChatGPT in cybersecurity.

Microsoft has major ambitions for Security Copilot. It’s a tall order to fill, and I hope they get it right for everyone’s sake.

Know the potential consequences

GPT-4 under Microsoft’s auspices might be a great tool if the company figures out ways to cut off all that potentially harmful activity. If Microsoft can train the system to focus on the positive, and do so without compromising proprietary internal data, it would be a potent tool for bringing security-incident analysis to the mainstream. To date, that work has only been done by very sophisticated, high-priced people and complex systems that cater to the higher end of the market.

But suppose mid-tier companies, which can’t afford top-quality cybersecurity resources or the best data security teams, choose to open up their data to Microsoft and GPT-4? In that case, I just hope they know the possible side effects. Caveat emptor!


We see the metaverse as an intersection of immersive experiences across the augmented reality (AR) and virtual reality (VR) spectrums. Businesses can use it, as many already are, to enrich experiences, products, and services with virtual overlays for navigation and context. Others are creating new, fully immersive environments and finding a way to engage customers in them.

Behind these new experiences is not one but several technologies. Many of those metaverse-enabling technologies, like 5G, blockchain, and AI, have been maturing over time, and things that were once technically feasible but not practical have become more commercially available, more affordable, and more consumable, like ChatGPT.

Key trends

There has been a fundamental transition from closed centralized platforms, where users access free information in exchange for their data, to a connected, open, and immersive world known as Web 3.0, or Web3. An example is Roblox, which has nearly 50 million active users and a huge economy inside the metaverse.

There’s also a growing trend of users expecting to be compensated in some way for bringing their attention to a platform.

Who are the business users of the metaverse?

The typical image of a metaverse user as a gamer with a headset and dual hand controllers playing Fortnite isn’t wrong. But that image overlooks a fast-rising wave of business adoption.

Consumer brands have been early and enthusiastic adopters. Take Gucci. The luxury-goods brand partnered with Roblox to launch Gucci Town, a digital destination on Roblox “dedicated to those seeking the unexpected and to express one’s own individuality and connect with like-minded individuals from all over the world.” Further, Gucci has built an immersive concept store that showcases rare pieces, fosters conversation across contemporary Gucci creators, and even offers digital collectibles for purchase.

Another top brand, Nike, is assembling a cohort of metaverse brand ambassadors by allowing users to create virtual products and monetize them on a Web3 platform called Swoosh. For its part, Nike can create physical products based on those designs. At last check, Nike had generated $185 million from NFT sales and trading royalties. The bottom line for these brands:  Their metaverse presence generates new revenue and increases brand exposure — a win-win.

In manufacturing, companies have seized opportunities for upskilling and training, for example by providing real-time guided build instructions in an assembly process, or “see what I see” expert assistance when someone is troubleshooting equipment. Mercedes-Benz has invested in AR-based metaverse experiences to upskill its service technicians in their dealerships by providing a virtual overlay to its products.

Other companies are looking at digital twins as a way to increase efficiency, reduce costs, and optimize operations. For example, BMW has created a simulation of one of its assembly lines, which enables it to simulate what may happen in a particular environment before pushing operations to the production floor.

Financial services also think the metaverse is “on the money.” JPMorgan Chase was the first to open a metaverse “branch” in Decentraland, and many other financial services brands are trying to figure out how to engage in the metaverse with customers, employees, partners, and other elements of their human ecosystems.

Further, evolution in the crypto space and digital wallets has boosted the ability to transact in the metaverse, with financial institutions — both traditional ones and startups — looking to capitalize on the metaverse economy.

How should companies proceed?

Successful companies stay focused on delivering products and services in any technology context. With this objective in mind:

Determine your organization’s innovation appetite (versus its risk appetite; it is too early to speak of ROI in the metaverse).

Align on a business objective for the metaverse: Are we trying to engage a specific customer population? Are we trying to improve efficiency? Are we trying to unlock a new market?

Create some use cases, keeping in mind the technological possibilities you have or feasibly can put in place. (Partnerships are an option.)

Set up a small, cross-functional team that has autonomy as well as guardrails in which they’re allowed to experiment, innovate, and play.

Protiviti is taking a deep dive into the metaverse on VISION by Protiviti. Learn more at vision.protiviti.com/metaverse.

Connect with the Authors

Christine Livingston
Managing Director, Emerging Technologies

Lata Varghese
Managing Director, Technology Consulting

Alex Weishaupl
Managing Director, Digital Transformation


Across the manufacturing industry, innovation is happening at the edge. Edge computing allows manufacturers to process data closer to the source where it is being generated, rather than sending it offsite to a cloud or data center for analysis and response. 
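A minimal sketch of that pattern, with invented sensor values and thresholds: readings are filtered and summarized locally on an edge gateway, and only a compact result is forwarded upstream rather than the full raw stream.

```python
import statistics

def summarize_on_edge(readings, limit=80.0):
    """Process vibration readings locally and keep only what matters:
    a compact summary plus any out-of-range samples."""
    anomalies = [r for r in readings if r > limit]
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "anomalies": anomalies,  # only these justify a round trip to the cloud
    }

# A window of toy sensor data stays on the edge gateway...
summary = summarize_on_edge([42.1, 43.8, 41.9, 97.3, 44.0])
print(summary)  # ...and only this small payload is sent to the data center
```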

For an industry defined by machinery and supply chains, this comes as no surprise. The proliferation of smart equipment, robotics and AI-powered devices designed for the manufacturing sector underscores the value edge presents to manufacturers. 

Yet, when surveyed, a significant gap appears between organizations that recognize the value of edge computing (94%) and those that are currently running mature edge strategies (10%). Running edge devices and smart-manufacturing machines does not always mean there is a fully functioning edge strategy in place.

Why the gap? 

What is holding back successful edge implementation in an industry that clearly recognizes its benefits?

The very same survey mentioned above suggests that complexity is to blame, with 85% of respondents saying that a simpler path to edge operations is needed.

What specifically do these complexities consist of? Top among them are:

Data security constraints: managing large volumes of data generated at the edge, maintaining adequate risk protections, and adhering to regulatory compliance policies creates edge uncertainty.

Infrastructure decisions: choosing, deploying, and testing edge infrastructure solutions can be a complex, costly proposition. Components and configuration options vary significantly based on manufacturing environments and desired use cases.

Overcoming the IT/OT divide: barriers between OT (operational technology) devices on the factory floor and enterprise applications (IT) in the cloud limit data integration and time to value for edge initiatives. Seamless implementation of edge computing solutions is difficult to achieve without solid IT/OT collaboration in place.

Lack of edge expertise: a scarcity of edge experience limits the implementation of effective edge strategies. The move to real-time streaming data, data management, and mission-critical automation has a steep learning curve.

Combined, these challenges are holding back the manufacturing sector today, limiting edge ROI (return on investment), time to market and competitiveness across a critical economic sector. 

As organizations aspire toward transformation, they must find a holistic approach to simplifying, and reaping the benefits of, smart factory initiatives at the edge.

Build a Simpler Edge 

What does a holistic approach to manufacturing edge initiatives look like? It begins with these best practices: 

Start with proven technologies to overcome infrastructure guesswork and obtain a scalable, unified edge architecture that ingests, stores, and analyzes data from disparate sources in near-real time and is ready to run advanced smart-factory applications in a matter of days, not weeks.

Deliver IT and OT convergence by eliminating data silos between edge devices on the factory floor (OT) and enterprise applications in the cloud (IT), rapidly integrating diverse data types for faster time to value.

Streamline the adoption of edge use cases with easy and quick deployment of new applications, such as machine vision for improved production quality and digital twin composition for situational modeling, monitoring, and simulation.

Scale securely using proven security solutions that protect the entire edge estate, from IT to OT, and strengthen industrial cybersecurity using threat detection, vulnerability alerts, network segmentation, and remote incident management.

Establish a foundation for future innovation with edge technologies that scale with your business and are easily configured to adopt new use cases, like artificial intelligence, machine learning, and private 5G, minimizing the complexity that holds manufacturers back from operating in the data age.

Don’t go it alone

The best way to apply these practices is to start with a tested solution designed specifically for manufacturing edge applications. Let your solution partner provide much of the edge expertise your organization may not possess internally. A partner who has successfully developed, tested and deployed edge manufacturing solutions for a wide variety of use cases will help you avoid costly mistakes and reduce time to value along the way. 

You don’t need to be an industry expert to know that the manufacturing sector is highly competitive and data-driven. Every bit of information, every insight matters and can mean the difference between success and failure.

Product design and quality, plant performance and safety, team productivity and retention, customer preferences and satisfaction: all of it is contained in your edge data. Your ability to access and understand that data depends entirely on the practices you adopt today.

Digitally transforming edge operations is essential to maintaining and growing your competitive advantage moving forward.

A trusted advisor at the edge

Dell has been designing and testing edge manufacturing solutions for over a decade, with customers that include Ericsson, McLaren, Linde, and the Laboratory for Machine Tools at Aachen University.

You can learn more about our approach to edge solutions for the manufacturing sector, featuring Intel® Xeon® processors, at Dell Manufacturing Solutions. The latest 4th Gen Intel® Xeon® Scalable processors have built-in AI acceleration for edge workloads – with up to 10x higher PyTorch real-time inference performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32)1.

See [A17] at intel.com/processorclaims: 4th Gen Intel® Xeon® Scalable processors. Results may vary.
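For readers curious what opting into BF16 inference looks like in code, here is a minimal PyTorch sketch. The model is a stand-in for a real workload; on Xeon hardware with AMX support, recent PyTorch builds can route these bfloat16 operations to the accelerated kernels.

```python
import torch
import torch.nn as nn

# Stand-in model; a real workload would load a trained network instead.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).eval()
batch = torch.randn(32, 512)

# Run CPU inference under bfloat16 autocast; on processors with Intel AMX,
# PyTorch can dispatch these ops to the accelerated BF16 kernels.
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    logits = model(batch)

print(logits.shape)
```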


In the words of J.R.R. Tolkien, “shortcuts make long delays.” I get it: we live in an age of instant gratification, with DoorDash and Grubhub meals on demand, fast-paced social media, and same-day Amazon Prime deliveries. But I’ve learned that in some cases, shortcuts are just not possible.

Such is the case with comprehensive AI implementations; you cannot shortcut success. Operationalizing AI at scale mandates that your full suite of data (structured, unstructured, and semi-structured) be organized and architected in a way that makes it usable, readily accessible, and secure. Fortunately, the journey to AI is one that is more than worth the time and effort.

AI Potential: Powering Our World and Your Business

That’s because AI promises to be one of the most transformational technologies of our time. Already, we see its impact across industries and applications. If you’ve experienced any of these, then you’re seeing AI in action:

Automated assistants such as Amazon Alexa, Microsoft Cortana, and Google Assistant.

COVID vaccines and/or personalized medicine used to treat an illness or disease.

Smart cars that alert drivers like you, help you park, and ping you when it’s time for maintenance.

Shopping preferences that are tailored to your specific tastes and proactively sent to you.

Despite these AI-powered examples, businesses have only just begun to embrace AI, with an estimated 12% fully using AI technology.1 But this is changing rapidly, because AI holds massive potential. One Forrester study and financial analysis found that AI-enabled organizations can gain an ROI of 183% over three years.2

That’s why AI is a key determinant of your future success. Businesses that lead in fully deploying AI will be able to optimize customer experiences and efficiencies, maximize customer retention and acquisition, and gain a distinct advantage over the competition. The divide between AI haves and have-nots is already growing, and at a certain point that chasm will not be crossable.

For example, airports today can use AI to keep passengers and employees safer. AI working on top of a data lakehouse can help to quickly correlate passenger and security data, enabling real-time threat analysis and advanced threat detection.

In order to move AI forward, we need to first build and fortify the foundational layer: data architecture. This architecture is important because, to reap the full benefits of AI, it must be built to scale across an enterprise rather than for individual AI applications.

Constructing the right data architecture cannot be bypassed. That’s because several impeding factors are currently in play that must be resolved. All organizations need an optimized, future-proofed data architecture to move AI forward.

Complexity slows innovation

Data growth is skyrocketing. One estimate3 states that by 2024, 149 zettabytes will be created every day; that works out to roughly 1.7 exabytes every second. A zettabyte has 21 zeroes. What does that mean? According to the World Economic Forum4, “At the beginning of 2020, the number of bytes in the digital universe was 40 times bigger than the number of stars in the observable universe.”


Data’s size alone creates inherent complexity. Layered on top of that are the different types of data stored in various siloes and locations throughout an organization. It all adds up to a “perfect storm” of complexity.

A complex data landscape prevents data scientists and data engineers from easily linking the right data together at the right time. Additionally, multiple systems of record create a confusing environment when those sources do not report the same answers.

Extracting value from data

Highly skilled data scientists, analysts and other users grapple with gaining ready access to data. This has become a bottleneck, hindering richer and real-time insights. For AI success, data scientists, analysts and other users need fast, concurrent access to data from all areas of the business.

Securing data as it grows

Securing mission-critical infrastructure, across all data in an enterprise, is a default task for every organization. However, as data grows within an enterprise, growing demand to access and use that data produces an increasing number of vulnerable security endpoints.

Catalyzing AI at Scale with a Data Lakehouse

The good news is that data architectures are evolving to solve these challenges and fully enable AI deployments at scale. Let’s look at the data architecture journey to understand why and how data lakehouses help to solve complexity, value and security.

Traditionally, data warehouses have stored curated, structured data to support analytics and business intelligence, with fast, easy access to data. Data warehouses, however, were not designed to support the demands of AI or semi-structured and unstructured data sources. Data lakes emerged to help solve complex data organizational challenges and store data in its natural format. Used in tandem with data warehouses, data lakes, while helpful, simultaneously create more data silos and increase cost.5

Today, the ideal solution is a data lakehouse, which combines the benefits of data warehouses and data lakes. A data lakehouse handles all types of data via a single repository, eliminating the need for separate systems. This unification of access through the lakehouse removes multiple areas of ingress/egress and simplifies security and management, achieving both value extraction and security. Data lakehouses support AI and real-time data applications with streamlined, fast, and effective access to data.

The benefits of a data lakehouse address complexity, value and security:

Create more value quickly and efficiently from all data sources.

Simplify the data landscape via carefully engineered design features.

Secure data and ensure data availability at the right time for the right requirements.

For example, pharmacies can use a data lakehouse to help patients. By quickly matching drug availability with patient demand, pharmacies can ensure the right medication is at the right pharmacy for the correct patient.
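A toy sketch of that matching query, using DuckDB as a stand-in for a lakehouse engine; the table shapes and values are invented for illustration. The point is the lakehouse property: one engine answering one query over data that would otherwise live in separate systems.

```python
import duckdb

con = duckdb.connect()  # in-memory stand-in for a lakehouse catalog

# Demand (prescriptions) and availability (stock) side by side in one store.
con.execute("""
    CREATE TABLE prescriptions AS
    SELECT * FROM (VALUES (101, 'amoxicillin'), (102, 'insulin'))
        AS t(patient_id, drug)
""")
con.execute("""
    CREATE TABLE pharmacy_stock AS
    SELECT * FROM (VALUES (1, 'amoxicillin', 12), (2, 'insulin', 0))
        AS t(store_id, drug, units)
""")

# One query matches each patient to a pharmacy that actually has the drug.
rows = con.execute("""
    SELECT p.patient_id, p.drug, s.store_id
    FROM prescriptions p
    JOIN pharmacy_stock s USING (drug)
    WHERE s.units > 0
""").fetchall()
print(rows)  # -> [(101, 'amoxicillin', 1)]
```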

Moving AI Forward

AI deployments at scale will change the trajectory of success around the world and across industries, company types, and sizes. But first things first: the right data architecture must be put in place to fully enable AI. While data lakehouse solutions help accelerate this process, the right architecture cannot be bypassed. As J.R.R. Tolkien intimated, anything worth achieving takes time.

Want to learn more?  Read this ESG paper.

*************

[1] https://www.zdnet.com/article/what-is-ai-maturity-and-why-does-it-matter/ 

[2] https://www.delltechnologies.com/asset/en-us/products/ready-solutions/industry-market/forrester-tei-dell-ai-solutions.pdf

[3] Finances Online, 53 Important Statistics About How Much Data Is Created Every Day, accessed April 2022

[4] https://www3.weforum.org/docs/WEF_Paths_Towards_Free_and_Trusted_Data%20_Flows_2020.pdf

[5] https://www.dell.com/en-us/blog/break-down-data-silos-with-a-data-lakehouse/


Based in Italy and with more than 20 years of experience helping enterprises, from large international firms to emerging mid-sized operations, grow their businesses with technology, WIIT serves a rapidly expanding and diverse customer base. With a full portfolio that includes an extensive array of cloud offerings – including private, public, and hybrid cloud services – the company is well-known for its track record of success in helping organizations realize the full potential of the cloud while bypassing the complexity often associated with large-scale digital transformations.

Serving leaders in the energy, fashion, financial services, food, healthcare, manufacturing, media, pharmaceutical, professional services, retail, and telecommunications industries, WIIT works with organizations that have stringent business continuity needs, mission-critical applications, and crucial data security and sovereignty requirements. Customers draw on the company’s full suite of solutions, which includes everything from digital collaboration tools and a full cybersecurity stack to extensive software development services and innovations that let customers embrace the Internet of Things.

We recently caught up with Alessandro Cozzi, founder and CEO of WIIT, to learn about the company, what he’s seeing in its burgeoning cloud business, and what he feels will be the next big thing. We also took the opportunity to learn why it was important to achieve the VMware Cloud Verified distinction, not just for WIIT, but for the companies it serves in Italy, Germany, and around the globe.

“The traditional IT model is no longer sustainable,” says Cozzi. “Often the greatest value of the cloud lies in hybrid architectures that, for the vast majority of enterprises, are complex to design and manage. We offer a platform that secures and optimizes the full mix of disparate infrastructure, from edge computing to public cloud, that many organizations need. We also govern it with specialized expertise, certifications, and top-tier proprietary assets that enable us to exceed the most demanding service level agreements.”

Notably, WIIT offers a highly customized Premium Cloud that is uniquely tailored to each organization, a Premium Private offering for critical applications, and a Public Cloud that offers seamless connectivity to Amazon Web Services, Google Cloud, and Microsoft Azure. The company’s Premium Multicloud services enable customers to combine elements of each to best address their needs.

Cozzi notes that WIIT’s Premium Private Cloud ensures extremely high levels of security, scalability, and data reliability, while public clouds are complementary, especially for applications that aren’t critical. He also points out that hyperscalers are becoming specialized, prompting more companies to rely on services from different cloud providers.

“The hybrid cloud is often the answer when there is a need to host systems in more than one location for international processes, regulatory needs, network latency, data sovereignty, or application requirements,” he adds. “At WIIT we engineer, implement, and govern custom hybrid cloud and multi-cloud models in conjunction with the complex IT architectures our customers need. And we make the most of the unique capabilities of different clouds and data centers to ensure that our clients can continually evolve their businesses.”

Cozzi stresses that VMware’s trusted technologies and new innovations play a pivotal role in these efforts. It’s what led the company to achieve the VMware Cloud Verified distinction.

“WIIT is among the most innovative cloud solutions and service providers in Europe,” he says. “Most of our customers rely on VMware technologies for critical services. Showing that we have deep expertise with them is yet another way that we provide peace of mind and a serene cloud journey.”

Not surprisingly, Cozzi only sees cloud adoption increasing in light of customers’ success in growing their businesses with the cloud and the new capabilities a flexible, hybrid approach makes possible.

“It gives me great pride that today we’re able to remove so much of the complexity involved in even the largest, most involved cloud deployments so that customers simply experience the full potential of the cloud,” says Cozzi. “But I’m also really excited about the innovations we are seeing in the world of applications and the advancements in cloud microservices now taking shape. The impact of the cloud will only increase. We’re intent on growing a business that continues to offer customers secure and innovative cloud services that recognize not only people, but also the environment, as a strategic priority. Ultimately, we are committed to being a key player not just in the realm of digital transformations, but also in just and sustainable transitions of infrastructure and the business processes and practices it is used for.”

Learn more about WIIT and its partnership with VMware here.


Founded in 2011, Lintasarta Cloudeka is a division of Lintasarta, Indonesia’s leading provider of information and communications technology. Offering everything from fiber optics to data centers and satellite networks, Lintasarta has a presence throughout Indonesia, with 54 facilities spread throughout the nation and more than 2,400 enterprise customers. These include leading businesses in a wide range of industries, including agriculture, banking, government, health care, higher education, manufacturing, retail, telecommunications, and technology.

We recently caught up with Ginandjar, Marketing & Solution Director at Lintasarta, to learn what he’s seeing in Indonesia’s rapidly growing market for cloud services and solutions, what’s accelerating cloud adoption in the country, what he sees as the next driver of growth, and what it means for the company to achieve the VMware Cloud Verified distinction.

“Throughout the country Lintasarta Cloudeka is known as the cloud provider from and for Indonesia,” says Ginandjar. “We have a really strong understanding of the needs of the industries we serve and provide end-to-end cloud services to a diverse customer base that includes large global enterprises and small- and medium-sized companies. We really pride ourselves on helping businesses realize the full potential of the cloud, and in the process enhance and grow their businesses.”

Lintasarta Cloudeka’s wide array of cloud solutions and services includes robust public, private, and multi-cloud offerings and an extensive portfolio of managed services, from full Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) to Backup and Disaster Recovery-as-a-Service, cloud object storage, and everything in between.

Notably, the company’s Deka Prime public cloud solution and Deka Premium private cloud solution are both based on VMware technologies, as are the company’s IaaS, PaaS, and other offerings. Ginandjar notes that customers manage them, as well as on-premises VMware-based infrastructure, with ease from a single dashboard using VMware vCloud Director.

“Most of our customers require a flexible infrastructure that doesn’t require them to manage it on their own, but enables them to effortlessly spin up or spin down infrastructure capacity and computing power as needed,” adds Ginandjar. “With Lintasarta Cloudeka they only pay for what they need and use, which is one of the reasons many enterprises initially turn to us. With other providers, they often receive surprise bills or are forced to pay for services they don’t use.”

He also stresses that reliability and performance are key concerns for most organizations. It’s a reality he says leads many to Lintasarta Cloudeka.

“Because we are directly supported by Lintasarta’s high-speed, high-performance networks, which are renowned for their unflinching reliability, and because we utilize VMware technologies that are proven, trusted by enterprises in every industry, and relied on by most organizations in their on-site data centers, prospective customers know they are getting a solution that will free them to focus more on their applications and their business rather than their new software-defined infrastructure.”

The resulting peace of mind is something he believes will be even stronger now that Lintasarta Cloudeka holds the VMware Cloud Verified distinction.

“Achieving the VMware Cloud Verified distinction is not an easy task for any cloud solutions or services provider,” says Ginandjar. “For customers, VMware Cloud Verified is more than a distinction – it’s how they seek a trusted cloud service provider. They can be certain that all technology was implemented using best practices by individuals that really understand the solutions being deployed.”

While Ginandjar still feels that some misconceptions about the cloud remain among many companies in Indonesia – including that the primary use cases for the cloud, albeit valuable ones, are storage and backup – he feels that this is changing.

 “We’re seeing a dramatic increase in microservices and autoscaling here in Indonesia,” he says. “And with VMware, we’re ideally positioned to speed up the development process involved. Now our customers can react faster and offer new services faster without ever worrying if they have the infrastructure needed to make them possible.”

Learn more about Lintasarta Cloudeka and its partnership with VMware here.
