Moving SAP workloads to the cloud promises to be transformational, but it’s not for the faint of heart. Goals for an ERP modernization initiative often range from lowering costs through infrastructure savings to adding cloud-based capabilities to ERP tasks with minimal disruption to day-to-day business. Achieving these objectives takes perceptive analysis, meticulous planning, and skillful execution.  

“There are many factors to consider, including application complexity, legacy application requirements, data location, and compliance,” says Dilip Mishra, SAP delivery leader for the Cloud Migration and Modernization practice at Kyndryl. Teams must determine which workloads move to the cloud and which will remain on-premises. What’s more, adds Mishra, many organizations are likely to encounter a “long-tail” of interdependencies between applications and infrastructure that requires special expertise to unravel.  

Perhaps most important, the undertaking will not succeed without cooperation between IT and business leaders. “To overcome the perception that from a business perspective, the migration might look like a lot of effort for a little return, IT leaders must communicate the business case for moving each workload,” Mishra says. CIOs and their teams should also consider providing a systematic framework for delivering and measuring the value to the business now and in the future, covering technology, operations, and financials.  

In short, IT leaders can expect curves in the road that only seasoned experts can navigate without mishap. To that end, Kyndryl and AWS have established a partnership with an extensive track record in rehosting and re-platforming SAP workloads on AWS cloud services.  

Schneider Electric’s story  

Schneider Electric’s journey to the cloud began by moving its SAP applications from an outsourcer to a Kyndryl data center. After stabilizing the environment and integrating the operations of numerous acquired companies, Kyndryl optimized the applications and infrastructure while planning the transition to AWS. With the goal of maintaining continuous business operations, Kyndryl mapped out a migration to AWS that accorded the Kyndryl data center an important role in a hybrid cloud architecture.    

“A hybrid environment provides the flexibility of workload placement based on business requirements and provides a smoother transition to cloud because the customer has time to plan and re-engineer without going through a big-bang cutover,” says Naresh Nayar, Kyndryl distinguished engineer.   

Schneider’s internal team designed and built the AWS “landing zone,” a secure environment with strict rules about firewalls, connectivity, and security groups. Kyndryl architected the new operating environment using its framework for cloud operations and provided specifications that AWS and Schneider technical teams used to provision the new infrastructure in the landing zone.  

Schneider Electric’s move shows that a non-disruptive cloud transition is possible with careful planning and a deep portfolio of skills. For such enterprise migrations, experience matters: Currently, more than 5,000 SAP customers run on AWS. The AWS portfolio includes AWS Migration Hub, AWS Application Discovery Service, AWS Application Migration Service, AWS Service Catalog, and AWS Database Migration Service. For its part, Kyndryl brings to bear more than three decades of experience and 90,000 skilled practitioners providing IT services at the highest level.

Learn more about how Kyndryl and AWS are innovating to achieve transformational business outcomes for customers.  


‘Cloud’ is a buzzword that has run its course in a lot of industries, but there is a resurgence of cloud talk in the contact center arena these days.

Contact Center as a Service (CCaaS) is a high-priority digital transformation project for many businesses around the world, and some of the biggest players in tech are jumping in with both feet. Zoom, Microsoft, Amazon Web Services, Google, and Salesforce are all touting new ideas that leverage voice, digital channels, and technological advances like artificial intelligence (AI), natural language processing (NLP), and machine learning (ML) in the contact center. At the same time, legacy on-premises players like Genesys[1], Cisco, and Avaya are making big bets on the cloud.

While midsized companies have generally found it easier to move their contact centers to the cloud (many were urged on by the push to have contact center agents working from home during the pandemic), many bigger enterprises have yet to take the plunge. The global market for CCaaS offerings is expected to grow by 26.1% annually[2] from 2022 to 2027, expanding from $17.1 billion to $54.6 billion. With all the buzz about CCaaS in the industry and amid looming economic uncertainty, it is realistic to expect that this shift will happen even faster.
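Those market figures are easy to sanity-check: compounding the 2022 base at the projected annual growth rate for five years lands very close to the 2027 figure. A quick check, using only the numbers cited above:

```python
# Sanity-check the cited CCaaS projection: $17.1B growing 26.1% annually, 2022-2027.
start_billions = 17.1
cagr = 0.261
years = 2027 - 2022
end_billions = start_billions * (1 + cagr) ** years
print(f"${end_billions:.1f}B")  # close to the cited $54.6B
```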

The real difference between cloud and on-premises

Traditional contact center technology is purpose-built. It is intended to be used as a telephony-based call center solution, and it has worked well for a long time, which is partly why larger organizations haven’t been as quick to jump to the cloud.

The cloud is a different kettle of fish. Many CCaaS offerings include an assortment of technologies, apps, and integrations with an ecosystem of partner apps assembled to form a contact center. In addition to core contact center functionality, cloud solutions have added perks, like built-in productivity tools and AI integration, which traditional telephony solutions just can’t match.

Adding AI to your contact center is a game-changer, and moving to the cloud gives you access to the best the industry has to offer. AI comes baked into many CCaaS solutions, which opens fresh opportunities for companies to leverage natural language engines like voice bots, chatbots, or conversational IVRs to enable more self-service options for customers while freeing up human agents and driving the cost of service down.

How to get started: Lift and Shift vs Lift and Shine

Sometimes, it makes perfect sense to take what you have in your on-premises solution and just move it to the cloud. Other times, a cloud migration is a great opportunity to reassess priorities and examine how your customers are interacting with your business. Identify your customers’ true intents: What does a successful interaction with your business look like to them? How can you deliver personalization and opportunities for self-service? Then, bring that vision to life by shifting the way you think about your contact center and throwing the rule book out the window.

If a move to the cloud coincides with a foray into AI, maybe your IVR becomes conversational. Should you revamp call flows or elements of your customer journey to make them more user-friendly? How can you incorporate chatbots, voice bots, or other digital channels or real-time communication? Whatever you decide, you need to have a meticulous plan to ensure a smooth transition.

If you fail to plan, plan to fail

All too often, as a cloud migration gets rolling, the contingencies, nuances, and complexities overwhelm even the best migration teams.

When Electrolux[3] set out to consolidate its European on-premises contact centers into one cloud-based solution, the team soon realized that bringing together the staff, processes, and legal requirements from different countries and languages would create a new set of challenges. With its existing fully manual testing process, there was no way for Electrolux to run tests at the new scale demanded by the cloud. The team also struggled to consistently measure cloud environment stability and to quickly identify issues from the customer’s perspective. Electrolux turned to Cyara for help monitoring voice quality and digital channels, improving end-to-end call routing, and accelerating its regression testing from 14-day cycles to overnight, getting the migration back on track.

Assure your cloud migration with the right testing and processes

A rigorous testing regimen will make sure that you know how the system performed before, during, and after key steps in the migration. Unfortunately, no team of manual testers can do this reliably without help from automation.
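As a rough illustration of what that automation does, a regression harness replays scripted call paths and diffs each prompt against the expected one. The sketch below is hypothetical (toy flows, a simulated IVR, no vendor API); a real harness would place actual calls and score the transcribed audio:

```python
# Minimal sketch of automated call-flow regression testing. All names and
# flows here are invented for illustration, not any vendor's actual API.

EXPECTED = {
    ("main_menu",): "For billing, press 1. For support, press 2.",
    ("main_menu", "1"): "Please enter your account number.",
    ("main_menu", "2"): "Connecting you to technical support.",
}

def ivr_respond(path):
    # Fake system under test with one deliberately broken route,
    # standing in for the migrated cloud contact center.
    if path == ("main_menu", "2"):
        return "Please enter your account number."  # mis-routed after migration
    return EXPECTED.get(path)

def run_regression(expected):
    failures = []
    for path, want in expected.items():
        got = ivr_respond(path)
        if got != want:
            failures.append({"path": path, "expected": want, "actual": got})
    return failures

failures = run_regression(EXPECTED)
print(f"{len(failures)} of {len(EXPECTED)} call paths failed")  # 1 of 3
```

Running every path on every change is what turns a 14-day manual cycle into an overnight one.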

Even in the cloud, the technology required to run a contact center involves integrating disparate systems — such as a CRM or ERP — not delivered by the CCaaS platform. Cloud providers will test their solutions with their own technology, but they won’t test your cloud instance with your data and your integrations, which is necessary to assure performance.

This is an immense, complex task that will require many rounds of coding and testing. You need to successfully get through the entire migration process to day 0 and flip the switch with confidence and assurance. Ensuring this goes smoothly requires comprehensive testing.

The sooner you can fully migrate to the cloud, the sooner you can recognize the benefits and economies that come along with that move. Companies that use Cyara to assure their migration move to the cloud 2x faster than those that don’t. Learn more about how Cyara helps at every stage of the cloud migration journey by reading our eBook.

1. No Jitter, “Genesys Cloud: And Then There Was One,” October 2022.

2. Research and Markets, “Cloud-Based Contact Center Market by Component, Deployment Mode, Organization Size, Industry and Region,” June 2022.

3. Cyara, Electrolux case study.


The contact center has traditionally operated through on-premises servers and software, but shifting it to the cloud can help CIOs improve the customer journey. Advances in artificial intelligence (AI) and cloud-based contact center-as-a-service (CCaaS) options now give enterprises more confidence that they can better deliver high-quality customer experiences.

However, there are potential challenges around moving contact center services, especially if the business has spent years cultivating good relationships with its customers. For many organizations, even a 1% drop in the performance of an interactive voice response (IVR) system can result in a surge of support calls for live agents, who are already under enormous pressure due to workforce shortages.
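To make the stakes concrete, here is that 1% figure worked through at hypothetical volumes (the session count below is invented for illustration):

```python
# Illustrative arithmetic only, with a hypothetical call volume: even a
# small drop in IVR containment lands directly on live agents.
monthly_ivr_sessions = 500_000      # hypothetical self-service volume
containment_drop = 0.01             # a 1% dip in IVR performance
extra_agent_calls = int(monthly_ivr_sessions * containment_drop)
print(extra_agent_calls)  # 5000 additional agent-handled calls per month
```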

Here are the top three factors to consider before migrating your contact center to the cloud:

Avoid a rush to the cloud: Contact center software that has been optimized over the years cannot simply be rewritten and moved to a new CCaaS platform. Specific and careful planning must take place to keep optimizations intact and avoid breaking the customer experience. For example, many systems have carefully constructed call flows, routing rules, and natural language libraries featuring customer terminology. Those were refined over time and can’t be replicated in a new system from day one.
Ensure portability to avoid vendor lock-in: Several CCaaS providers offer services that utilize a specific cloud host, making them inflexible. Any cost savings from moving contact center software to the cloud could be negated if an enterprise decides to change cloud providers due to market or technology changes, mergers, or other events. Choosing the right CCaaS provider means looking for options that are cloud-agnostic, as well as nimble enough to move should the need arise.
Enable AI systems that can handle demand spikes or other changes: New technologies that integrate natural language processing and machine learning algorithms can provide flexibility for companies experiencing an influx of new customer interactions. AI software that can learn with each customer engagement helps to ensure that the system recognizes the optimal action for the next customer who inquires about a new topic. For example, early in the pandemic, banking institutions needed to add messages about lobby closures and longer hold times, but many companies began receiving inquiries about stimulus check status that the system couldn’t quickly answer. A solution that can recognize new questions from customers and quickly react with answers is one of the benefits of an AI system that is constantly learning.
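One way to picture the pattern described in that last point is a confidence-gated fallback: answer what the system knows, escalate what it doesn’t, and capture the new utterances for training. This is a toy sketch with an invented keyword matcher, not any vendor’s engine:

```python
# Sketch of the "constantly learning" pattern: answer high-confidence
# intents, escalate anything unfamiliar to a live agent, and queue the
# utterance so the model can be trained on the new topic. The keyword
# matcher and intents are toy stand-ins, not a real NLP engine.

KNOWN_INTENTS = {
    "lobby": ("Our lobbies are currently closed; drive-through remains open.", 0.95),
    "hold time": ("Hold times are longer than usual. Thank you for your patience.", 0.90),
}
CONFIDENCE_THRESHOLD = 0.8
retraining_queue = []  # utterances the system could not answer

def handle(utterance):
    text = utterance.lower()
    for keyword, (answer, confidence) in KNOWN_INTENTS.items():
        if keyword in text and confidence >= CONFIDENCE_THRESHOLD:
            return answer
    retraining_queue.append(utterance)          # capture the new topic
    return "Let me connect you with an agent."  # graceful escalation

print(handle("Is the lobby open today?"))
print(handle("What is the status of my stimulus check?"))
print(f"queued for retraining: {retraining_queue}")
```

In the pandemic example above, the stimulus-check questions would land in the retraining queue on day one instead of going unanswered.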

Careful planning combined with a strategy that continually prioritizes the customer experience will enable enterprises to achieve a smooth, optimized, and efficient transition to the cloud. IT leaders should partner with well-established experts to guarantee the best experience in that transformation.

To learn more about open, extensible, and collaborative contact center solutions, explore the Microsoft Digital Contact Center Platform, powered by Nuance AI, Teams, and Dynamics 365, here.


By Chet Kapoor, Chairman and CEO, DataStax

There is no doubt that this decade will see more data produced than ever before.

But what’s truly going to transform our lives, define the trajectory of each of our organizations, and reshape industries is not the massive volume of data. It’s the unmatched degree to which this data can now be activated in applications that drive action in real time: minute by minute (or even second by second), across work, play, and commerce. Where technology might have been a constraint in the past, it’s now an enabler.

Here, we’ll take a look at why real-time apps are no longer just the domain of internet giants and discuss three ways that your organization can move toward delivering real-time data.

The future is here

IDC predicts that by next year there will be more than 500 million new cloud native digital apps and services – more than the total created over the past 40 years.

We’re already living in this future. We get turn-by-turn driving directions while listening to an AI-recommended playlist, and then arrive at the exact time our e-commerce order is brought to us curbside – along with a cup of hot coffee.

The real-time data powering apps that change industries is no longer just offered by a Google or a Spotify.

Companies like Target excel at it. The retailer delights customers with an app that shows users what they most want to see, ensures no one ever misses a deal, has a near-perfect record of intelligent substitutions for out-of-stock items, and gets users their orders on their terms (and it might just include a drink from Starbucks, another enterprise that is a real-time app powerhouse).

Smaller businesses are making real-time data core to their offerings, too. Ryzeo offers a marketing platform that leverages real-time data generated by events on its clients’ e-commerce websites. An item that a shopper views or searches for instantly results in an AI-driven recommendation through its “suggested items.”  Real-time data – and the technology that supports it – is how Ryzeo makes this happen. 

Inaction isn’t an option

The door is open to you and your organization, too.

The best-of-breed technologies that power winning real-time apps are open source and available as a service, on demand to all. There are tons of proven use cases across industries. When you leverage these use cases and technologies, there’s a big payoff – you increase your organization’s ability to innovate and turn data into delightful customer experiences.

This will not only transform how your business grows, but how your business works.

As consumers, we never want to go back to dumb apps that evolve slowly, don’t know our context, and fail to act intelligently on our behalf. In fact, we desire the opposite.

When you put the customer’s digital experience at the center of agile workflows, make fast decisions, and rapidly iterate, you create a powerful feedback loop. Every win shows the power of a new and more fulfilling way of working. So does every failure – by providing valuable learnings.

The one thing you can count on is that inaction is not an option. And at this moment in time, why would we want to wait?

There is no doubt that real-time data can reduce waste, increase safety, help the environment, and make people happier and healthier. And we’re only just getting started.

So how do you get started? You can make three important choices right now to set your organization on a path to excel at delivering real-time data.

Step 1: Pick up the right tools

The technology to deliver outstanding, data-powered, real-time experiences has arrived – and we’ve got it in spades. The best-of-breed tools are open source. They grew out of the “best of the internet” to solve novel problems of scale and data velocity. Apache Cassandra®, for example, was developed at Facebook to manage massive amounts of messaging data.

Joining the open source ecosystem means you don’t have to reinvent the wheel. This is important because what sets your organization’s real-time data experiences apart won’t be the infrastructure. It’ll be how you put your domain knowledge to use in new ways that delight your users.

Most of these technologies are available to everyone, on demand, as a service. If you didn’t add them to your data infrastructure yesterday, do it today.

Step 2: Assemble the right teams

When every company is a software company, every executive must also be a software executive. This includes your line of business owners, general managers, and functional leaders.

Winning companies reorganize team structures and accountability to match. The days of data scientists experimenting alone in an ivory tower and developers working under requirements that were “thrown over the wall” to IT are over. “The business” can no longer think of data and technology as “IT’s problem.”

All of your employees need to be trained to identify and capitalize on opportunities for using data and technology to drive business results. Your line of business owners must be held accountable for making it happen.

To empower them, assign your developers, data scientists, and technical product managers to cross-functional teams working side-by-side with the business domain colleagues who own customer experiences. This is a ticket out of “pilot purgatory” and a key to democratizing innovation across your company.

Step 3: Ask the right questions

As you advance on your journey, more and more smart systems will be working every minute of every day to answer your industry’s key questions, like “what’s the most compelling personalized offer for this customer?” or “what’s the optimal inventory for each store location?”

What those systems can’t do is ask questions that only humans can, such as “how do we want to evolve our relationship with our customers?” Or “how can we deploy our digital capabilities in ways that differentiate us from our competitors?”

No algorithm is going to kick out the brilliant and empathetic idea to “Show Us Your Tarzhay,” which turned what might have otherwise been the unfortunate necessity of having to shop on a limited budget into the opportunity to celebrate and share a distinctive personal style. Similarly, it took human creativity to expand the concept from clothing into a new category (groceries).

If you take the first two steps listed above, you will start to free up your people’s time to ask creative questions and improve their ability to deliver on the answers using best-of-breed technology. Equip, challenge, and inspire them to think big about where you want to take your customers next, and you’ll get your organization moving in the right direction to provide the benefits of real-time data to your customers.

Learn more about DataStax here.

About Chet Kapoor:

Chet is Chairman and CEO of DataStax. He is a proven leader and innovator in the tech industry, with more than 20 years in leadership at innovative software and cloud companies, including Google, IBM, BEA Systems, WebMethods, and NeXT. As Chairman and CEO of Apigee, he led company-wide initiatives to build Apigee into a leading technology provider for digital business; Apigee, now part of Google, is a cross-cloud API management platform built for a multi- and hybrid-cloud world. Chet successfully took Apigee public before the company was acquired by Google in 2016. Chet earned his B.S. in engineering from Arizona State University.


Since 2015, the Cloudera DataFlow team has been helping the largest enterprise organizations in the world adopt Apache NiFi as their enterprise-standard data movement tool. Over the last few years, we have had a front-row seat in our customers’ hybrid cloud journey as they expand their data estate across the edge, on-premises environments, and multiple cloud providers. This unique perspective of helping customers move data as they traverse the hybrid cloud path has given Cloudera a clear line of sight into the critical requirements that are emerging as customers adopt a modern hybrid data stack.

One of the critical requirements that has materialized is the need for companies to take control of their data flows from origination through all points of consumption, both on-premises and in the cloud, in a simple, secure, universal, scalable, and cost-effective way. This need has generated a market opportunity for a universal data distribution service.

Over the last two years, the Cloudera DataFlow team has been hard at work building Cloudera DataFlow for the Public Cloud (CDF-PC). CDF-PC is a cloud-native universal data distribution service powered by Apache NiFi on Kubernetes, allowing developers to connect to any data source anywhere, with any structure, process it, and deliver it to any destination.

This blog aims to answer two questions:

What is a universal data distribution service?
Why does every organization need it when using a modern data stack?

In a recent customer workshop with a large retail data science media company, one of the attendees, an engineering leader, made the following observation:

“Every time I go to a competitor’s website, they only care about their system. How to onboard data into their system? I don’t care about their system. I want integration between all my systems. Each system is just one of many that I’m using. That’s why we love that Cloudera uses NiFi and the way it integrates between all systems. It’s one tool looking out for the community, and we really appreciate that.”

The above sentiment has been a recurring theme from many of the enterprise organizations the Cloudera DataFlow team has worked with, especially those who are adopting a modern data stack in the cloud. 

What is the modern data stack? Some of the more popular viral blogs and LinkedIn posts describe it as the following:

[Diagram: the modern data stack. Credit: Ben Patterson/IDG]

A few observations on the modern stack diagram:

Note the number of different boxes that are present. In the modern data stack, there is a diverse set of destinations where data needs to be delivered. This presents a unique set of challenges.
The newer “extract/load” tools seem to focus primarily on cloud data sources with schemas. However, based on the 2,000+ enterprise customers that Cloudera works with, more than half the data they need to source from is born outside the cloud (on-prem, edge, etc.) and doesn’t necessarily have schemas.
Numerous “extract/load” tools need to be used to move data across the ecosystem of cloud services.

We’ll drill into these points further.  

Companies have not treated the collection and distribution of data as a first-class problem

Over the last decade, we have often heard about the proliferation of data-creating sources (mobile applications, laptops, sensors, enterprise apps) in heterogeneous environments (cloud, on-prem, edge), resulting in the exponential growth of data being created. What is less frequently mentioned is that during this same time we have also seen a rapid increase in cloud services where data needs to be delivered (data lakes, lakehouses, cloud warehouses, cloud streaming systems, cloud business processes, etc.). Use cases demand that data no longer be distributed to just a data warehouse or a subset of data sources, but to a diverse set of hybrid services across cloud providers and on-prem.

Companies have not treated the collection, distribution, and tracking of data throughout their data estate as a first-class problem requiring a first-class solution. Instead, they built or purchased tools for data collection that are confined to a class of sources and destinations. If you take into account the first observation above—that customer source systems are never just limited to cloud structured sources—the problem is further compounded.


The need for a universal data distribution service

As cloud services continue to proliferate, the current approach of using multiple point solutions becomes intractable. 

A large oil and gas company, which needed to move streaming cyber logs from over 100,000 edge devices to multiple cloud services including Splunk, Microsoft Sentinel, Snowflake, and a data lake, described this need perfectly:

“Controlling the data distribution is critical to providing the freedom and flexibility to deliver the data to different services.”

Every organization on the hybrid cloud journey needs the ability to take control of its data flows from origination through all points of consumption. As I stated at the start of the blog, this need has generated a market opportunity for a universal data distribution service.


What are the key capabilities that a data distribution service has to have?

Universal Data Connectivity and Application Accessibility: The service needs to support ingestion in a hybrid world, connecting to any data source anywhere, in any cloud, with any structure. Hybrid also means supporting ingestion from any data source born outside the cloud and enabling those applications to easily send data to the distribution service.
Universal, Indiscriminate Data Delivery: The service should not discriminate where it distributes data, supporting delivery to any destination, including data lakes, lakehouses, data meshes, and cloud services.
Universal Data Movement Use Cases, with Streaming as a First-Class Citizen: The service needs to address the entire diversity of data movement use cases: continuous/streaming, batch, event-driven, edge, and microservices. Within this spectrum, streaming has to be treated as a first-class citizen, with the service able to turn any data source into streaming mode and support streaming at the scale of hundreds of thousands of data-generating clients.
Universal Developer Accessibility: Data distribution is a data integration problem, with all the complexities that come with it. Dumbed-down, connector-wizard-based solutions cannot address the common data integration challenges (e.g., bridging protocols, data formats, routing, filtering, error handling, retries). At the same time, today’s developers demand low-code tooling with the extensibility to build these data distribution pipelines.
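To make the routing, filtering, error-handling, and retry requirements above concrete, here is a deliberately tiny sketch of that loop in application code. Everything in it (destinations, record shapes, the send function) is hypothetical; a service like CDF-PC expresses these flows declaratively rather than in hand-written code:

```python
import time

# Toy sketch of a data-distribution loop: filter records, route each to
# the right destination, and retry failed deliveries.

def route(record):
    # Routing: cyber logs go to the SIEM, everything else to the lakehouse.
    return "siem" if record["type"] == "cyberlog" else "lakehouse"

def deliver(destination, record, send, retries=3):
    # Error handling: retry transient failures, then surface the record.
    for _ in range(retries):
        try:
            send(destination, record)
            return True
        except ConnectionError:
            time.sleep(0)  # placeholder for real back-off
    return False  # exhausted retries; route to a failure queue in practice

delivered = {"siem": [], "lakehouse": []}

def fake_send(destination, record):
    delivered[destination].append(record)

records = [
    {"type": "cyberlog", "msg": "failed login"},
    {"type": "clickstream", "msg": "page view"},
    {"type": "cyberlog", "msg": "port scan"},
]
for record in records:
    if record["msg"]:  # filtering: drop empty records
        deliver(route(record), record, fake_send)

print({dest: len(batch) for dest, batch in delivered.items()})
```

Multiply the handful of routes here by hundreds of sources and destinations and the case for a managed, declarative service becomes clear.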

Cloudera DataFlow for the Public Cloud, a universal data distribution service powered by Apache NiFi

Cloudera DataFlow for the Public Cloud (CDF-PC), a cloud-native universal data distribution service powered by Apache NiFi, was built to solve the data collection and distribution problem with four key capabilities: connectivity and application accessibility, indiscriminate data delivery, streaming data pipelines as a first-class citizen, and developer accessibility.


CDF-PC offers a flow-based, low-code development paradigm that provides the best impedance match with how developers design, develop, and test data distribution pipelines. With more than 400 connectors and processors across the ecosystem of hybrid cloud services, including data lakes, lakehouses, cloud warehouses, and sources born outside the cloud, CDF-PC provides indiscriminate data distribution. These data distribution flows can then be version-controlled in a catalog where operators can self-serve deployments to different runtimes, including cloud providers’ Kubernetes services or function services (FaaS).

Organizations use CDF-PC for diverse data distribution use cases, ranging from cybersecurity analytics and SIEM optimization via streaming data collection from hundreds of thousands of edge devices, to self-service analytics workspace provisioning and hydrating data into lakehouses (e.g., Databricks, Dremio), to ingesting data into cloud providers’ data lakes backed by their cloud object storage (AWS, Azure, Google Cloud) and cloud warehouses (Snowflake, Redshift, Google BigQuery).

In subsequent blogs, we’ll deep dive into some of these use cases and discuss how they are implemented using CDF-PC. 

Get Started Today

Wherever you are on your hybrid cloud journey, a first-class data distribution service is critical for successfully adopting a modern hybrid data stack. Cloudera DataFlow for the Public Cloud (CDF-PC) provides a universal, hybrid, streaming-first data distribution service that enables customers to gain control of their data flows.

Take our interactive product tour to get an impression of CDF-PC in action or sign up for a free trial.
