Over the last few months, both business and technology worlds alike have been abuzz about ChatGPT, and more than a few leaders are wondering what this AI advancement means for their organizations. Let’s explore ChatGPT, generative AI in general, how leaders might expect the generative AI story to change over the coming months, and how businesses can stay prepared for what’s new now—and what may come next.

What is ChatGPT?

ChatGPT is a product of OpenAI. It’s only one example of generative AI.

GPT stands for generative pre-trained transformer. A transformer is a type of AI deep learning model that was first introduced by Google in a research paper in 2017. Five years later, transformer architecture has evolved to create powerful models such as ChatGPT.

ChatGPT significantly increased the number of tokens it can accept (4,096 tokens vs. 2,049 in GPT-3), which effectively allows the model to “remember” more about a current conversation and inform subsequent responses with context from previous question-and-answer pairs. Once the maximum number of tokens is exceeded, the conversation loses its earlier context—reminiscent of a conversation with Dory from Pixar’s Finding Nemo.
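
To make the token budget concrete, here is a minimal sketch, assuming the open-source tiktoken tokenizer and the 4,096-token figure above, of how an application might trim conversation history so the model keeps as much recent context as fits:

```python
import tiktoken  # OpenAI's open-source tokenizer (pip install tiktoken)

MAX_TOKENS = 4096  # illustrative budget, per the figure above
enc = tiktoken.get_encoding("cl100k_base")

def trim_history(messages: list[str]) -> list[str]:
    """Drop the oldest messages until the whole conversation fits the budget."""
    while sum(len(enc.encode(m)) for m in messages) > MAX_TOKENS:
        messages = messages[1:]  # the earliest context is forgotten first
    return messages
```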

ChatGPT was trained on a much larger dataset than its predecessors, with far more parameters: 175 billion, compared with 1.5 billion for GPT-2 (2019), 137 billion for Google’s LaMDA (2021), and 0.3 billion for Google’s BERT (2018). These attributes make it possible for users to inquire about a broad set of topics.

ChatGPT’s conversational interface is a distinguishing way of accessing its knowledge. This interface, paired with a larger token window and an expansive knowledge base built on many more parameters, helps ChatGPT seem quite human-like.

ChatGPT is certainly impressive, and its conversational interface has made it more accessible and understandable than its predecessors. Meanwhile, however, many other labs have been developing their own generative AI models. Examples are emerging from Microsoft, Amazon Web Services, Google, IBM, and more, plus from partnerships among these players. The frequency of new generative AI releases, the scope of their training data, the number of parameters they are trained on, and the tokens they can take in will continue to increase. There will be more developments in the generative AI space for the foreseeable future, and they’ll become available rapidly. It was roughly 15 months from GPT-2 (February 2019) to GPT-3 (May 2020), 2.5 years from GPT-3 to ChatGPT (November 2022), and only four months from ChatGPT to GPT-4 (March 2023).

How ChatGPT and generative AI fit with conversational AI

[Image: how generative AI fits within conversational AI (Protiviti)]

Text-based generative AI can be considered a key component in a broader context of conversational AI. Business applications for conversational AI have, for several years already, included help desks and service desks. A natural language processing (NLP) interpretation layer underpins all conversational AI, as you must first understand a request before responding. Enterprise applications of conversational AI today leverage responses from either a set of curated answers or results generated from searching a named information resource. The AI might use a repository of frequently asked questions (producing a pre-defined response) or an enterprise system of record (producing a cited response) as its knowledge base.
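
As a toy illustration of the curated-answer pattern (the FAQ repository, threshold, and answers below are hypothetical), the response layer can be as simple as matching an interpreted request against pre-defined, vetted responses:

```python
import difflib

# Hypothetical curated knowledge base: question -> pre-defined, vetted answer
FAQ = {
    "how do i reset my password": "Use the self-service portal at ...",
    "how do i request a new laptop": "Open a hardware ticket with ...",
}

def answer(user_question: str) -> str:
    match = difflib.get_close_matches(user_question.lower(), FAQ.keys(), n=1, cutoff=0.6)
    # A curated response is returned verbatim, so it is always traceable to its source
    return FAQ[match[0]] if match else "Let me route you to a human agent."
```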

When generative AI is introduced into conversational applications, it is impossible today to provide answers that include the source of the information. The nature of a large language model’s generative capability is to create a novel response by compiling and restructuring information from a body of information. This becomes problematic for enterprise applications, as it is often imperative to cite the information source to validate a response and allow further clarification.

Another key challenge of generative AI today is its obliviousness to the truth. It is not a “liar,” because that would indicate an awareness of fact vs. fiction. It is simply unaware of truthfulness, as it is optimized to predict the most likely response based on the context of the current conversation, the prompt provided, and the data set it was trained on. In its current form, generative AI will oblige whatever it is prompted for, which means a leading question may cause the model to produce false information. Any rules or restrictions on responses today are built in as an additive “safety” layer outside of the model construct itself.

For now, ChatGPT is finding most of its applications in creative settings. But one day soon, generative AI like ChatGPT will draw responses from a curated knowledge base (such as an enterprise system of record), at which point more organizations will be able to apply generative AI to a variety of strategic and competitive initiatives, since some of these current challenges will have been addressed.
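
One hedged sketch of how that could work is the retrieval pattern below: fetch a passage from a curated store, constrain the model to it, and carry the source ID through to the response. The document store, scoring function, and `generate` callable are all hypothetical placeholders, not any vendor’s API:

```python
# Hypothetical curated store: (source_id, passage) pairs from a system of record
DOCS = [
    ("policy-042", "Refunds are issued within 14 days of a returned item."),
    ("policy-108", "Gift cards are non-refundable."),
]

def retrieve(question: str) -> tuple[str, str]:
    # Toy relevance score: shared-word count (a real system would use embeddings)
    words = set(question.lower().split())
    return max(DOCS, key=lambda d: len(words & set(d[1].lower().split())))

def answer(question: str, generate) -> str:
    source_id, passage = retrieve(question)
    prompt = ("Answer using ONLY this passage:\n" + passage +
              "\nIf the passage is insufficient, say so.\nQuestion: " + question)
    return generate(prompt) + f" [source: {source_id}]"  # response stays citable
```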

Leaders can start preparing today for this eventuality, which could come in a matter of months if recent developments indicate how fast this story will continue to move: in November of 2022, ChatGPT was only accessible via a web-based interface. By March of 2023, ChatGPT’s maker OpenAI announced the availability of GPT-3.5 Turbo, a model exposed through an application programming interface (API) via which developers can integrate ChatGPT into their applications. The API’s availability doesn’t resolve ChatGPT’s inability to cite sources in its responses, but it indicates how rapidly generative AI capabilities are advancing. Enterprise leaders should be thinking about how advances in generative AI today could relate to their business models and processes tomorrow.
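
For a sense of what that integration looks like, here is a minimal call using the openai Python package as it existed at the time of writing (the API key, prompts, and use case are placeholders):

```python
import openai  # pip install openai; the v0.27-era interface

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful enterprise assistant."},
        {"role": "user", "content": "Summarize our Q4 results in two sentences."},
    ],
)
print(response["choices"][0]["message"]["content"])
```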

What it takes to be ready

Organizations that have already gained some experience with generative AI are in a better position than their peers to apply it one day soon. The next impressive development in generative AI is likely less than six months away. How can organizations find or maintain an edge? The principles of preparing for the great “what’s next?” remain the same, whether the technology in question is generative AI or something else.

It’s hard to achieve a deep, experiential understanding of new technology without experimentation. Leaders should define a process for evaluating these AI technology developments early, as well as an infrastructure and environment to support experimentation.

They should respond to innovations in an agile way: starting small and learning by doing. They’ll keep track of innovation in the marketplace and look for opportunities to refresh their business and competitive strategies as AI advances become available to them.

They should seed a small cross-functional team to monitor these advancements and experiment accordingly. Educate that team about the algorithms, data sources, and training methods used for a given AI application, as these are critical considerations for enterprise adoption. If they haven’t already, leaders should develop a modular and adaptable AI governance framework to evaluate and sustain solutions, specifically covering generative capabilities, such as the high-level outline below:

[Image: high-level outline of a modular AI governance framework (Protiviti)]

Leaders need not simply wonder what ChatGPT, other generative AI, and other revolutionary technologies might mean for their business and competitive strategy. By remaining vigilant to new possibilities, they can create the environment and infrastructure that supports identification of new technology opportunities and be prepared to embrace the technology as it matures for enterprise adoption.

Learn more about Protiviti’s Artificial Intelligence Services.


Christine Livingston
Managing Director, Technology Consulting

Artificial Intelligence, Machine Learning

The sixth annual report from Tech Talent Charter (TTC) has revealed that while companies in the UK are making progress toward improving diversity in their overall workforce, there is still a significant lack of diversity among senior technology leaders.

The not-for-profit charity, which focuses on tracking diversity in technology, compiled its report using data from 649 signatory companies, including Global, HP, Lloyds Banking Group, Nominet, PwC and CWJobs. The 210,245 employees included in the data set are estimated to represent around 16% of the UK’s technology workforce.

The Tech Talent Charter is free to sign — the only obligation signatories are required to meet is sharing their data with the charity when requested. It is only mandatory for signatories to share gender and ethnicity data, but this year’s report represents the first time the TTC has started to track other aspects of diversity, including age, disability, sexual orientation, religion and neurodiversity.

The aim of collecting this data set is to try to understand what is actually happening at the coalface of diversity and inclusion in tech, since not being able to fill shortages in the tech talent market costs the UK economy about £63 billion a year, according to Tech Talent Charter COO Lexie Papaspyrou.

Commenting on the report’s key takeaways, Papaspyrou said that while it’s heartening to see that 28% of tech workers are gender minorities and 25% are from minority ethnic backgrounds, when those figures are compared to the percentage currently holding senior leadership positions, the drop-off is alarming.

The data collected by TTC found that 22% of senior tech roles are held by gender minorities, a figure that is six percentage points lower than for tech roles overall, while ethnic diversity almost halves in senior roles, dropping from 25% to 13%.

There’s a pervasive idea that these figures just highlight the fact that a large percentage of women naturally leave the workforce at a certain point to start families, but that doesn’t explain the drop-off experienced by people from an ethnic minority background, Papaspyrou said.

“There are no natural barriers that exist for ethnic diversity and senior roles in the same way that maybe you could argue exist with regards to gender,” she said. “There is a gendered societal problem that women are dropping out of the workforce because they need to take parental or career breaks, but that’s not the case when it comes to ethnicity.”

Papaspyrou added that she doesn’t believe it’s a coincidence that, for senior roles, the data for gender parity is more positive than the data for ethnic parity, given that the UK government has made gender pay gap reporting mandatory, but not ethnicity pay gap reporting.

When it comes to D&I, data is key

One of the founding tenets of the Tech Talent Charter is the importance of data, so much so that if a company fails to provide TTC with the information required of it, it is removed as a signatory.

One of the ways the charity is helping organizations better understand where they are on their D&I journey is through dynamic benchmarking: a new, freely available tool lets companies input their diversity figures and see how they compare to other organizations of the same size in their region and sector, Papaspyrou said.

“Those are the three areas we are constantly questioned about by companies,” she said. “They say, ‘I don’t know how to contextualize my figures because the publicly available ones are across all UK companies and I’m a small SME in the North East, so that isn’t relevant to me’.”

For TTC, while there is still some churn among organizations that refuse to hand over their data, many companies are entering their fifth or sixth year as signatories to the charter.

As a result, their support and willingness to provide more data has led to TTC being able to ask questions through eight diversity lenses, with neurodiversity emerging as a distinct area of interest, Papaspyrou said. Among current signatories, 53% are now measuring neurodiversity among employees, a figure that has doubled from last year.

Measurement of social mobility lags

However, Papaspyrou was concerned about the data gap that exists regarding the measurement of social mobility, which lags far behind other areas, including age, religion and orientation.

“We need to be looking at these intersections and social mobility is the one that falls across every single other lens and has such massive tangible impact on what works and what doesn’t work,” Papaspyrou said, adding that it was dismaying to see that lack of reporting, particularly in a year where the country is going through such economic hardship.

“When you look at the technology industry and where tech salaries are, in some cases three times the average UK salary, the fact that more organizations are not focusing on this is a big opportunity lost,” she said.

Looking forward to the next 12 months, Papaspyrou said TTC will be working with its signatories so that when the next data set is collected, organizations are ready to tell the truth, whether the data they have is good, bad, or unavailable.

There will be a lot of activity focused on how to improve progression at higher levels and removing the barriers that are stopping diverse employees from reaching those roles, Papaspyrou said.

“It seems like the entire business community here has really picked up on this idea that you can pack as many people into the tech workforce as you want. And, while it’s great and you’re getting them in, if they’re not getting on, why are we doing it?” she said.

Diversity and Inclusion

IBM reported net income of $2.9 billion in the fourth quarter of 2022 and year-on-year increases in revenue across all three of its business segments.

That’s an increase in net income of 9% compared to the total reported for the corresponding quarter of 2021, or 17% comparing only continuing operations: IBM spun off most of its infrastructure management division as a new business, Kyndryl, in November 2021, and sold some assets of its Watson Health business in January 2022.

On a conference call with analysts to discuss the results, CFO Jim Kavanaugh alluded vaguely to this leaving IBM with some stranded costs in its business, saying, “We expect to address these remaining stranded costs early in the year and anticipate a charge of about $300 million in the first quarter.”

Later that day, in an interview with Bloomberg, Kavanaugh explained that eliminating those stranded costs — staff left with nothing to do following the asset disposals —  would result in IBM cutting about 3,900 jobs, or 1.5% of its workforce. An IBM representative said the company had “nothing further to add.”

IBM CEO Arvind Krishna said on the call with analysts that the company’s fourth quarter and full year results demonstrate the successful execution of its hybrid cloud and AI strategy.

He noted that IBM was continuing to invest in large language and foundation models — the technologies behind tools like ChatGPT — and is infusing these capabilities across the company’s AI portfolio. In addition, Krishna said that business with strategic partners, including SAP, Microsoft and AWS, had generated over $1 billion in revenue for the year.

“I’m confident in our ability to leverage hybrid cloud and AI to help clients turn business challenges into opportunities,” he said.

Growth across all lines of business

Krishna told analysts that IBM had delivered strong revenue growth across its business, with results “broad-based” across its software, consulting, and infrastructure segments as well as across geographies.

Software sales for the fourth quarter rose 2.8% from a year earlier, to $7.3 billion, while infrastructure sales rose 1.6% to $4.5 billion.

Consulting revenue also grew, but by just 0.5%, to $4.8 billion. If it continues at that rate, it will fall behind the market: Gartner forecast that global spending on consulting will grow 6.7% this year to $264.9 billion.

IBM’s software segment was boosted by sales of its hybrid platform and solutions, up 5%, while automation, data and AI, and security all contributed growth of 4%; transaction processing revenue was down by 3%. Revenue generated by Red Hat accounted for the biggest area of growth in this segment, up 10% from a year earlier.

The biggest driver in the infrastructure segment was the zSystems line of mainframe computers, up 16% after the z16 model became generally available in May. However, distributed infrastructure revenue remained flat and infrastructure support was down by 8% year on year.

Kavanaugh said on the conference call that he hopes to squeeze another $200 million of profit from the infrastructure segment in 2023 by extending the period over which it amortizes the cost of its IT assets. “Due to advances in technology, we are making an accounting change to extend the useful life of our server and networking equipment,” he said. “Given this is a change to the depreciation, there’s no benefit to cash.”

IT Consulting Services, Multi Cloud

The pandemic has led many organizations in the Middle East to shift towards a digital-first strategy. According to IDC’s group vice president and regional managing director for the Middle East, Turkey, and Africa, Jyoti Lalchandani: “This means choosing digitalization options over non-digital options as a rule while implementing or enhancing new products, services, channels, customer/employee experiences, or operational processes.”

While many organizations in the Middle East region already had a digital-first strategy before the pandemic, over 40% of the CIOs of medium and large organizations IDC surveyed over the last month say they have shifted towards being “digital-first” because of the pandemic. A quarter of the respondents say they are now extending the digital-first strategies developed during the pandemic.

The Middle East is on the edge of a massive digital disruption. Companies and governments are aware of the benefits of new technologies and digitization: optimizing costs and operating resources, ensuring customer satisfaction, attracting new customers, and gaining a competitive advantage through digital adoption.

GCC countries are increasingly looking to invest in digital. Saudi Arabia has created its Saudi Vision 2030 program, and the United Arab Emirates is driving initiatives through its UAE Digital Government Strategy 2025. The UAE’s program aims to double the contribution of the digital economy to the country’s GDP from 9.7% to 19.4% over the next 10 years, and the strategy is already bearing fruit.

Jyoti Lalchandani highlights flagship gigaprojects in Saudi Arabia such as NEOM, the Red Sea Project, AMAALA, and Ad Diriyah that will drive significant technology spending in 2023 and beyond, as the Kingdom builds greenfield digital infrastructure and platforms, leveraging advanced technologies like AI/ML, IoT, edge, and 5G to create innovative use cases.

At the same time, as organizations across the region become more digitally mature, they must transition from enabling digital transformation to running a digital business. This requires them to derive a larger share of revenues from digital products, services, channels, and platforms, something that is now a priority for 50% of the Middle East CIOs surveyed by IDC.

“As their organizations make this transition over the coming 12-18 months, CIOs will be increasingly challenged to accelerate the digitalization of operations (process automation, reengineering, and productivity improvements) and deliver insights at scale across the organization by building capabilities in data and enterprise intelligence,” adds Lalchandani.

IDC says we can expect to see a considerable jump in interest around the metaverse over the course of 2023. IDC research shows that while only 2% of CIOs in the Middle East currently have a commercial use for the metaverse, 20% are planning to develop one in the next 12 months.

LEAP and Gitex: Must-attend events in the region

LEAP is Saudi Arabia’s global tech event focused on bringing together leading tech corporations to inspire the next generation of start-ups and venture capitalists. The first edition of the conference was held in February last year and attracted more than 100,000 visitors. LEAP 2023, the second edition of the conference, has been launched with the theme ‘Into New Worlds’ and will be held February 6-9 at the Riyadh Front Expo Centre in Saudi Arabia.

During Gitex week, Dubai was the stage for Chinese company XPeng’s flying car to make its first public trip. Also last year at Gitex and Gisec (the Gitex event focused on cybersecurity), the topic of women in IT came to the fore, with a full-day panel exploring initiatives to close the gender gap in technology. On the cybersecurity side, much of the discussion was about how to increase the percentage of women in cybersecurity roles; women make up only 25% of the global cybersecurity workforce, according to the latest (ISC)² Cybersecurity Workforce Study.

“Nowadays, only 25% of women are studying computer science. In 1985 that percentage was 37%. We are attracting fewer women to come to the university,” said Inass Farouk, marketing director at Microsoft UAE, during the conference. At the same time that women are so underrepresented, the cybersecurity industry is woefully short of staff; (ISC)² estimates that it urgently needs 2.5 million more professionals.

This year, Gitex will take place from October 16-20, and for the first time in the MEA region, Morocco will host the Gitex Africa edition in June.

What will CIOs in the region be focusing on and why?

Cloud is undoubtedly becoming the de facto operating platform for digital business, and we can expect to see a greater number of CIOs across the region move beyond the initial stages of cloud adoption to focus more on the broader implementation of business apps in the cloud.

Hybrid multi-clouds will become the norm in 2023, with spending on both public and private clouds continuing to increase, and application modernization will accelerate as organizations look to transform their app estates leveraging the cloud, according to IDC’s Jyoti Lalchandani.

Sustainability will be another key theme throughout the year, with around one-third of the Middle East CIOs surveyed by IDC having already started the data discovery process to identify the internal systems that house the key data required to drive sustainability initiatives. This year COP28, the 28th session of the Conference of the Parties, will be held at Dubai Expo City from November 30 to December 12 to discuss and find solutions for climate change.

Technology Industry

Fueled by strong sales of cloud-based software that more than offset a decline in revenue from on-premises applications, SAP revenue jumped in the third quarter compared to the year-earlier period.

Total revenue for the quarter ending Sept. 30 was €7.84 billion (US$7.72 billion), up 15%, according to the company’s quarterly financial report, released Tuesday. SAP’s cloud and software revenue for the quarter rose 14% to €6.71 billion.

Cloud revenue alone rose 38% to €3.28 billion. Revenue for the company’s S/4HANA cloud-based ERP offering, in particular, nearly doubled, rising 98% to €546 million.

SAP’s results benefited from the strong US dollar as dollar sales were converted to euros — for example, overall revenue growth in constant currency terms, which exclude the effect of currency fluctuations, was 5% rather than the nominal 15%.

Nevertheless, the uptick in cloud sales is good news for SAP, since the company has been pushing customers to migrate from its legacy Business Suite 7, which is usually run on-premises, ever since S/4HANA was launched in 2015. Selling cloud software and services is good business for SAP and other software providers, since, among other things, cloud subscriptions provide a more predictable revenue stream than renewals of licenses for on-premises applications.  

Despite the jump in revenue, SAP posted a 1% dip in operating profit to €1.239 billion due largely to rising costs in areas including research and development, sales and marketing, and the need for more outlay on the maintenance of the company’s cloud services. Licenses for on-premises software declined 38% to €406 million, reflecting customer migration to cloud-based software.

Unsurprisingly, the Germany-based software giant was most eager to talk about its success in the cloud.

“The trust in SAP is reflected in our accelerating cloud momentum,” said CEO Christian Klein in the company’s earnings press release. “It’s clear that our transformation has reached an important inflection point, paving the way for continued growth in the future.”

Despite the negative news on operating profit, investors greeted SAP’s results warmly, with the company’s share price rising $5.69—or 6.25%—to $96.70 in mid-afternoon trading in the US.  

Like much of the tech sector, SAP is facing headwinds caused by the ripple effects of Russia’s invasion of Ukraine, as well as a bearish economic climate. Hence, the positive news on the company’s cloud business appears to have been enough to inspire confidence in SAP’s overall outlook.

SAP has made major headway on its transition from a license- and maintenance-based business to a usage-based, cloud-first company, due in part to a long string of acquisitions in the past decade in addition to the introduction of S/4HANA, which brings the company’s ERP platform to the public cloud of the customer’s choice.

However, the company has a long way to go. Just 12% of current and intended SAP ERP users in the US and Europe responding to a recent survey by digital transformation services provider LeanIX have completed the transition to S/4HANA. Another 12% said they have postponed the start of their move to S/4HANA, and 74% of enterprises that were polled are just at the evaluation and planning phase of their ERP migration.

ERP Systems, Technology Industry

It’s Business 101: A company exists to make money. So as a CIO, Ajay Sabhlok believes his mandate is “to figure out how to generate revenue for the company.”

Sabhlok, CIO and chief data officer of security technology vendor Rubrik, says he does that by searching for unmet needs, bottlenecks, and problem areas and then considering how technology can turn those around.

Case in point was the company’s lead-to-cash process. Data showed that the company wasn’t closing expected orders, which was showing up as lost revenue in quarterly reports, Sabhlok says. So, he and his team identified and articulated the need for a more advanced opportunity management process — one that has an engine for more accurate scoring of business leads, automates manual tasks that were holding up orders, and delivers data-driven insights through user-friendly dashboards.

The efforts paid off: More leads converted to sales, and as a result, the company experienced a boost in quarterly revenue figures.

And all because IT saw a need and took the lead in co-creating with sales and marketing a solution aimed at generating revenue for the business. “Here we saw we were losing business and saw opportunities for improvement, and we brought that to the table as a way to improve revenue,” Sabhlok says.

Revenue generation: A new IT imperative

As Sabhlok’s experience shows, the CIO role continues to evolve. It has moved beyond focusing only on uptime and availability — the so-called “lights-on” function — and has even advanced past prioritizing cost-cutting and efficiency gains. Now the position is entrenched in the executive suite, where it faces the requirement to partner with the business and strategize on how technology can transform the company’s value proposition to customers.

That’s evident in the 2022 Tech Trends survey from Info-Tech Research Group. The firm polled CIOs about their priorities and found that business process improvements, digital transformation or modernization, and security were the top three. What came fourth? Supporting revenue growth.

“This is something that’s going to be expected, that the technology inside the organization, regardless of the sector, will be expected to bring value in revenue or market valuation,” says Nicola Morini Bianzino, global CTO for professional services firm EY. “CIOs have to be a technology consultant and advisor. They can’t just run an efficient machine.”

Increasing expectations for top-line growth

The expectation for CIOs to contribute to delivering business revenue has been building over recent years, according to multiple sources, who note that retail CIOs paved the way, creating omnichannel experiences and tech-enabled features such as virtual try-on capabilities, as did CIOs at companies whose business models were built entirely on IT.

“IT was largely doing that work, and that’s clearly aligned with generating revenue,” says Karena Man, a technology consultant who leads the data practice and the West Coast Tech Officers practice at management consulting firm Egon Zehnder.

To be clear, this call for CIOs to directly boost revenue goes beyond delivering value — something CIOs are also increasingly expected to do. To that end, CIOs are indeed partnering with their business unit colleagues and supporting enterprise strategy as well as calculating the value that IT provides on those fronts. And those efforts do, in fact, support the company’s ability to make money.

But those efforts are not always a straight line to increased sales, higher margins, and/or greater market share. The work done by most CIOs “still skews more toward the internal production of systems that support the business overall,” says Kim Villeneuve, co-founder of bluSPARC, a coaching and executive development company, and CEO of Centerstone Executive Search & Consulting.

Much of that work remains about efficiency gains and reduced friction — which, again, is important but not revenue-generating. It’s not surprising, then, that the ability for CIOs to operate in ways that specifically and directly grow revenue remains a tall order.

There are reasons for that, as being a revenue-generating CIO requires a different approach than that traditionally taken by the CIO, says Bobby Cameron, vice president and principal analyst with Forrester Research.

“It’s easy for more traditional IT shops to be preoccupied with being an order-taker,” he says, explaining that CIOs who want to impact revenue figures need to shift to “focusing on and measuring what the company is doing in terms of its financial performance.”

He adds: “That may sound obvious, but the maturity isn’t there; about 59% of IT organizations don’t yet focus on business outcomes.”

Essential steps for CIOs to generate revenue

Cameron, CIO advisors, and revenue-generating CIOs point to several key elements that enable IT teams to specifically and directly boost the top line and not just improve bottom-line numbers.

1. Know business objectives and determine how IT can impact them

CIOs have been schooled to know the business, speak the language of the business, partner with the business. But Egon Zehnder’s Man advocates for an even more active engagement between IT and the business, so that IT can identify opportunities where technology can boost sales or improve margins.

“They’re more willing to engage in learning,” she explains. “There’s a genuine desire to make the lives of their customers better. And that’s what drives them to go out and engage with the buyers to learn about what’s missing.”

John Abel, senior vice president and CIO of tech company Extreme Networks, says he’s taking that approach. “Every business case we put through has to have a determined business outcome,” he says.

He’s not talking about traditional IT objectives such as improved operational efficiency or lowered costs but true business goals like improved customer satisfaction, customer engagement, and sales processes — “which all lead to increased revenue.”

Abel says he identifies those opportunities within the IT portfolio that “I think will deliver the most gains in revenue. And those are the ones we want to invest in, because our business strategy is growth.”

He uses customer-centric metrics to determine whether IT initiatives actually have a direct impact on revenue. If so, “those are the ones we bring forward.”

Abel points to his team’s work on the company’s web and digital presence, where IT had determined that existing technology created friction that limited sales growth. So he and his team identified and delivered capabilities, such as more advanced analytics, that would drive up more orders. That means, quite simply, more sales and more revenue.

Others echo this idea, saying CIOs need to focus on and actually co-own business outcomes.

Here, Cameron cites the importance of business capability mapping, a process for identifying and modeling or depicting what the business does to reach its objectives. Cameron says the CIO and the IT team can use this approach to identify how and where technology can be most impactful in the chain of capabilities that fall under business objectives.

To illustrate this idea, he points to a vision products company whose sales agents would schedule equipment demos for prospective customers (typically eye doctors). The CIO used business capability mapping to understand all the capabilities — initial sales calls, the demos, follow-up conversations, and so on — that went into making a sale. This enabled the CIO to move past discussions of what systems supported the work to identifying what could change the actual business outcome itself — that is, the number of closed sales.

“The CIO can see and show where IT can impact the performance of that objective, and the CIO can track and report on that,” Cameron explains. “So if the objective is expanding sales in a particular state, the CIO can talk about what IT changes to make that happen.”

2. Adopt a product orientation

One key way to change business outcomes (and thus create the ability for IT to help with revenue generation) is implementing a product-based approach in place of the traditional project-based structure of IT delivery, according to CIOs and executive advisors.

“A shift to a product-centric point of view means I’m not just worrying about the CRM, I’m actually worried about the flow of value creation,” Cameron explains.

Many CIOs are already shifting this way by adopting agile and DevOps software development methods, both of which center around the notion of iterative product development and creating features and functions that meet needs.

That’s a start, but experts say CIOs need to get cross-functional product teams, led by skilled product managers, focused on meeting the targeted business outcomes and holding them accountable for delivering those outcomes. Cameron says he has seen some companies implement bonuses for CIOs and even those within IT who can deliver revenue growth, which is an effective way to ensure product teams focus on the successful delivery of business outcomes.

3. Think and act like an entrepreneur

CIOs that do all this, Man explains, “think about their roles more expansively. They’re less willing to stay in their swim lanes. They think about their responsibilities a little differently. They see unmet needs. In a way, that’s no different than a founder of a startup.”

That entrepreneurial spirit may come naturally to some, but Man and others say it’s also an approach that can be learned and put into practice. CIOs can more actively engage colleagues throughout the business, engage customers about their experiences, become product-centric, reward innovation, enable ways to test ideas, and truly accept failures when they happen.

“It’s understanding where to stop and listen to what the business opportunities are,” Villeneuve says, adding that CIOs who do this well are empathetic, collaborative, nimble, and resilient — and they have practices and processes that support innovation.

Villeneuve says CIOs are often well positioned to be the entrepreneurs within the C-suite because they have a broad perspective of what’s happening across the enterprise — both functionally and geographically. They may, for example, be able to see that a technology, such as an app, is driving up revenue in one region and understand how to adopt that for another market.

“That makes them a solutions-based executive,” she adds.

Abel says he enables entrepreneurship and innovation in part by having his team do what he terms “ride-alongs.” “They go through what a customer or business partner experience is, and we look at competitors and talk to our [company] partners about their experience with our competitors and ask, ‘What is the frustrating part with us versus a competitor,’” he says. “You need to talk to the people closest to driving the business outcome you’re seeking to achieve.”

That approach, however, may not fly in every organization as currently structured, according to sources. Man says CIOs in organizations focused on using IT for efficiency gains likely have what it takes to become entrepreneurial, as they’re “very creative and resourceful in working effectively within a lean environment.” But if they don’t also work on changing colleagues’ expectations, they may find that “they’re not going to be rewarded for being entrepreneurial.”

Consequently, CIOs who want to work in ways that drive up revenue must either help create an organization where that would be welcome and rewarded — or find one where those approaches are already valued.

“Entrepreneurship is messy,” Man adds. “You have to work in an environment that encourages that and is OK with that, and not punitive for it.”

CIO, IT Leadership

Without a doubt, one of the key drivers of the Fourth Industrial Revolution is Robotic Process Automation (RPA). Organizations worldwide have increasingly leveraged RPA technology and are now adopting multi-vendor strategies for a multitude of enterprise automation tools, beyond RPA. From recent conversations with the VOCAL Council, I estimate that almost 40% of the nearly 40,000 customers using RPA are deploying a multi-vendor strategy.

RPA tools have evolved from simple bots that automate single, micro tasks or activities to more complex end-to-end, unattended solutions that can automate entire processes and deliver unprecedented benefits. However, automation management is the automation that automation vendors forgot.

At the heart of RPA is the orchestrator. While orchestrators have improved and some have moved to become cloud-based, several have not been rearchitected. As a result, the design debt built over the years (with a focus on selling bots vs. managing them) has limited orchestrators to basic operational bot metrics.

The high cost of orchestrators prevents organizations from fully capitalizing on even the current limited benefits, and integration challenges make it even harder to incorporate multi-vendor orchestrators into the tech stack. The swivel-chair approach (re-keying data from one system into another) that automation is supposed to eliminate is back, with precious resources swiveling to manage multiple automation vendors, extract metrics only to populate Excel and PowerPoint decks, meticulously and manually maintain bot schedules and reschedules, and pray that bot failures are detected early.

Lack of support, missing automation management capabilities, and inadequate/missing self-recovery are just a few of the serious challenges in managing automations.

Automation vs. orchestration: Same pod, different peas?

The easiest way to understand the automation vs. orchestration divide is in terms of “one” vs. “many”. Automation is often about automating a single repetitive task or activity within a process to run on its own and with minimal (or no) human intervention. For example, you could set up an RPA bot to automatically create IT service tickets, another to launch a web server, and a third to change a line of code in JSON files. Each of these bots would continually execute its specific task until you stop it. Repetition is the name of the game. Intelligence and logic – not so much.
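
As a toy sketch of just how narrow such a single-task bot can be (the folder, file format, and field name below are all hypothetical), the JSON-editing example might amount to no more than this:

```python
import json
import pathlib
import time

CONFIG_DIR = pathlib.Path("/configs")  # hypothetical drop folder

def run_bot():
    """One micro task, repeated forever: flip a single value in each JSON file."""
    for path in CONFIG_DIR.glob("*.json"):
        doc = json.loads(path.read_text())
        if doc.get("log_level") != "INFO":
            doc["log_level"] = "INFO"  # the bot's entire "intelligence"
            path.write_text(json.dumps(doc, indent=2))

while True:
    run_bot()
    time.sleep(60)  # repetition is the name of the game
```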

Orchestration, on the other hand, is about automating multiple tasks to work seamlessly together as part of a larger workflow. The effort could involve multiple environments, devices, services, and people. And that’s why it is much more complex than a single bot for a simple task.

This complexity makes it vital to understand the many steps involved during orchestration and how these steps intersect. It also requires seamless coordination to prevent bottlenecks and ensure that enterprises can successfully derive the expected benefits from the effort, whether it’s process optimization, error-free output, accelerated innovation, improved employee experiences, or faster time-to-market.

“Process automation is not as simple as automating every task within that process; there is also a layer of orchestration and task interdependency where most judgment calls and other complexities live. Automating and managing that layer is the toughest part of the journey.”

Max Ioffe, Global Intelligent Automation Leader, WESCO Distribution

Optimization vs. orchestration

Network optimization tools emerged in response to the shortcomings of network orchestrators, and cloud optimization followed a similar path. One can integrate orchestration and scheduling capabilities to create dynamic schedules, build task queues, initiate workflows, and monitor and track executions, all the way to self-recovery and predictive maintenance. You can also incorporate triggering events and advanced logic to automate multiple tasks within a process and ensure timely, efficient, and consistent execution.
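
A bare-bones sketch of the dynamic-scheduling piece (task names and intervals are invented) might look like this: a priority queue keyed on each task’s next run time, with tasks re-queued after every execution:

```python
import heapq
import itertools
import time

counter = itertools.count()  # tie-breaker so the heap never compares callables

def run(tasks):
    """tasks: list of (interval_seconds, callable). Runs them on a rolling schedule."""
    now = time.time()
    queue = [(now + i, next(counter), i, fn) for i, fn in tasks]
    heapq.heapify(queue)
    while queue:
        due, _, interval, fn = heapq.heappop(queue)
        time.sleep(max(0.0, due - time.time()))  # wait until the task is due
        fn()                                     # execute the task
        heapq.heappush(queue, (time.time() + interval, next(counter), interval, fn))

run([(30, lambda: print("poll source system")),
     (300, lambda: print("refresh bot metrics"))])
```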

Over time, these tools can confer benefits like reduced IT costs, higher-quality output, and limited process downtime, and, going a bit further, even provide self-sustainability. Automation promises to remove the mundane, yet automation management is laden with the manual and mundane.

With automation optimization, your precious resources don’t have to worry about mundane activities like managing, scheduling, or restarting bots. The automation optimization tool steps in and boosts your license utilization and infrastructure efficiency while driving higher employee engagement.

“For automation to succeed, having and sustaining optimal utilization is key. Automation optimization tools that integrate with multiple vendors and offer a single pane of glass to manage automations will lead to a better total cost of ownership and perhaps even increased RPA adoption.”

Akash Choudhary, Director Enterprise Architecture, ServiceNow

Sound without music

Benefits notwithstanding, just orchestration – which is almost always part of the overall RPA solution package – has some limitations. For one, many of these solutions are so complex and prohibitively expensive that companies with small IT budgets and teams struggle just to purchase the solution, much less leverage its benefits.

Costs vary, depending on the number of instances you want to deploy, and can be a TCO barrier.[1] Support is often an afterthought, which impacts the quality and timeliness of automation maintenance, change requests, and incident management. The longer the response time to classify, handle, and resolve incidents, the larger the disruption and the potential losses for customers. Qualified support people who understand the architecture of orchestrators are scarce, so customers often face more annoying noise than music, with limited or no recommendations on how to apply appropriate solutions.

Some orchestrators don’t monitor all incidents or provide a holistic view of incidents, which is key for incident monitoring, investigations, and root cause analyses. While it is possible to automate these aspects, many orchestrators don’t provide “automation management” capabilities. Further, they don’t provide a single, centralized “incident box” that can help organizations keep track of all their automation initiatives. Under this scenario, determining the lifecycle metrics of an automation program is almost impossible.

Many customers seek a self-recovery/self-healing capability; sometimes they simply want the orchestrator to restart a service that stops or becomes unresponsive. At other times, they need higher-level orchestrated workflows to automatically start up a new virtual machine (VM), check that its services start correctly, update the DNS, put it into the load balancer, and even shift services to a different data center. In today’s automation environment, if a virtual desktop infrastructure (VDI) instance fails, it cannot automatically resolve the underlying problem, much less reboot on its own in the fastest possible time. Additionally, due to this limitation, the enterprise cannot monitor CPU or memory usage or self-clone machines to dynamically scale capacity up to match peak requirements or scale down (“shutdown mode”) when requirements are low.
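
A minimal sketch of the restart-on-failure behavior customers are asking for (the health endpoint and service name below are hypothetical) could be as simple as a watchdog loop:

```python
import subprocess
import time
import urllib.request

HEALTH_URL = "http://bot-vm:8080/health"              # hypothetical health endpoint
RESTART_CMD = ["systemctl", "restart", "bot-runner"]  # hypothetical service name

def healthy() -> bool:
    try:
        return urllib.request.urlopen(HEALTH_URL, timeout=5).status == 200
    except OSError:
        return False

backoff = 5
while True:
    if not healthy():
        subprocess.run(RESTART_CMD, check=False)  # self-recovery: just restart it
        time.sleep(backoff)
        backoff = min(backoff * 2, 300)  # back off if restarts keep failing
    else:
        backoff = 5
    time.sleep(30)  # poll the service every 30 seconds
```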

Organizations must custom-build an inventory management system for the orchestrator, which can be an arduous and resource-intensive effort. An automation management solution with a built-in or integrated inventory management tool increases the visibility into the automation ecosystem. It provides opportunities to streamline systems maintenance, improve automation efficiency, enhance output quality (lower errors), and reduce downtime.

“You need to reduce costs, streamline, and improve visibility of your RPA bots to scale your automation program. An automation optimization tool to monitor and control bots and derive tangible business value and automation lifecycle metrics is a must.”

Amol Rajamane, Global Digital Automation Leader, DuPont

The rise of automation optimization solutions

In an ideal world, automation workloads – whatever their heritage – should be able to move seamlessly between, and be shared among, automation providers, wherever the optimal combination of performance, functionality, cost, security, resilience, and so on can be found – while avoiding the dreaded “vendor lock-in.” What if your automation could meet your demand without demanding more from you? Automation hopping, at the risk of introducing yet another term on a busy highway of automation terminology, is not a far-off reality. Consumption pricing will pave the highway.

Automation optimization is a category that brings together end-to-end automation orchestration and management capabilities to cut cost, increase performance, and measure business impact – across multiple automation tools.

Several niche solutions tend to focus on just the control aspect of automation management, i.e., addressing portions of the orchestration limitations described above. Automation executives and CFOs alike are often looking for a holistic, cost-efficient solution: an integrated, all-around athlete, versus buying multiple brands of shoes to potentially become an athlete.

An end-to-end automation optimization solution strings together the automation lifecycle: idea generation and classification, document gathering (discovery), building the bot, controlling the bots with dynamic scheduling, license/bots/utilization optimization with dynamic caching and AI-powered bot failure prediction, and deriving key value metrics. This approach provides the necessary operational (bot) metrics, value (KPI) metrics, and lifecycle (idea to value) metrics to determine and communicate the automation value to your organization.

Automation optimization solutions not only address the shortcomings of existing orchestrators but also offer the ultimate outcome challenging and shaping the automation industry today: improved adoption and scale.

A few solutions have emerged in recent years that deliver all these advantages, address the limitations of older platforms, and allow organizations to easily add or remove bots to match their evolving automation needs.

“The cost of bot licenses, infrastructure, and automation management creates a significant dent in the technology budget. To drive customer centricity, automation technology vendors are better served helping their customers improve their existing license, infrastructure, and scheduling utilization rather than pushing more bots. The rise of automation optimization solutions is evidence that there is a major gap in the market.”

Ankit Thakkar, Automation & Finance Digitization Leader, Thermo Fisher

The future of orchestration and optimization solutions

As automation and orchestration technologies develop, many more cutting-edge solutions will emerge for different enterprise use cases. The best automation optimization solutions will allow organizations to capture all the benefits of large-scale automation at the lowest possible TCO. A solution, for example, like Turbotic (disclosure: I sit on Turbotic’s Board) speeds up bot development by allocating development tasks and approval steps appropriately and systematically. It also clearly shows the entire automation opportunity, all the way from ideation to implementation in a single flow. Further, the solution automatically creates an automation business case, boosts bottom-up pipeline generation, and supports value tracking and monitoring to bring greater transparency into the automation ecosystem.

The best automation optimization solutions also provide visually compelling dashboards and real-time metrics to measure these benefits and assess the true value of the automation initiative across the enterprise. In addition, they will effortlessly integrate with existing systems to minimize disruptions and deliver all the advantages promised by enterprise-wide automation.

Advanced orchestrator solutions enable organizations to implement hyperautomation with support for cutting-edge technologies like AI, ML, and cognitive NLP. These “predictive” orchestrators leverage data and learning to better orchestrate all automation using improved predictions and decision support. These capabilities enable enterprises to optimize their automation licenses and resources in order to optimize costs, throughput, compliance, and ROI, and eliminate the chances of costly SLA breaches.

Today’s companies are operating in a highly challenging business landscape with the great resignation, quiet quitting, lack of qualified automation talent, and more. Using orchestration alone, with its current limitations, is not enough.

Orchestrate or optimize? I say both.

[1] Orchestration is often an upfront cost and impacts ROI until one has a critical mass of process automations and bots in production.

Robotic Process Automation

With 65 million vaccine doses to administer at the height of the COVID-19 pandemic, Luigi Guadagno, vice president of pharmacy renewal and healthcare platform technology at Walgreens, needed to know where to send them. To find out, he queried Walgreens’ data lakehouse, implemented with Databricks technology on Microsoft Azure.

“We leveraged the lakehouse to understand the moment,” he says. For Guadagno, the need to match vaccine availability with patient demand came at the right moment, technologically speaking. The giant pharmacy chain had put its lakehouse in place to address just such challenges in its quest to, as Guadagno puts it, “get the right product in the right place for the right patient.”

Previously, Walgreens was attempting to perform that task with its data lake but faced two significant obstacles: cost and time. Those challenges are well-known to many organizations as they have sought to obtain analytical knowledge from their vast amounts of data. The result is an emerging paradigm shift in how enterprises surface insights, one that sees them leaning on a new category of technology architected to help organizations maximize the value of their data.

Enter the data lakehouse

Traditionally, organizations have maintained two systems as part of their data strategies: a system of record on which to run their business and a system of insight such as a data warehouse from which to gather business intelligence (BI). With the advent of big data, a second system of insight, the data lake, appeared to serve up artificial intelligence and machine learning (AI/ML) insights. Many organizations, however, are finding this paradigm of relying on two separate systems of insight untenable.

The data warehouse requires a time-consuming extract, transform, and load (ETL) process to move data from the system of record to the data warehouse, whereupon the data would be normalized, queried, and answers obtained. Meanwhile, unstructured data would be dumped into a data lake where it would be subjected to analysis by skilled data scientists using tools such as Python, Apache Spark, and TensorFlow.

Under Guadagno, the Deerfield, Ill.-based Walgreens consolidated its systems of insight into a single data lakehouse. And he’s not alone. An increasing number of companies are finding that lakehouses — which fall into a product category generally known as query accelerators — are meeting a critical need.

“Lakehouses redeem the failures of some data lakes. That’s how we got here. People couldn’t get value from the lake,” says Adam Ronthal, vice president and analyst at Gartner. In the case of the Databricks Delta Lake lakehouse, structured data from a data warehouse is typically added to a data lake. To that, the lakehouse adds layers of optimization to make the data more broadly consumable for gathering insights.
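
As a rough sketch of that pattern (bucket paths, table, and columns are invented; this assumes Spark with the delta-spark package on the classpath), the lakehouse lets the same files in the lake serve both raw storage and SQL consumers:

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName("lakehouse-sketch")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Land semi-structured lake data as a transactional Delta table
raw = spark.read.json("s3a://example-lake/raw/fill_events/")
raw.write.format("delta").mode("append").save("s3a://example-lake/delta/fill_events")

# BI-style SQL directly against the lake, with no separate warehouse load
spark.read.format("delta").load("s3a://example-lake/delta/fill_events") \
     .createOrReplaceTempView("fill_events")
spark.sql("SELECT store_id, COUNT(*) AS fills FROM fill_events GROUP BY store_id").show()
```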

The Databricks Delta Lake lakehouse is but one entry in an increasingly crowded marketplace that includes such vendors as Snowflake, Starburst, Dremio, GridGain, DataRobot, and perhaps a dozen others, according to Gartner’s Market Guide for Analytics Query Accelerators.

Moonfare, a private equity firm, is transitioning from a PostgreSQL-based data warehouse on AWS to a Dremio data lakehouse on AWS for business intelligence and predictive analytics. When the implementation goes live in the fall of 2022, business users will be able to perform self-service analytics on top of data in AWS S3. Queries will include which marketing campaigns are working best with which customers and which fund managers are performing best. The lakehouse will also help with fraud prevention.

“You can intuitively query the data from the data lake. Users coming from a data warehouse environment shouldn’t care where the data resides,” says Angelo Slawik, data engineer at Moonfare. “What’s super important is that it takes away ETL jobs,” he says, adding, “With Dremio, if the data is in S3, you can query what you want.”

Moonfare selected Dremio in a proof-of-concept runoff with AWS Athena, an interactive query service that enables SQL queries on S3 data. According to Slawik, Dremio proved more capable thanks to very fast performance and a highly functional user interface that allows users to track data lineage visually. Also important was Dremio’s role-based views and access control for security and governance, which help the Berlin, Germany-based company comply with GDPR regulations.

At Paris-based BNP Paribas, scattered data silos were being used for BI by different teams at the giant bank. Emmanuel Wiesenfeld, an independent contractor, re-architected the silos to create a centralized system so business users such as traders could run their own analytics queries across “a single source of truth.”

“Trading teams wanted to collaborate, but data was scattered. Tools for analyzing the data also were scattered, making them costly and difficult to maintain,” says Wiesenfeld. “We wanted to centralize data from lots of data sources to enable real-time situational awareness. Now users can write their own scripts and run them over the data,” he explains.  

Using Apache Ignite technology from GridGain, Wiesenfeld created an in-memory computing architecture. Key to the new approach is moving from ETL to ELT, where transformation is carried out while performing computations in order to streamline the entire process, according to Wiesenfeld, who says the result was to reduce latency from hours to seconds. Wiesenfeld has since launched a startup called Kawa to bring similar solutions to other customers, particularly hedge funds.
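
The ETL-to-ELT shift is easy to see in code. The sketch below is not Wiesenfeld’s Ignite implementation, just a generic PySpark illustration with an invented table and columns: raw extracts are loaded untouched, and the transformation runs inside the query itself:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

# EL: copy raw extracts into the platform as-is; no upfront transform job
spark.read.csv("s3a://example-raw/trades/", header=True) \
     .createOrReplaceTempView("raw_trades")

# T: the transformation happens at query time, alongside the computation itself
spark.sql("""
    SELECT desk,
           to_date(ts)                   AS trade_day,
           SUM(CAST(notional AS DOUBLE)) AS exposure
    FROM raw_trades
    GROUP BY desk, to_date(ts)
""").show()
```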

Starburst takes a mesh approach, leveraging open-source Trino technology in Starburst Enterprise to improve access to distributed data. Rather than moving data into a central warehouse, the mesh enables access while allowing data to stay where it is. Sophia Genetics is using Starburst Enterprise in its cloud-based bioinformatics SaaS analytics platform. One reason: Keeping sensitive healthcare data within specific countries is important for regulatory reasons. “Due to compliance constraints, we simply can not deploy any system that accesses all data from one central point,” said Alexander Seeholzer, director of data services at Switzerland-based Sophia Genetics in a Starburst case study.

The new query acceleration platforms aren’t standing still. Databricks and Snowflake have introduced data clouds and data lakehouses with features designed for the needs of companies in specific industries such as retail and healthcare. These moves echo the introduction of industry-specific clouds by hyperscalers Microsoft Azure, Google Cloud Platform, and Amazon Web Services.  

The lakehouse as best practice

Gartner’s Ronthal sees the evolution of the data lake to the data lakehouse as an inexorable trend. “We are moving in the direction where the data lakehouse becomes a best practice, but everyone is moving at a different speed,” Ronthal says. “In most cases, the lake was not capable of delivering production needs.”

Despite the eagerness of data lakehouse vendors to subsume the data warehouse into their offerings, Gartner predicts the warehouse will endure. “Analytics query accelerators are unlikely to replace the data warehouse, but they can make the data lake significantly more valuable by enabling performance that meets requirements for both business and technical staff,” concludes its report on the query accelerator market.   

Noel Yuhanna, vice president and principal analyst at Forrester Research, disagrees, asserting the lakehouse will indeed take the place of separate warehouses and lakes.

“We do see the future of warehouses and lakes coming into a lakehouse, where one system is good enough,” Yuhanna says. For organizations with distributed warehouses and lakes, the mesh architecture such as that of Starburst will fill a need, according to Yuhanna, because it enables organizations to implement federated governance across various data locations.

Whatever the approach, Yuhanna says companies are seeking to gain faster time to value from their data. “They don’t want ‘customer 360’ six months from now; they want it next week. We call this ‘fast’ data. As soon as the data is created, you’re running analytics and insights on it,” he says. 

From a system of insight to a system of action

For Guadagno, vaccine distribution was a high-profile, lifesaving initiative, but the Walgreens lakehouse does yeoman work in more mundane but essential retail tasks as well, such as sending out prescription reminders and product coupons. These processes combine an understanding of customer behavior with the availability of pharmaceutical and retail inventory. “It can get very sophisticated, with very personalized insights,” he says. “It allows us to become customer-centric.”

To others embarking on a similar journey, Guadagno advises: “Put all your data in the lakehouse as fast as possible. Don’t embark on any lengthy data modeling or rationalization. It’s better to think about creating value. Put it all in there and give everybody access through governance and collaboration. Don’t waste money on integration and ETL.”

At Walgreens, the Databricks lakehouse is about more than simply making technology more efficient. It’s key to its overall business strategy. “We’re on a mission to create a very personalized experience. It starts at the point of retail — what you need and when you need it. That’s ultimately what the data is for,” Guadagno says. “There is no more system of record and system of insight. It’s a system of action.”

Analytics, Data Management