For CIOs riding today’s rising wave of robotic process automation (RPA), leading-edge adopters whose mature implementations have paid off can provide invaluable lessons about how to make the most of the technology and where its use can lead.

Telecom titan AT&T is one such enterprise, having begun RPA trials in 2015 to reduce repetitive tasks for its service delivery group, which at the time had a large volume of circuits to add, as well as various services in play for provisioning networks, says Mark Austin, vice president of data science at AT&T.

“These things would come in large batches, and they would have Excel files and people were literally typing these things in individually into the systems because they weren’t set up for batch,” Austin says. “We heard about RPA at the time, and we started trying it and all of a sudden we were able to automate one process and then the next process and it kind of grew from there.”

With the technology in its early days, the first thing AT&T IT did was go to its compliance and security experts for guidance on governing RPA, which helped the team make its automation tools stable and secure. The next step was to win the battle for hearts and minds within the company by turning skeptics into believers that automation could make employees’ lives better. Initial efforts focused on addressing unpopular, monotonous tasks such as order entry.

The pilots helped demonstrate how automation could fit into daily operations and workflows.

Within a year, AT&T had implemented 350 automation bots. More than six years into its RPA journey, the company has deployed more than 3,000. Austin says RPA has helped AT&T realize hundreds of millions of dollars in annualized value, saved 16.9 million minutes of manual effort each year, and delivered a 20x return on investment.

Taking RPA to the next level

With RPA ingrained in its business process DNA, AT&T opted to combine automation with data science under the chief data office, betting that the future lies in smarter bots that leverage AI capabilities such as optical character recognition (OCR) and natural language processing (NLP), an emerging strategy often referred to as intelligent automation.

“Tying those things together is pretty powerful,” says Austin, who runs AT&T’s data science, AI, and automation group.

By way of example, Austin points to what he considers one of the company’s biggest RPA successes: a bot his group created that uses OCR to scan vehicle registration documents and NLP to understand them and determine any actions AT&T must take in support of its more than 10,000 technician vehicles, one of the largest fleets in the US. If payments are required, the bot can also trigger the payment process.
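The article doesn’t describe the bot’s internals, but the two-stage idea it outlines (OCR extracts raw text, then language processing decides what action is needed) can be sketched roughly as follows. The keyword rules, field names, and `triage_registration_text` helper below are illustrative stand-ins, not AT&T’s actual pipeline, and the OCR step is assumed to have already produced the text.

```python
import re
from dataclasses import dataclass

@dataclass
class RegistrationAction:
    vehicle_id: str
    action: str          # e.g. "pay_fee", "renew", or "none"
    amount_due: float

def triage_registration_text(vehicle_id: str, ocr_text: str) -> RegistrationAction:
    """Decide what action a scanned registration document requires.

    `ocr_text` is the raw text assumed to come from an upstream OCR step;
    the regex rules below are simplified stand-ins for a real NLP model.
    """
    # Look for a dollar amount tied to a fee; if found, payment is needed.
    fee_match = re.search(r"(?:amount due|renewal fee)[:\s]*\$?([\d,]+\.?\d*)",
                          ocr_text, re.IGNORECASE)
    amount = float(fee_match.group(1).replace(",", "")) if fee_match else 0.0
    if amount > 0:
        return RegistrationAction(vehicle_id, "pay_fee", amount)
    # No fee, but the registration is expiring: flag for renewal.
    if re.search(r"expir(es|ed|ation)", ocr_text, re.IGNORECASE):
        return RegistrationAction(vehicle_id, "renew", 0.0)
    return RegistrationAction(vehicle_id, "none", 0.0)
```

In a production bot, the returned action would feed the payment-trigger step the article mentions; here it is simply a structured result a downstream workflow could consume.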

Being able to create automation bots such as these was invaluable when the COVID-19 pandemic first hit, Austin says.

“There were a lot of customers that were calling and saying they wanted to move the charges from this org to that org in their company,” Austin says. “Someone might call up and say they wanted to move 5,000 lines. What we do now is we have them interface with [interactive voice response (IVR)]. The IVR detects what they want to do and then it triggers a bot to send them a secure form to fill out. They fill out the form, submit that back, and we run the bot to automate the process to get it going.”
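The handoff Austin describes follows a clear sequence: the IVR detects the caller’s intent, a bot sends a secure form, and the submitted form drives the automated bulk transfer. A rough sketch of that flow, with the intent rule and field names invented for illustration (they are not AT&T’s actual systems), might look like this:

```python
def handle_ivr_request(utterance: str) -> dict:
    """Route a caller's request; a bulk-transfer intent triggers the form step.

    The keyword matching here is a toy stand-in for real IVR intent detection.
    """
    if "move" in utterance.lower() and "line" in utterance.lower():
        return {"intent": "bulk_line_transfer", "next_step": "send_secure_form"}
    return {"intent": "unknown", "next_step": "route_to_agent"}

def process_form_submission(form: dict) -> list[dict]:
    """Turn a submitted secure form into per-line transfer orders for the bot."""
    return [{"line": line, "from_org": form["from_org"], "to_org": form["to_org"]}
            for line in form["lines"]]
```

The value of the pattern is that the human only fills out a structured form once; the bot then fans that request out into however many individual transfer orders are needed.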

The company has also rolled out bots to help customers avoid overage charges. One such bot monitors usage of AT&T’s integrated voice, video, messaging, and meeting services (more than 21,000 records per minute), looking for overage charges above a pre-set amount. If it finds one, it automatically notifies the customer and the assigned AT&T sales rep.
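At its core, this kind of monitoring bot is a threshold check over a stream of usage records. A minimal sketch, with the record fields and notifier callback invented for illustration (the article only specifies the pre-set limit and the dual notification):

```python
from typing import Callable, Iterable

def scan_for_overages(records: Iterable[dict],
                      overage_limit: float,
                      notify: Callable[[str, str, float], None]) -> int:
    """Flag every usage record whose overage charge exceeds the limit.

    `notify` is called with (customer_id, sales_rep, charge) for each hit,
    mirroring the bot alerting both the customer and the assigned rep.
    Returns the number of records flagged.
    """
    flagged = 0
    for rec in records:
        charge = rec.get("overage_charge", 0.0)
        if charge > overage_limit:
            notify(rec["customer_id"], rec["sales_rep"], charge)
            flagged += 1
    return flagged
```

Because the function takes an iterable, the same logic works whether records arrive in batches or as a continuous feed at the 21,000-per-minute rate the article cites.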

Codifying RPA best practices

After the first year of pilots, with demand for RPA spreading rapidly through the business, AT&T created an automation center of excellence (COE) to accelerate implementation.

“When you’re the size of AT&T, and you’ve had so many mergers and so many systems, there’s just lots of manual processes,” Austin says, explaining why it was essential to create a COE that could focus on implementing automation throughout the organization.

The centralized automation team now comprises 20 full-time employees as well as contractors. Austin notes that the real secret to successfully scaling automation is spreading RPA knowledge throughout the organization. The COE helps develop, deploy, manage, measure, and enable automation projects across AT&T. More important, it seeks to educate subject matter experts in automating their own tasks and processes.

“Pretty early on, we figured out that if you really want to scale, you’ve got to move to training others how to do it, teach them how to fish, so to speak,” Austin says. “Ninety-two percent of everything we do with the 3,000 bots is done outside of my team. If you’re not an IT person, it’s maybe 40 hours of training.”

The company has trained more than 2,000 citizen RPA developers who have built the lion’s share of AT&T’s 3,000 automation bots. To support them, the company has created a “Bot Marketplace” where citizen developers can “shop” for ready-to-use tools and support to get their automation solutions up and running. The marketplace stores and shares low-code and no-code automation solutions and tools. It now adds roughly 75 new blueprints of reusable automation components every month.

As RPA knowledge has spread, Austin says the lines of business have started forming their own automation teams, creating a hybrid model in which the COE provides tools and support, while front-line teams in the lines of business implement automation.

“They even have some new job titles popping up,” Austin says. “We’ve got a couple process automation managers and automation developers that we’re seeing out there. On our team, we’re continuing to move to automate the process, the platform, and then tie in the data science side.”

When it comes to lessons learned, Austin has advice for others starting their RPA journey. First, start small and get some wins. Second, don’t try to keep everything centralized. While the center of excellence has been essential to AT&T’s RPA journey, just as important has been democratizing the effort to scale automation across the company. Finally, evangelization matters. AT&T has created an internal automation summit where groups can present their automation projects to the rest of the company, show off their successes, and help spark new ideas.

The idea of “the right tool for the job” applies across domains. More than two thousand years ago, the Greek mathematician Archimedes is reported to have said, “Give me a place to stand, and the right lever, and I could lift the earth.”


Fast forward to today’s cloud-centric environment, and application developers are nodding in enthusiastic agreement with Archimedes. And while things are considerably more complicated than they were in 250 BC, Google Cloud partner Aiven has made it its job to streamline some of the complications that can inhibit cloud-centric application development.

“Our mission here at Aiven is quite simple,” says Troy Sellers, presales solution architect at Aiven. “It’s to make the developer’s life easier. And when you’re a company that is looking at driving innovations or transformations into the cloud, for example, they need the right tools to support that activity.”

Aiven provides open source solutions that stand up cloud-based data infrastructure, freeing developers to focus on high-value projects and, in the process, accelerating cloud migration and modernization.

“Having the right tool is just as important as having the ideas, because it allows the people with the ideas to get on and focus on the things that are important,” says Sellers in Google Cloud’s podcast series “The Principles of a Cloud Data Strategy.”

Dealing with complexity

As digital transformation evolves into broader modernization efforts, organizations face a common milestone — they need to expand their cloud-based services, but they lack the staff and skills to do so at scale.

It’s not just a resource question, though. Sellers says, “The challenges today, they’re worlds apart from the days gone by where I used to be building applications myself. I remember, we used to go and talk to customers, when big data was like a gigabyte.”

Today’s modern data and application development stacks contain many moving parts and different tiers of logic — not to mention the sheer volume of data in play, the need to stay on top of regulatory compliance and security, and the pressure to keep up with today’s expectations of continuous integration and continuous delivery (CI/CD) for applications.

“There’s this expectation on developers that releases go from, rather than once every three months, to once every month, to every lunchtime at 11 o’clock,” Sellers says. “Time to market is just getting faster and faster and faster. And you definitely are in a race with your competitors to do that.”

“This is probably one of the main reasons that developers turn to companies like Google Cloud and Aiven for fully managed services, because it just takes a lot of that headache out of managing that. And they can get to market really, really fast.”

The open source advantage

Aiven has leaned into open source for cloud data infrastructure since its inception in 2016. The advantages: cost savings, agility, freedom from vendor lock-in, productivity, and efficiency.

“We manage database services for our customers, database services in the cloud, open source technologies such as Postgres, MySQL, and Apache Kafka,” says Sellers. “We help customers adopt those services so they can focus on what they do best, which is building technology for their customers.”

Check out “The Principles of a Cloud Data Strategy” podcast series from Google Cloud on Google Podcasts, Apple Podcasts, Spotify, or wherever you get your podcasts.

With more than 30,000 employees spread across more than 60 affiliate locations and 14 manufacturing sites around the world, pharmaceutical company Eli Lilly operates at a truly global scale. Operating at that scale comes with issues, not the least of which is sharing accurate and timely information internally and externally.

“From internal training materials to formal, technical communications to regulatory agencies, Lilly is translating information often,” says Timothy F. Coleman, vice president and information officer for information and digital solutions at Eli Lilly and Co.

For years, Lilly relied on third-party human translation providers for the bulk of its translation needs. Although public web translation services are available, confidentiality requirements meant those services did not meet Lilly’s standards for information security. Even with a footprint of more than 400 translation vendors, the process was slow.

“Lilly engages with costly third-party human-translation providers across the organization to provide verified and reliable translations,” Coleman says. “Depending on the requirements, the planning, translation, and verification of these engagements can take weeks to complete.”

Coleman adds that many bilingual Lilly employees were also being tapped to provide translations in addition to their current scope of work.

To address these challenges, the pharmaceutical firm developed Lilly Translate, a home-grown IT solution that uses natural language processing (NLP) and deep learning to generate content translation via a validated API layer, Coleman says.

“This innovative application of natural language technology enables Lilly to achieve greater efficiency gains, significant cost reduction, higher quality content, and lead the way for future tech innovations using natural language technology to achieve enterprise value at scale,” he says.

Passion project pays off

Coleman says Lilly Translate started as a passion project by a curious software engineer who had an idea for addressing a pain point of the Lilly Regulatory Affairs system portfolio: Business partners continually experienced delays and friction in translation services.

“That married up well with an opportunity to explore and learn emerging technologies,” he says. “It became a great opportunity that a Lilly software engineer picked up and ran with, initially as a great learning opportunity.”

After sharing the idea and technical vision with peers and managers, the project immediately garnered support from leadership at Eli Lilly Global Regulatory Affairs International, who advocated for investment in the tool. Coleman’s team worked closely with Regulatory Affairs to identify requirements around document types, languages, and so on.

The Lilly Translate API and UI are delivered via a serverless tech stack built on Node.js, Python, .NET, and Docker. It can be accessed via mobile devices, web browsers, and programmatically through the secure API.

The service, which earned Eli Lilly a CIO 100 Award in IT Excellence, provides real-time translation of Word, Excel, PowerPoint, and text for users and systems, keeping document format in place. Deep learning language models trained with life sciences and Lilly content help improve translation accuracy, and Lilly is creating refined language models that recognize Lilly-specific terminology and industry-specific technical language, while maintaining the formatting requirements of regulated documentation.
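The format-preserving behavior described here — translating the text while “keeping document format in place” — can be modeled as translating each text run of a parsed document while leaving its style metadata untouched. The run structure and `translate_document` helper below are hypothetical illustrations, not Lilly Translate’s actual API, and the `translate` callback stands in for the service’s deep learning language model:

```python
from typing import Callable

def translate_document(runs: list[dict],
                       translate: Callable[[str], str]) -> list[dict]:
    """Translate each text run while preserving its formatting.

    `runs` models a parsed Word/PowerPoint document as a list of
    {"text": ..., "style": ...} runs; `translate` stands in for the call
    to a translation model. Each run's style dict is carried over as-is,
    so bold/font/size metadata survives the translation pass.
    """
    return [{**run, "text": translate(run["text"])} for run in runs]
```

Separating the text payload from the formatting metadata is what lets a service like this return a Word or PowerPoint file that looks identical to the original, only in another language.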

“The product was developed via a DevSecOps agile framework,” Coleman says. “Initially, we did not have a dedicated Scrum master and product owner, but later we were able to adjust that. The increased focus helped us accelerate our delivery efforts.”

The project took about a year to get to MVP, with several iterations and pilots needed to achieve the level of translation quality needed to meet business expectations.

“The level of quality of the translation output was an initial challenge we faced where the team had to work through various services to figure out how to improve the overall general level of quality,” Coleman says. “Once improved, we had to work diligently to ramp up our [organizational change management] efforts to gain confidence in the tool.”

With the tool fully deployed, a process that once required the creation of work orders and took days or weeks to complete now takes just seconds or minutes. The automation has also yielded significant cost savings.

“In surveys distributed across the company there is consistent feedback that Lilly Translate is saving time across multiple business processes as well as getting answers to questions faster,” Coleman says. “Lilly Translate touches every area of the company from HR to Corporate Audit Services, to Ethics and Compliance Hotlines, Finance, Sales and Marketing, Regulatory Affairs, and many others. The time savings is extensive. Translations are now taking seconds instead of weeks, providing key resources time to focus on other business-critical activities.” 

Halkbank, founded in 1993, is one of the largest banks in Türkiye, offering corporate and retail banking, investor relations, and SME and commercial services to over 15 million customers. But during the pandemic lockdowns, customers were forced to switch to the bank’s digital channels, and mobile app users quickly soared from one million to 2.5 million.

Since the pandemic, however, absorbing this influx of digital customers hasn’t been entirely smooth. The bank recognized the need to scale its mobile banking platform to handle more than double the previous volume of traffic.

Namik Kemal Uçkan, head of IT operations at Halkbank, lists challenges created in different areas, including prioritizing network availability when traffic volumes surge; making services in high demand available during peaks; ensuring speedy resolution and identification of network issues; having a sufficient capacity of network monitoring solutions; and ensuring faster incident resolutions and troubleshooting across the networks.

As a result, complexity inside the enterprise IT ecosystem has increased constantly, and managing the networks that support it, while always important, has become critical.

Over the last 10 years, Halkbank has used Riverbed SteelHead across more than 1,000 branches for WAN optimization and network and application performance. Riverbed’s solutions have helped ensure Halkbank’s business-critical applications are always available to its business users.

“Riverbed SteelHead has been used to accelerate the performance of the internal banking applications that are utilized by its own employees,” says Mena Migally, regional VP, META, at Riverbed. “By deploying this solution, they’ve reduced the latency for applications at branch offices while also realizing bandwidth savings.”
