SAP is an enterprise software vendor based in Walldorf, Germany. Its cloud and on-premises enterprise resource planning (ERP) software, including S/4HANA, helps organizations manage their business operations and customer relations. The German multinational also offers a vast array of software solutions tailored to specific facets of the enterprise, including data management, analytics, and supply chain management, as well as solutions aimed at specific industry verticals.

AI is an area of increasing emphasis for SAP, which has a market cap of $295 billion, making it the 14th largest technology vendor in the world.

Here is the latest SAP news and analysis:

SAP systems increasingly targeted by cyber attackers

December 13, 2024: A review of four years of threat intelligence data, presented Friday at Black Hat by Yvan Genuer, a senior security researcher at Onapsis, reports a spike in hacker interest in breaking into SAP ERP systems.

Nearly 25% of SAP ECC customers unsure about their future

December 6, 2024: The future of SAP architectures is hybrid. But according to a survey conducted by the Financials subgroup of the German-speaking SAP User Group (DSAG), many organizations have yet to decide exactly where that journey will take them.

SAP ups AI factor in its SuccessFactors HCM suite

October 28, 2024: The launch by SAP of new AI capabilities in its SuccessFactors HCM (human capital management) suite Monday is a case of “better late to the party than never,” according to an analyst with Info-Tech Research Group.

Riled by SAP’s AI policy, customers issue list of demands

October 22, 2024: SAP’s strategy of offering AI innovations only in the cloud continues to attract a lot of criticism from its user base. Here’s what SAP customers would like to see happen.

SAP: good figures, but bad mood

October 22, 2024: Employee engagement is suffering from the ongoing restructuring at SAP, although the software company reported good figures and has raised its outlook.

SAP sustainability tracking rollout focuses on data consistency, outlier detection

October 21, 2024: Enterprise CIOs are under increasing pressure from global regulators to rein in sustainability shortfalls due to partner problems. SAP’s pitch is that most enterprise partners are already using SAP, so it’s in an ideal position to collect and distribute partner data.

SAP joins the AI agent era — but not all customers may benefit

October 9, 2024: SAP is expanding its generative AI copilot Joule to include AI agents. Deeply embedded in SAP systems, the company’s agents aim to solve increasingly complex tasks.

SAP launches collaborative AI agents, adds Knowledge Graph

October 8, 2024: SAP’s promised collaboration between its AI copilot, Joule, and other agents will become reality in the fourth quarter of 2024, the company announced at its 2024 TechEd conference Tuesday.

SAP Build gains AI capabilities to help build autonomous agents

October 8, 2024: SAP wants developers to view its Build platform as the one extension solution for all of SAP’s applications, according to Michael Aneling, chief product officer for SAP Business Technology Platform (BTP).

SAP faces probe in the US over alleged price fixing in government contracts

September 24, 2024: German software giant SAP is under investigation by US officials for allegedly conspiring to overcharge the US government for its technology products over the course of a decade. The probe, led by the Department of Justice (DOJ), is focused on whether SAP and its reseller, Carahsoft Technology, colluded to fix prices on sales to the US military and other government entities.

SAP CTO to step down after ‘inappropriate behavior’

September 3, 2024: Juergen Mueller is leaving SAP’s executive board, saying his behavior at a company event was incompatible with company values.

SAP partners up to make AI more practical

August 15, 2024: Many companies find it difficult to incorporate AI into their business processes. To change this, SAP wants to work more closely with the appliedAI initiative.

SAP patches critical bugs allowing full system compromise

August 14, 2024: Both vulnerabilities score above 9 on the CVSS scale and can allow access to sensitive data if not patched immediately.

SAP is restructuring its Executive Board

July 30, 2024: Head of sales Scott Russell and head of marketing Julia White are unexpectedly leaving SAP; White will not be replaced.

SAP restructuring to impact more jobs than expected

July 24, 2024: The restructuring at SAP affects almost a tenth of its workforce. The company estimates the cost of the internal restructuring at around €3 billion.

SAP Q2 results reveal large orgs now firmly on the path to AI

July 24, 2024: It “had a direct impact on our bookings,” company CEO says during second quarter earnings call.

SAP offers AI to all Rise customers — in unknown, varying amounts

July 19, 2024: Joule AI is now available to all Rise with SAP customers, but customers not using SAP Cloud solutions remain out of luck.

SAP security holes raise questions about the rush to AI

July 18, 2024: Cloud security firm Wiz has published a detailed report about SAP security holes, now patched, that raises alarming questions about whether the rush to AI is relegating cybersecurity defenses to a secondary role.

SAP publishes open source manifesto

June 27, 2024: SAP has made five commitments — make consistent contributions to the community, champion open standards, strive to adopt an open-first approach, nurture open source ecosystems, and adopt a feedback-driven approach.

SAP, Salesforce lead $356 billion enterprise applications market: IDC

June 21, 2024: The software giants were neck-and-neck as the overall enterprise software market grew 12% in 2023, said IDC.

SAP to buy digital adoption specialist WalkMe for $1.5 billion

June 5, 2024: After Signavio and LeanIX, SAP is acquiring the Israeli provider WalkMe to help user companies with their digital transformation.

SAP CEO Christian Klein: Everything we do contains AI

June 5, 2024: SAP CEO Christian Klein kicked off the company’s Sapphire customer conference with the promise of a real productivity boost from AI.

SAP adds more tools for developers on its platform

June 4, 2024: Behind the scenes, SAP is also using AI to extend the capabilities of its Business Technology Platform.

SAP embeds Joule in entire enterprise portfolio, plans integration to other AIs

June 4, 2024: Joule could communicate with other AIs to complete more complex tasks spanning multiple applications, SAP suggests.

SAP AI pact with AWS offers customers more gen AI options

May 29, 2024: SAP wants to work more closely with AWS on AI, complementing existing partnerships with Google and Microsoft.

SAP customers see S/4HANA and AI as top digital transformation drivers

May 20, 2024: With SAP’s end of mainstream maintenance for SAP Business Suite 7 set for 2027, recent findings from the US SAP user group reveal that companies are increasing focus on shifting to S/4HANA and embracing AI.

SAP faces turning point as Hasso Plattner steps down

May 15, 2024: The departure of co-founder and supervisory board chairman Hasso Plattner marks the end of the founding era at SAP, and adds further complexities for the German software multinational as it faces ongoing restructuring efforts, among many other challenges to solve.

SAP forecasts clarity in the cloud

May 7, 2024: After customers and user groups that adopted S/4HANA early accused SAP of bait-and-switch tactics, Martin Bayer, editor-in-chief of CIO Germany, recently sat down with Christian Klein, CEO of the multinational software company, to clear the air on cloud reassurance, using gen AI as a migration accelerant, and positive growth for the future.

Deutsche Telekom calls on SAP for IT infrastructure move to Rise

March 22, 2024: Deutsche Telekom will move its SAP infrastructure to Rise with the help of its own IT services subsidiary, T-Systems.

SAP user group: S/4HANA usage is growing, but still in the minority

March 21, 2024: Customers want more information about cloud and AI strategy from the German ERP giant.

SAP and Nvidia expand partnership to aid customers with gen AI

March 18, 2024: SAP is embedding Nvidia’s generative AI foundry service into SAP Datasphere, SAP BTP, RISE with SAP, and SAP’s enterprise applications portfolio to equip customers with greater and more simplified access to the technology.

SAP enhances Datasphere and SAC for AI-driven transformation

March 6, 2024: SAP adds new generative AI and data governance features to SAP Datasphere and SAP Analytics Cloud, enabling customers to incorporate non-SAP and unstructured data when creating AI-based planning models and scenarios.

SAP names Philipp Herzig as chief artificial intelligence officer

February 16, 2024: It’s a small promotion and a change of title for one man, and a sign of a larger change in strategic focus for many others at SAP.

SAP 2024 outlook: 5 predictions for customers

February 12, 2024: As SAP continues to position itself as a leader in generative AI and innovative technologies, customers must prepare to navigate new service offerings and an inevitable move to SAP RISE.

SAP has a new succession plan

February 12, 2024: SAP’s board wants to bring former Nokia chairman Pekka Ala-Pietilä on board to succeed founder Hasso Plattner as chairman.

SAP and IBM under scanner of Indian investigative agency for Air India deal

February 5, 2024: Air India failed to adhere to the rules while awarding an ERP contract worth $27 million to SAP India and IBM India.

SAP offers big discount to lure on-prem S/4HANA customers to Rise

January 30, 2024: SAP is offering steep discounts to entice on-premises S/4HANA customers to move to its Rise with SAP cloud offering.

SAP announces $2.2B restructuring program that’ll impact 8,000 jobs

January 24, 2024: The restructuring program will focus on AI and impact about 7.4% of SAP’s total workforce.

SAP pays multi-million fine for bribery

January 11, 2024: With a $220 million fine, SAP is drawing a line under a long-standing investigation by US authorities. The company is alleged to have bribed officials.

SAP doubles down on cloud-first innovation with executive reshuffle

January 10, 2024: Product engineering head Thomas Saueressig will take on a new role to maximize potential for customers in the cloud, but that’s cold comfort for on-premises users.

SAP faces breakdown in trust over innovation plans

December 5, 2023: The company’s plan to offer future innovations in S/4HANA only to subscribers of its Rise with SAP offering is alienating customers, user conference hears.

SAP unveils tools to help enterprises build their own gen AI apps

November 1, 2023: SAP Build Code suite combines new and existing developer tools, while a foundational AI model trained on anonymized customer data will be available to help automate tasks.

SAP’s new generative AI pricing: Neither transparent nor explainable yet

October 12, 2023: The ERP vendor is adding a new pricing tier to its Rise with SAP offering with an opaque mix of bundled and usage-based pricing for generative AI functionality.

SAP offers faster updates, longer maintenance for S/4HANA in private clouds

October 11, 2023: SAP is offering free migration consultations, more frequent feature releases and two years’ additional maintenance to entice customers to update to S/4HANA Cloud private edition and, ultimately, adopt Rise with SAP.

SAP prepares to add Joule generative AI copilot across its apps

September 26, 2023: Like Salesforce and ServiceNow, SAP is promising to embed an AI copilot throughout its applications, but planning a more gradual roll-out than some competitors.

CIOs are under intense pressure to deliver massive digital transformation initiatives with limited resources and tight time constraints. Boards of directors are placing a high priority on deploying generative AI as fast as possible so their organizations don’t lose competitive advantage. Meanwhile, organizations running SAP ERP platforms have until 2027, when mainstream support ends, to upgrade from ECC and R/3 to S/4HANA. These are just two examples of the many challenges CIOs have on their plate.

Enter process intelligence, a data-driven approach that’s revolutionizing how CIOs navigate these challenging transformations. By providing a fact-based view of how systems and processes flow within organizations, it enables more informed decision-making at both strategic and tactical levels.

Here’s how it works. The platform uses process mining and augments it with business context to give companies a living digital twin of how their business operates. Because it is system-agnostic and unbiased, companies gain a common language for understanding and improving how the business runs, one that connects them to their processes, their teams to each other, and emerging technologies to the business. That means employees and teams can collaborate more effectively to optimize the business within and across processes. Process intelligence can be applied to every process in every industry, allowing improvement efforts to scale to the level of an organization’s ambition and drive the results we all know are possible.

Consider a large system migration challenge. Process intelligence helps CIOs tackle the complexity by providing clear visibility into current operations. For instance, a major alcohol distributor uses process intelligence to create detailed heat maps of its requirements across regions and geographies, an analysis that would have been prohibitively expensive and time-consuming using traditional methods. Process intelligence provides a common language between stakeholders by objectively documenting how work flows through the organization, helping managers make data-driven decisions.

The technology also provides a common language to bridge the often-challenging gap between business and IT teams. During an upgrade, when custom code often needs to be retired and bespoke processes need to be standardized, business units may resist change. With facts and data, this decision-making becomes simpler.

When it comes to generative AI initiatives, many organizations rush in without a proper understanding of their processes and risk implementing a large language model that doesn’t produce the ROI the business expects. Deployments are often extremely complex, involving specialized, high-performance hardware, rollout of use cases, change management and lengthy training cycles to help people adjust to new ways of working. Process intelligence identifies where slowdowns and bottlenecks occur so managers can speed up and, where appropriate, simplify the deployment process.

Real-world success stories demonstrate the technology’s impact. HARMAN, a wholly owned subsidiary of Samsung Electronics, leveraged process intelligence for business case planning during its transformation journey and currently uses it for fit-gap analysis, custom code analysis, and master data cleanup. As a result, it is accelerating progress toward completing its system migration. Another large consumer products company employed process intelligence to monitor user adoption during the hypercare phases of its implementation, quickly identifying and resolving challenges in order execution and fulfillment. The end result? Happier customers.

The benefits of process intelligence extend beyond technical considerations. Project Management Offices (PMOs) find that process intelligence helps define clearer program scope, reducing the risk of scope creep and budget overruns. Systems integrators can bid more accurately on projects and complete them faster when they have detailed process insights at their disposal.

Celonis is the global leader in process mining and process intelligence. Well-known brands such as PepsiCo, Uber, ExxonMobil, Diageo, Mars, Calor Gas, Pfizer, and many more use its platform to transform systems and execute initiatives faster.

To find out how Celonis can help your organization, visit here.

Olga Forné, CISO of Abertis, has been named CISO of the Year at the latest edition of the CIO 100 Awards Spain 2024. The jury specifically recognized the executive’s career and experience, which have been key to introducing security by default in the innovation and development teams of the Spanish multinational. In doing so, she has managed to anticipate and address risks in a practical way and to position her organization as a benchmark in an increasingly complex digital environment.

Founded in 2003, the toll road management company takes full advantage of new technologies, but without neglecting the challenges posed by cybersecurity, which are many and “growing every day,” as Forné explains. “You have to distinguish between two sectors: corporations, and cybercrime, which is dedicated solely and exclusively to exploiting every possible tool and producing cyberattacks. Most companies focus on other activities and dedicate small divisions to security, so we are at a disadvantage.”

In this context, you have to draw on “creativity” and work in several directions: “Staying very focused on the business so our operations are not interrupted and, on the other hand, trying to mitigate risks,” she says. The landscape, she adds, is becoming ever more difficult, with new threats built on powerful technologies. That is why “we have to think very carefully about what we are going to do with the resources we have and, I repeat, apply creativity.”

“We are in a sector where learning happens every day”

The keys to success

Looking ahead to 2025, and as could hardly be otherwise, the challenges will persist. “We have to put a lot of emphasis on resilience, in many cases assume a breach will happen, and on automating prevention and detection,” she stresses. “[Companies] don’t have enough people, but even so we have to take advantage of new technologies just as cybercrime does, for example artificial intelligence (AI).”

Asked about the keys to her success, and how to extrapolate them to the rest of the industry, Forné believes the main thing is “to be passionate about your work, because we are in a sector of constant, daily learning.” In that sense, she continues, networking is essential: “knowing what other people are doing. We often look only at the short or medium term and don’t see what’s out there, what could happen to us.” Finally, she says, “you have to surround yourself with people who truly advise you and who have a strong critical spirit.”

Olga Forné, CISO of Abertis

Olga Forné, CISO of Abertis, during her speech after receiving the CISO of the Year award.

IDG

Discover Financial Services has moved aggressively to the cloud in 2024 with a migration strategy focused on retaining hybrid flexibility and making the most of cloud elasticity.

EVP and CIO Jason Strle, who joined Discover 18 months ago after CIO and CTO roles at Wells Fargo and JPMorgan Chase & Co., has opted to migrate mission-critical workloads using Red Hat OpenShift on AWS. Moving these containerized workloads to AWS offers Discover greater flexibility and agility to handle the spikes and dips of seasonal consumer spending far more efficiently, he says.

Now that much of the migration is complete, the benefits of cloud elasticity have “paid off,” Strle says.

Discover’s implementation is unique in that it operates its OpenShift platform in AWS virtual private clouds (VPC) on an AWS multi-tenant public cloud infrastructure, and with this approach, OpenShift allows for abstraction to the cloud, explains Ed Calusinski, Discover’s VP of enterprise architecture and technology strategy.  

For many years, the Riverwoods, Ill.-based finserv hosted workloads on a cloud platform within its own data centers. The OpenShift hybrid approach gives Discover the choice to run workloads on private or public clouds, enabling it to better manage and move workloads across multiple clouds and prevent vendor lock-in.

“More workloads were moved [to the cloud] in the first six months of this year than in all the years before, by far, orders of magnitude more,” Strle says. “Due to the elasticity of the environment, we were able to handle circumstances such as big surges, and that’s very important to us because of the way we do marketing and campaigns and different ways people interact with our rewards. That can lead to very spiky consumer behavior, and we can dynamically grow our capacity on public clouds.”

The container-based approach also provides Discover with connectivity to on-prem systems and a gateway that allows access to Discover’s core SaaS vendors — ServiceNow and Workday — as well as integration with external vendors, says Strle, who is also considering alternative container-based architectures as cloud options expand.

Banking on hybrid cloud

Discover’s decision to take a container-based approach as early as 2018 reflects the hybrid strategy many consumer financial services firms have adopted to retain maximum control over their workloads. For example, by leveraging OpenShift, Discover and other enterprises can achieve portability across AWS, Microsoft Azure, Google Cloud Platform, and IBM Cloud.

But introducing a container-based approach to cloud computing can introduce complexities and challenges, analysts note. Still, the openness and capabilities outweigh the risks for those using OpenShift on AWS, says Sid Nag, VP of cloud, edge and AI infrastructure at Gartner.

“They’re using AWS for basic compute services but not for upper-layer compute services,” Nag explains. “They want to have the ability to run OpenShift anywhere — on the public cloud, on premise, or in a private cloud and they can move workloads around across different hybrid environments.”

Gartner predicts 90% of enterprises will adopt a hybrid cloud approach through 2027. The research firm notes that one major challenge all enterprises face in deploying generative AI will be data synchronization across the hybrid cloud environment.

Gearing up for generative AI

In terms of gen AI, Strle and his teams are exploring the potential long-term benefits, beginning with the company’s use of Microsoft’s Copilot for Office and for GitHub.

But Discover is taking a measured approach to the technology, with a centralized AI governance function within the company responsible for evaluating risk management around developing gen AI solutions, Strle says.

Another part of the organization that oversees data and decision analytics, dubbed DNA, is experimenting with Google’s Vertex gen AI platform for possible contact center use cases, he adds. Some Vertex capabilities are in production, and the “ecosystem approach” to managing generative AI solutions, as opposed to “stitching together a bunch of different AI tools,” is the current game plan, the CIO says.

“We are intentional about allowing some organic exploration of gen AI capabilities,” Strle says, emphasizing that Discover is not yet exposing customers to gen AI capabilities.

The financial services company is also evaluating open-source models based on Meta’s Llama and is considering more advanced gen AI models that make decisions autonomously, but Discover is not embracing agentic AI yet.

“We are still focused on that ‘human in the loop’ with our deployment because we still have to manage all the risks and compliance associated with these solutions,” Strle says of the current gen AI models, which assist employees with internal tasks or validate and double-check human activity to eliminate errors.

Initially, Discover’s foray into gen AI will be limited to large language models (LLMs) performing document summarization and possibly supporting customer agents, but there will be nothing directly customer-facing for the foreseeable future.

“We’re not going to go there,” the CIO says. “Anything that could potentially be making an important decision for the customer or could cause harm or confusion, those are things in the ‘Do Not Touch’ category.”

But in this era of speedy transformation, Strle won’t count anything out. “I’m not seeing an imminent opportunity, but I know that could change quickly so we’re not closing the door on anything,” the CIO says.

The finserv AI playbook

That approach appears to be a common one among the larger financial services players.

In a recent interview with CIO.com, Gill Haus, Chase CIO at JPMorgan Chase, said he is evaluating use of generative AI to improve internal operations, the contact center, and Chase’s travel business, with some gen AI use cases in production. But he will not deploy the technology in customer-facing applications until it is battle-tested and errors such as hallucinations are gone.

Like Discover, Chase has embarked on a major digital transformation, including the development of a new deposit platform, as well as a modernization of its legacy applications into microservices deployed on private clouds and on AWS and other public cloud providers.

“We will be doing use case-based approach,” Haus said. “It’s not going to be geared for a particular line of business. It’s geared for solving a type of problem or action.”

Their cautious approach to cloud and generative AI is typical for consumer lenders, one analyst says.

“While these companies continue to operate a significant number of financial systems in on-premises datacenters, they have been adopting cloud services for customer-facing websites and mobile apps,” says Dave McCarthy, VP of cloud and edge services at IDC.

“The excitement of implementing gen AI capabilities is tempered by the fact that much of this technology is new and unproven [and] this causes risk-averse companies in financial services to take a cautious approach,” McCarthy says. “Most companies start by experimenting with gen AI to improve internal process before adding customer-facing features.”

What is data science?

Data science is a method to glean insights from structured and unstructured data using approaches ranging from statistical analysis to machine learning (ML). For most organizations, it’s employed to transform data into value in the form of improved revenue, reduced costs, business agility, improved customer experience, developing new products, and so on. In short, data science gives the data collected by an organization a purpose.

Data science vs. data analytics

While closely related, data analytics is a component of data science, used to understand what an organization’s data looks like. Data science takes the output of analytics to solve problems. Data scientists say that investigating something with data is simply analysis; data science takes that analysis a step further to explain and solve problems. Another difference between data analytics and data science is timescale: data analytics describes the current state of reality, whereas data science uses that data to predict and understand the future.
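
To make the distinction concrete, here is a minimal Python sketch contrasting the two: descriptive analytics summarizes what the data says today, while data science fits a model to predict what happens next. The file name and column names (customers.csv, monthly_spend, churned, and so on) are purely illustrative assumptions.

    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression

    # Load a customer dataset (hypothetical file and column names)
    df = pd.read_csv("customers.csv")

    # Analytics: describe the current state of the business
    print(df[["monthly_spend", "tenure_months"]].describe())
    print(df.groupby("region")["churned"].mean())  # churn rate by region today

    # Data science: use the same data to predict future churn
    X = df[["monthly_spend", "tenure_months"]]
    y = df["churned"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = LogisticRegression().fit(X_train, y_train)
    print("Holdout accuracy:", model.score(X_test, y_test))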

The benefits of data science

The business value of data science depends on organizational needs. Data science could help an organization build tools to predict hardware failures, enabling the organization to perform maintenance and prevent unplanned downtime. It could also help predict what to put on supermarket shelves, or how popular a product will be based on its attributes.

For further insight into the business value of data science, see The unexpected benefits of data analytics and Demystifying the dark science of data analytics.

Data science jobs

While data science degree programs are increasing at a rapid clip, they aren’t necessarily what organizations look for when seeking data scientists. Candidates with a statistics background are popular, especially if they can demonstrate they know whether they’re looking at real results, have the domain knowledge to put results in context, and have the communication skills to convey those results to business users.

Many organizations look for candidates with PhDs, especially in physics, math, computer science, economics, or even social science. A PhD proves a candidate is capable of doing deep research on a topic and disseminating information to others.

Some of the best data scientists or leaders in data science groups have nontraditional backgrounds, even ones with little formal computer training. In many cases, the key is an ability to look at something from an unconventional perspective and understand it.

For further information about data scientist skills, see What is a data scientist? A key data analytics role and a lucrative career, and Essential skills and traits of elite data scientists.

Data science salaries

Here are some of the most popular job titles related to data science and the average salary for each position, according to the most recent data from Indeed:

  • Analytics manager: $80,000-$176,000
  • Business intelligence analyst: $56,000-$147,000
  • Data analyst: $50,000-$128,000
  • Data architect: $67,000-$173,000
  • Data engineer: $83,000-$195,000
  • Data scientist: $76,000-$195,000
  • Research analyst: $41,000-$134,000
  • Statistician: $50,000-$143,000

Data science degrees

According to Fortune, these are the top graduate degree programs in data science:

  • University of California, Berkeley
  • University of Illinois at Urbana-Champaign
  • Marshall University
  • Bay Path University
  • University of Texas, Austin
  • University of Missouri, Columbia
  • Texas Tech University
  • University of Chicago
  • University of California, Riverside
  • Clemson University

Data science training and bootcamps

Given the current shortage of data science talent, many organizations are building out programs to develop internal data science talent.

Bootcamps are another fast-growing avenue for training workers to take on data science roles. For more details on data science bootcamps, see 15 best data science bootcamps for boosting your career.

Data science certifications

Organizations need data scientists and analysts with expertise in techniques to analyze data. They also need big data architects to translate requirements into systems, data engineers to build and maintain data pipelines, developers who know their way around Hadoop clusters and other technologies, and system administrators and managers to tie everything together. Certifications are one way for candidates to show they have the right skillset. Some of the top data science certifications include:

  • Certified Analytics Professional (CAP)
  • Cloudera Data Platform Generalist Certification
  • Data Science Council of America (DASCA) Senior Data Scientist (SDS)
  • Data Science Council of America (DASCA) Principal Data Scientist (PDS)
  • IBM Data Science Professional Certificate
  • Microsoft Certified: Azure AI Fundamentals
  • Microsoft Certified: Azure Data Scientist Associate
  • Open Certified Data Scientist (Open CDS)
  • SAS Certified Professional: AI and Machine Learning
  • SAS Certified Advanced Analytics Professional
  • SAS Certified Data Scientist
  • TensorFlow Developer Certificate

For more information about big data and data analytics certifications, see The top 9 data analytics certifications, and 12 data science certifications that will pay off.

Data science teams

Data science is generally a team discipline, and data scientists are the core of most data science teams. But moving from data to analysis to production value requires a range of skills and roles. For example, data analysts should be on board to investigate the data before presenting it to the team and to maintain data models. Data engineers are necessary to build data pipelines to enrich data sets and make the data available to the rest of the company.

For further insight into building data science teams, see How to assemble a highly effective analytics team and The secrets of highly successful data analytics teams.

Data science goals and deliverables

The goal of data science is to construct the means to extract business-focused insights from data, and ultimately optimize business processes or provide decision support. This requires an understanding of how value and information flows in a business, and the ability to use that understanding to identify business opportunities. While that may involve one-off projects, data science teams more typically seek to identify key data assets that can be turned into data pipelines that feed maintainable tools and solutions. Examples include credit card fraud monitoring solutions used by banks, or tools used to optimize the placement of wind turbines in wind farms.

Along the way, presentations that communicate what the team is up to are also important deliverables.
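
As a rough illustration of what a deliverable like the fraud-monitoring example above might contain, the sketch below scores transactions for anomalies with scikit-learn’s IsolationForest. The feature names and the contamination rate are hypothetical stand-ins, not a production fraud model.

    import pandas as pd
    from sklearn.ensemble import IsolationForest

    # Hypothetical transaction features assembled by an upstream data pipeline
    transactions = pd.DataFrame({
        "amount": [12.50, 9.99, 4500.00, 22.10, 18.75, 8300.00],
        "merchant_risk_score": [0.1, 0.2, 0.9, 0.1, 0.3, 0.95],
        "seconds_since_last_txn": [3600, 5400, 12, 7200, 4000, 8],
    })

    # Fit an unsupervised anomaly detector; contamination is the assumed share of fraud
    detector = IsolationForest(contamination=0.3, random_state=0)
    transactions["is_suspicious"] = detector.fit_predict(transactions) == -1

    # Downstream, flagged rows would be routed to analysts or a rules engine
    print(transactions[transactions["is_suspicious"]])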

Data science processes

Production engineering teams work on sprint cycles, with projected timelines. That’s often difficult for data science teams to do because a lot of time upfront can be spent just determining whether a project is feasible. Data must be collected and cleaned, and then the team must determine whether it can answer the question efficiently.

Data science ideally should follow the scientific method, though that’s not always the case, or even feasible. Real science takes time: you spend a little bit of it confirming your hypothesis and then a lot of it trying to disprove yourself. In business, time-to-answer is important. As a result, data science can often mean going with the good-enough answer rather than the best answer. The danger, though, is that results can fall victim to confirmation bias or overfitting.

According to computer science portal GeeksforGeeks, a typical data science process includes the following steps:

  1. Define the problem and create a project charter. A data science project charter outlines the objectives, resources, deliverables, and timeline to ensure all stakeholders are aligned.
  2. Retrieve data. Data relevant to the project could be stored in databases, data warehouses, or data lakes. Accessing that data may require navigating the organization’s policies and requesting permissions.
  3. Employ data cleansing, integration, and transformation. Data cleansing removes errors, inconsistencies, and outliers in the data. Integration combines datasets from various sources. Transformation prepares the data for modeling.
  4. Enact exploratory data analysis (EDA). This step uses graphical techniques like scatter plots, histograms, and box plots to visualize data and identify trends. This step helps in the selection of the correct modeling techniques for the project.
  5. Build models. This step involves building ML or deep learning models to make predictions or classifications based on the data.
  6. Present findings and deploy models. After completing the analysis, this step involves presenting the results to stakeholders and deploying models into production systems to automate decision-making or support ongoing analysis.
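
A compressed Python sketch of steps 2 through 6 might look like the following; the file name, column names, and choice of model are assumptions made for illustration, not a prescribed pipeline.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Step 2: retrieve data (a flat file standing in for a database or data lake)
    orders = pd.read_csv("orders.csv")

    # Step 3: cleanse and transform -- drop duplicates, fill missing values, derive features
    orders = orders.drop_duplicates()
    orders["discount"] = orders["discount"].fillna(0.0)
    orders["order_month"] = pd.to_datetime(orders["order_date"]).dt.month

    # Step 4: exploratory data analysis -- quick checks before modeling
    print(orders.describe())
    print(orders["late_delivery"].value_counts(normalize=True))

    # Step 5: build a model to predict late deliveries (hypothetical target column)
    features = ["discount", "order_month", "quantity"]
    X_train, X_test, y_train, y_test = train_test_split(
        orders[features], orders["late_delivery"], test_size=0.2, random_state=0
    )
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    # Step 6: report results to stakeholders (deployment would follow separately)
    print(classification_report(y_test, model.predict(X_test)))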

Data science tools

Data science teams make use of a wide range of tools, including SQL, Python, R, Java, and a cornucopia of open source projects such as Hive, Oozie, and TensorFlow. These tools are used for a variety of data-related tasks, ranging from extracting and cleaning data to subjecting it to algorithmic analysis via statistical methods or ML. According to the Data Science Council of America, some of the most popular data science tools include:

  • Python: A versatile programming language that’s a favorite of data scientists. It features extensive libraries for manipulating and analyzing data and implementing ML algorithms, including NumPy, pandas, seaborn, and scikit-learn.
  • R: A language and environment for statistical computing and graphics. R is an integral part of the data science toolkit, useful for data exploration, visualization, and statistical modeling.
  • JupyterLab: This web-based interactive development environment for notebooks, code, and data offers a flexible interface to configure and arrange workflows in data science and ML.
  • Excel: Microsoft’s spreadsheet software is perhaps the most extensively used BI tool around. It’s also handy for data scientists working with smaller datasets.
  • ChatGPT: This generative pre-trained transformer (GPT) has become a powerful tool for data science tasks that can generate and execute Python code, and produce comprehensive analysis reports. It also features plugins for research, math, statistics, automation, and document review.
  • TensorFlow and PyTorch: These deep learning frameworks help data scientists develop and deploy ML models in the domain of neural networks. They help data scientists perform complex tasks including image recognition and natural language processing (NLP).
  • Tableau: Now owned by Salesforce, Tableau is a data visualization tool used to create interactive and shareable dashboards.
  • Apache Spark: This unified analytics engine is designed to process large-scale data, with support for data cleansing, transformation, model building, and evaluation.
  • Power BI: Microsoft’s Power BI facilitates data gathering, analysis, and presentation.
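
To show how a couple of these tools fit together in practice, here is a small, hypothetical PySpark (Apache Spark) pass that cleanses raw sales files and aggregates revenue by product and month; the storage path and column names are assumptions, not a recommended schema.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sales-cleanse").getOrCreate()

    # Read a large CSV dataset (path and schema are illustrative)
    sales = spark.read.csv("s3://example-bucket/sales/*.csv", header=True, inferSchema=True)

    # Cleanse: drop malformed rows and obvious bad values
    clean = (
        sales.dropna(subset=["order_id", "amount"])
             .filter(F.col("amount") > 0)
    )

    # Transform and aggregate: revenue by product per month
    monthly_revenue = (
        clean.withColumn("month", F.date_trunc("month", F.col("order_date")))
             .groupBy("product_id", "month")
             .agg(F.sum("amount").alias("revenue"))
    )

    monthly_revenue.show(10)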

Juan Antonio Relaño, Chief Information Officer (CIO) of Bosch in Spain, has received the top distinction at the CIO 100 Awards Spain 2024. The newly named CIO of the Year was recognized for his work and track record at the helm of IT at the multinational specialized in technology and innovation, known globally for its commitment to sustainability and for advancing technological solutions that improve people’s quality of life. With a presence rooted in Spain for more than 100 years, Bosch continues to drive digital transformation in the country, improving efficiency and competitiveness across diverse sectors under Relaño’s leadership. “As CIO of Bosch in Spain since 2013 and a member of the management committee of Robert Bosch Iberia, I report directly to the president.”

This role, he says in an interview with CIO Spain, has allowed him to “align the group’s global goals with the strategy at the national level,” leading projects that have turned its operations into an example of innovation and efficiency. Since joining the company in 2005, after a solid career at multinationals such as Accenture and IBM, Relaño has promoted the adoption of advanced technology and the creation of a culture of innovation in which every member of the IT team is empowered to solve problems proactively.

“The most important thing when you work in technology is alignment with the business. Obviously you have to cover that nexus, but we advocate going further: you have to create a positive impact on the bottom line, and that is something we have achieved”

Challenges ahead

Among the main challenges on his agenda, he says, is designing an integrated technology strategy for all business units in Spain, which operate in very diverse sectors spanning both B2B and B2C models. This challenge, he continues, demands a holistic and adaptable vision, one capable of balancing the distinct operational and commercial needs of businesses with different dynamics, markets, and customers, without losing sight of the corporation’s global objectives.

As if that were not a big enough task, the CIO is also leading the company’s transition toward Industry 4.0, where connectivity, automation, and intensive use of data serve as fundamental pillars. “This approach includes adopting technologies such as the Internet of Things (IoT), advanced analytics, and artificial intelligence (AI) solutions to optimize business operations, reduce costs, and improve sustainability.” A key milestone, he confides, is laying the foundations for the factory of the future, a modular production model in which everything is flexible except the roof and the floor. “This vision aims to revolutionize the way we think about manufacturing, allowing our plants to be highly adaptable to market demands, product changes, and technological advances.”

Juan Antonio Relaño, CIO of Bosch

Juan Antonio Relaño, CIO of Bosch (right), with Manuel Ávalos (Cognizant).

Garpress | Foundry

Agenda and priorities

Another cornerstone of Relaño’s agenda is the Innovation Evangelist program in IT Field Services, which he coordinates globally and which spans more than 130 plants and operations. “This program drives the adoption of disruptive technologies, but it also establishes a culture of innovation within Bosch that motivates teams to anticipate needs and optimize their resources with a vision aligned with the business,” he maintains.

A firm believer in collaboration and cooperation, he has developed an open innovation program of strategic alliances with universities, business schools, startups, and NGOs that has strengthened Bosch’s commitment to the local environment and enriched its capabilities. A standout example is the AIoT chair launched jointly with the Universidad Complutense de Madrid (UCM). “This pioneering initiative explores the application of AI to IoT and has enabled the creation of technologies that strengthen both our leadership position and the link between academia and industry.” Along the same lines, in 2024 the company created a quantum sensing development center in collaboration with the European Quantum Consortium, the Community of Madrid, and UCM. The center is a research and development project whose goal is to lay the groundwork for a new generation of devices capable of unprecedented precision measurements, benefiting sectors such as automotive and advanced manufacturing.

In this scenario, the CIO explains, his responsibility has been to lead coordination between the Spanish team and the R&D teams, managing synergies and ensuring that the infrastructure meets the high technical and security standards required. This center, Relaño stresses, “not only represents a technological milestone, but is also a seed for future development and manufacturing of devices in our Madrid plants.”

Amid all this, the executive also finds time to chair the CIO Executive Council, an organization that, in his view, “reinforces my professional brand and strengthens my role as a leader in the CIO community.” From this position Relaño generates value not only for Bosch but also for the industry, fostering networking, sharing experiences, and exploring technology trends that anticipate global challenges.

Bosch’s digitalization program

The digital transformation program the CIO is driving has been “key” to the evolution of Bosch Spain, delivering improvements in processes and financial results. Its achievements include the cloud migration with SAP S/4HANA, which optimized data management and cut maintenance costs by 15%. The platform, he adds, has also accelerated processing times by 20%, increasing agility in decision-making. “Digital transformation at Bosch is not just a technology project, but a training effort that has benefited more than 500 employees.” Along these lines, the company has implemented an AI and data analytics program that has raised operational efficiency by 25% in areas such as logistics and production.

In the field of AI, one of the most talked-about technologies of recent times, the company has created an internal portal that gives employees access to generative AI for administrative and decision-making tasks. The platform, which handles everything from drafting emails to creating automatic responses to complex queries, safeguards data security. Since its implementation, Relaño notes, more than 6,000 Bosch Iberia employees have used the platform, helping to automate processes that previously required a staggering 100,000 hours of manual work per year.

In terms of efficiency, the company has achieved estimated savings of 2 million euros per year. In addition, the speed of response to internal and external requests has increased by 25%, significantly improving the satisfaction of internal and external customers. The initiative has not only optimized operational efficiency, but also democratized access to advanced AI solutions within the organization, ensuring every employee has the support needed to maximize their performance and driving an “inclusive and effective” digital transformation.

Juan Antonio Relaño, CIO of Bosch and winner of the CIO of the Year award, with members of the CIO Executive Council, which he chairs.

Garpress | Foundry

Competitive advantage

Thanks to a solid, well-defined data governance model, the company has developed and deployed advanced predictive maintenance algorithms based on AI and machine learning. “These algorithms constantly monitor the condition of equipment in our plants, identifying patterns that anticipate possible failures before they occur.” That data governance model has enabled predictive maintenance algorithms that have reduced interruptions at its plants by 30%.

This approach has not only avoided costly downtime, but also generated savings of more than one million euros in its first year of operation by optimizing maintenance interventions, minimizing unplanned repairs, and extending the useful life of the equipment.

“In line with our commitment to sustainability, we have taken advantage of digital technologies such as IoT and advanced analytics to optimize key processes in our operations.” These technologies have made it possible, for example, to dynamically adjust energy and resource consumption in real time, reduce waste on production lines, and optimize internal and external logistics routes. The effort has resulted in a 10% reduction in CO₂ emissions from the company’s industrial operations over the past year. According to the CIO, this achievement not only reinforces Bosch’s commitment to climate neutrality, but also consolidates the company as a responsible leader in its sector, setting the standard for others to adopt similar approaches.

Beyond the operational savings, the data governance model has created a platform to democratize access to key insights within the organization. This not only improves operational efficiency, but also strengthens leaders’ ability to make strategic decisions based on reliable, real-time data. In short, Relaño concludes, “we have demonstrated the value of an integrated vision of technology aligned with business strategy.”

Lenovo Korea will include on-site service as an option in its Premium Care customer support offering starting in December. The option applies to customers who select on-site service for the duration of their Premium Care coverage and is available in most regions nationwide, excluding some remote island and mountain areas.

Lenovo’s Premium Care service covers not only the Yoga and IdeaPad lines but also gaming laptop brands such as Legion and LOQ. Details on eligible products are available from the Lenovo service center.

Lenovo Korea describes the on-site service as “a differentiated Lenovo offering that maximizes speed and convenience.” When a customer files an on-site service request through the customer center, a professional engineer is assigned in real time and visits the customer’s home the next day to carry out the repair. The engineer’s real-time travel status and the service completion report can be checked via KakaoTalk notification messages.

Service requests can be made through the Lenovo customer center (1670-0088), the official website, or the “Lenovo Service” channel on KakaoTalk.

Shin Kyu-sik, head of Lenovo Korea, said, “Lenovo is committed to redefining the customer service experience by providing fast, efficient on-site service that saves customers time and minimizes inconvenience. We hope this conveys Lenovo’s sincerity in putting high-quality customer support and a satisfying customer experience first.” He added, “We will continue to set new standards of service excellence so that Korean consumers can experience the outstanding service standards of a global technology leader.”
jihyun.lee@foundryco.com

On December 11, Google made a full-fledged entry into the fiercely competitive AI coding tool market with Jules, an AI coding agent. Jules is designed to improve developers’ workflows and challenge established tools such as GitHub Copilot and Amazon Q Developer.

Competition in the market is intensifying as AI becomes an essential element for accelerating software development across industries.

According to Statista, OpenAI’s ChatGPT emerged as the most widely used AI-based tool among developers in 2024, with 82% saying they use it regularly. GitHub Copilot ranked second at 44%, and Google Gemini third at 22%.

If Google’s Jules establishes itself successfully, it could redefine how enterprises adopt and integrate AI into their development workflows.

Jules is natively integrated with the Gemini 2.0 AI model and handles repetitive coding tasks such as bug fixes and file management. Google’s thinking is that developers can use Jules to focus more on higher-priority work.

In its official blog, Google said that when developers find themselves facing a long list of bug fixes, they can hand Python and JavaScript coding tasks to Jules. “Jules works asynchronously with your GitHub workflow, taking on bug fixes and other time-consuming tasks so developers can focus on the builds they actually want to work on,” the company wrote.

With features such as detailed multi-step planning, code changes across multiple files, and preparation of pull requests for GitHub integration, Jules could be an effective tool for complex projects and for collaboration in large development teams.

According to Google, Jules also provides real-time progress tracking, which lets developers prioritize tasks and identify areas that need immediate attention. Developers can review the plans Jules generates, request changes, and inspect the generated code before merging it, balancing automation with quality control.

Google’s strategy

With many developers now using AI coding tools, AI-based development solutions are spreading rapidly. Google’s Jules appears to be a strategic response to that market demand.

Jules aims to carve out a distinctive position in a competitive and growing field by combining automation with transparency and control.

Manukrishnan SR, a director at consulting firm Everest Group, said, “The market for AI-based coding tools is currently dominated by big tech offerings such as GitHub Copilot and Amazon Q Developer,” adding, “Gemini’s code generation capabilities have so far fallen short of OpenAI’s products, including ChatGPT, so it remains to be seen whether Gemini can beat OpenAI in this space.”

Despite these challenges, analysts say Google’s vast ecosystem gives Jules a strong foundation.

Neil Shah, partner and co-founder of research firm Counterpoint Research, said, “Gemini arrived late, but Google has the largest developer and code base, and it can change how coding is done not only for Android but also for AI-centric code bases in languages like Python and JavaScript.” He added, “Gemini will spread through the market faster than competing products, which will accelerate improvements in Jules’ performance and capabilities and provide major momentum for the advancement of Google Gemini 2.0.”

Shah also pointed to Google’s ability to extend Gemini 2.0 technology through its own applications used by billions of people every day, such as Search, Android, G Suite, Maps, and YouTube, as well as through its network of external developers building on its platforms, as a key strength.

This reach puts Google in a favorable position to spread Jules and related AI technologies faster than its competitors.

Impact on enterprises

AI-based coding tools such as Jules have the potential to fundamentally change how software is developed, shifting the development paradigm from hand-written code to AI-assisted software management.

Industry experts note that this shift could have a major impact on enterprise workflows, particularly for managing large projects.

Manukrishnan predicted, “For enterprises managing large projects, especially those using mainstream modern programming languages such as Python or JavaScript, developer productivity and experience could change significantly. For legacy languages such as COBOL, however, productivity gains will be limited by the scarcity of training data.”

Enterprises may nevertheless face challenges in integrating tools like Jules into existing workflows. Maintaining corporate coding standards and ensuring consistent code quality are cited as key hurdles.

“AI coding tools are useful for new application development, but they often struggle to prove their value in large modernization projects with complex integration requirements,” Manukrishnan said. “The potential of AI-based coding tools is clearly significant, but enterprises adopting them must carefully weigh the challenges associated with legacy systems and strict integration requirements. That balance will determine how these tools evolve within the enterprise.”
dl-ciokorea@foundryco.com

According to a recent AI readiness survey released by Capital One, nearly 90% of business leaders said their data ecosystems are ready to build and deploy AI at scale. Yet 84% of IT practitioners, including data scientists, data architects, and data analysts, reported spending at least an hour every day resolving data problems. The survey found that 70% of IT professionals spend one to four hours a day on data issues, and 14% spend more than four hours.

John Armstrong, CTO of Worldly, a supply chain sustainability data insights platform, said the findings show that many business leaders fundamentally misunderstand the data work required to deploy most AI tools.

“There’s a view that if you just throw enough data at AI, every problem gets solved,” Armstrong said. “The role of the technology leader is to educate the organization about what’s possible and what it takes to reach its goals.”

He said the persistent misunderstanding of AI’s data management requirements is telling, and that every IT leader he talks to is under pressure to adopt AI. “It matters enormously, because if you don’t do it right, the organization ends up spending literally millions of dollars on a solution set that produces the wrong results,” Armstrong said.

Misconceptions about AI capabilities

Justice Erolin, CTO of software outsourcing provider BairesDev, said the survey illustrates a classic perception gap within organizations.

“Executives are captivated by AI’s possibilities through pilot projects and presentations,” Erolin said. “But they don’t always grasp what it takes to apply it to day-to-day operations. That’s where the friction arises.”

Business leaders’ confidence tends to focus on AI models and algorithms, he said, “without accounting for the messy groundwork, like data quality and integration with existing systems.” A successful pilot or a well-performing algorithm can give business leaders false hope. “The bigger picture can tell a different story,” Erolin added.

For example, Erolin noted that one BairesDev client was shocked to learn it would need to spend 30% of its AI project timeline on integration with existing systems, an illustration of how many companies underestimate the scale and duration of the groundwork AI adoption actually requires.

Some groundwork to resolve data issues before an AI project is to be expected, he said, but if employees are continually spending hours every day fixing data problems, it can be a warning sign that the organization’s data isn’t AI-ready. He added that an AI-ready organization should be able to automate some of its data management work.

“If you’re spending that much time on basic data operations and cleansing, you’re not freeing up your domain experts for bigger strategic work,” Erolin said.

Legacy system problems

Rupert Brown, founder and CTO of compliance solutions provider Evidology Systems, pointed to legacy systems with limited data collection and storage as part of the problem. Some industries still rely on software and middleware that was never designed to collect, transmit, and store data the way today’s AI models require.

“Data quality is the issue that will limit the usefulness of AI technology for the foreseeable future,” Brown said. “The industry is still full of legacy systems with limited input data fields or that force account numbers to be reused, which produces corrections AI cannot make sense of.”

To close the gap between inflated expectations and low data readiness, Erolin said, CIOs and IT leaders should focus on transparency and collaboration. BairesDev, he noted, concentrates on educating non-technical stakeholders about the realities and challenges of implementing AI.

“Once executives understand the real challenges and the time technical teams spend solving them, they’re more likely to invest in strong data practices and adjust their expectations,” Erolin said. “Getting everyone on the same page is what matters.”

Terren Peterson, vice president of data engineering at Capital One, acknowledged the disconnect between business leaders’ expectations and IT practitioners’ experience, but said it could actually be an opportunity to secure the resources needed to fix long-standing data problems. He explained that Capital One ran the survey, as it evaluates AI tools to improve customer service, to understand other companies’ AI readiness and the difficulties they face during implementation.

“Data hygiene, quality, and security are all topics we’ve been talking about for the past 20 years,” Peterson said. “AI and ML could be the catalyst that raises attention to the data fundamentals.” The AI revolution may deepen understanding of why data quality matters, he said, predicting that “even if it has been lower on many CIOs’ to-do lists, it will now move up the priority order.”

Start with small prototypes

While the current hype has many business leaders fixated on deploying generative AI, Worldly’s Armstrong recommended that IT leaders focus on use cases rather than on any particular AI technology. For some use cases, older AI techniques such as machine learning or neural networks may be a better and cheaper fit. He added that generative AI consumes enormous amounts of energy compared with other AI tools.

Armstrong also recommended that CIOs start with small prototypes to find the AI use cases that best fit their companies, and accept that some experiments won’t succeed.

“The experiments don’t need to be huge,” he said. “It’s about getting familiar with the technology. Experiments open up possibilities. If I had one piece of tactical advice, it would be not to rush to productize, but to invest slowly and steadily.”

“You want to build up a body of knowledge,” he continued. “Everyone wants fast development, but nobody wants to fail. That’s the hypocrisy of our industry. Try it, keep going if it works, and learn from it if it doesn’t.”
dl-ciokorea@foundryco.com

Android XR is a platform optimized for VR, AR, and XR headsets and smart glasses, distinguished by its integration with Google’s core services. Users can watch YouTube on a large virtual screen within XR and view Google Photos pictures and videos with added depth. Google Maps lets them experience real streets and buildings in 3D, and the Chrome browser has been optimized for the XR environment.

Existing Google Play apps and games can also run on Android XR, and Google plans to expand the developer ecosystem around the platform in 2025 to deliver dedicated apps, games, and immersive content.

With the announcement, Google hinted at relaunching a product similar to Google Glass, which debuted in 2013 and was discontinued in 2023. The company did not specify whether it will develop the hardware itself, but said it is “currently testing prototype glasses running Android XR with a small group of users.”

Smart glasses running Android XR will give users access to services based on the Gemini AI model with a single touch. The idea is to surface useful information on the glasses, through features such as navigation, translation, and message summaries, without needing a smartphone.

To expand the developer ecosystem, Google said it will strengthen support for development tools including ARCore, Android Studio, and Unity. It will also support the development of a range of XR devices through collaboration with Qualcomm partners such as Lynx, Sony, and XREAL, as well as with Magic Leap.

Samsung Electronics is also participating as a key partner. Samsung said it is developing an XR headset codenamed Project Moohan, which will use the Android XR platform and Gemini.
jihyun.lee@foundryco.com