Like most CIOs, you’ve no doubt leaned on ROI, TCO and KPIs to measure the business value of your IT investments. Maybe you’ve even exceeded expectations on each of these yardsticks.

Those Three Big Acronyms are still important for fine-tuning your IT operations, but success today is increasingly measured in business outcomes. Put another way: Did you achieve the desired results for your IT investments?

For more than a decade, IT departments have derived business value from cloud computing, whether public, private or hybrid. More recently, concerns about the public “cloud-first” approach have emerged, challenging business value and dragging down ROI, TCO and KPIs. Those concerns have also pulled back the curtain on a critical reality: IT estates are far more complex than they used to be.

A more thoughtful approach to procuring and managing assets is needed to clear the hurdles posed by those diverse estates. To understand how to get there, it helps to first unpack how we got here.

When Diminishing Returns Become Budget Busters

For years enterprises scrambled to build applications in public cloud environments; there was legitimate business value in rapid innovation, deployment and scalability, as well as unfettered access to more geographical regions.

“Cloud-first strategy” became a cure-all for datacenter impediments, as well as an IT leader’s tentpole for digital transformation.

More recently, some organizations have reported diminishing returns from their public cloud implementations. Some companies calculated savings after moving workloads from public clouds back on-premises, a shift known as cloud repatriation. Others conducted apples-to-apples comparisons of public cloud versus on-premises costs.

In some instances, poor implementation and faulty configurations were the culprits behind deteriorating ROI, TCO and KPI values. Collectively, these factors have dulled the initial sheen of agility and innovation around the public cloud.

The reality is that the decision to put applications in the public cloud or on on-premises systems is not an either-or argument; rather, it requires a nuanced conversation, as consultant Ian Miell points out in this sober assessment.

Smart Workload Placement is Key

Miell is right. The real argument about where to allocate applications to generate business value comes down to the most appropriate location for each workload. Because, again, IT environments are far more complex these days. They’ve become multicloud estates.

To accommodate an accrual of disparate applications, you’re likely running a mix of public (probably more than one) and (maybe) private clouds in addition to your traditional on-premises systems. You might even operate out of a colo facility for the benefits cloud adjacency affords you in reducing latency. Maybe you manage edge devices, too.

Workload placement depends on several factors: performance, latency, costs, and data governance rules, among other variables. How, where and when you opt to place workloads helps determine the business value of your IT investments.

For example, you may elect to place a critical HR application on-premises for data locality rules that govern in which geographies employee data can run. Or perhaps you choose to offload an analytics application to the public cloud for rapid scalability during peak traffic cycles. And maybe you need to move an app to the edge for speedier data retrieval.
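To make those tradeoffs concrete, here is a minimal, hypothetical sketch of rule-based placement in Python. The factors, thresholds, and locations are illustrative assumptions, not a prescribed model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Workload:
    name: str
    data_residency: Optional[str]  # e.g., "EU" if governed data must stay put
    max_latency_ms: int            # latency budget for end users
    bursty: bool                   # does demand spike unpredictably?

def place(w: Workload) -> str:
    """Toy placement policy: governance first, then latency, then elasticity."""
    if w.data_residency is not None:
        return "on-premises"       # keep governed data in a controlled location
    if w.max_latency_ms < 10:
        return "edge"              # tight latency budgets favor edge placement
    if w.bursty:
        return "public cloud"      # elastic capacity absorbs traffic spikes
    return "colo / private cloud"  # steady-state workloads run here cheaply

for w in (Workload("hr-app", "EU", 100, False),
          Workload("analytics", None, 200, True),
          Workload("telemetry-ingest", None, 5, False)):
    print(f"{w.name} -> {place(w)}")
```

A real policy would weigh many more variables and revisit placements as costs and usage change, but the shape of the decision is the same.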

Of course, achieving business value via strategic workload placement isn’t a given. You can’t simply set workloads and forget them.

As you navigate the intricacies of workload placement, you face many challenges:

- Economic uncertainty (the market is whipsawing)
- A deficit of IT talent (do you honestly recall a time this wasn’t an issue?)
- Abundant risk (data resiliency, cybersecurity, governance, natural disasters)
- Other disruptions that threaten to crimp innovation (long IT procurement cycles and slow provisioning of developer services)

You can try to tackle those challenges piecemeal, but you’ll get more value if you take an intentional approach to running workloads in their most optimal locations. This planning is part of a multicloud-by-design strategy that will enable you to run your IT estate with a modern cloud experience.

A Cloud Experience Boosts Business Value

As it happens, an as-a-Service model can help deliver the cloud experience you seek.
For instance, developers can access the resources needed to build cloud-native applications via a self-service environment, freeing your staff from racking and stacking, provisioning and configuring assets so they can focus on other business-critical tasks.

To help you better align cost structure with business value, pay-as-you-go consumption reduces your reliance on lengthy IT procurement processes. This cloud experience will also help you reduce the risk of unplanned downtime, latency and other issues that threaten the performance and availability SLAs aligned to your needs.

Leveraging such a model in conjunction with trusted partners, IT departments can reduce overprovisioning by 42% and support costs by up to 70%, as well as realize a 65% reduction in unplanned downtime events, according to IDC research commissioned by Dell.[1]

The Dell Technologies APEX portfolio of services can help you successfully manage applications and data spanning from core datacenters to the edge, as well as the mix of public and private clouds that comprise your multicloud environment. This will help you achieve the business outcomes you seek.

Regardless of where you opt to run your assets, doing so without a modern cloud experience is bound to leave business value languishing on your (or someone else’s) datacenter floor.

Learn more about our portfolio of cloud experiences delivering simplicity, agility and control as-a-Service: Dell Technologies APEX.

[1] The Business Value of Dell Technologies APEX as-a-Service Solutions, Dell Technologies and IDC, August 2021

Cloud Management

Every business leader wants to be the next hero, praised for sharpening the corporate competitive edge. Business heroes are the ones who solve big problems, leveraging emerging technology to awaken new powers that accelerate strategic outcomes. So, why not use artificial intelligence (AI) to step into your higher potential by automating a system that drives more dollar value out of your corporate IT investments?

It’s time to get more value out of accelerated innovation

Thanks to years of accelerated innovation, businesses of all sizes are capitalizing on the agility of digital services and remote work. But at what cost? The challenge today: how efficient and sustainable is your IT spending after three years of unrestrained growth? Business heroes might even be taking on the task of curing a corporate digital transformation hangover. Consider that:

- While 78% of companies have adopted the cloud, not all are seeing value from their investment. Cloud overspending can run as high as 70%, according to Gartner.
- Roughly 29% of cloud investments go to waste, according to an upcoming CIO study.
- Shadow IT can consume 30-40% of IT budgets, and it’s not uncommon for companies to have 10-20 times more cloud applications than they anticipate, especially following panic-stricken video conferencing purchases.

Today, IT investments happen at warp speed, and afterward business leaders are expected to govern those investments, normalizing them into the company’s standards of operational excellence. That requires applying security protections to cloud investments and remote work, taking stock of all IT resources after widespread changes, putting checks and balances in place to manage new assets, and realigning spending with business goals.

While anyone can achieve these goals, only those who can automate them will be celebrated as a hero. But as we all know, a hero can’t win the big prize without first going on a journey.

AI: the journey to intelligent IT expense optimization

AI is making intelligent automation the new business standard. In IT, machine learning and behavioral analytics are no longer used only for making sense of security threat data or predicting network service outages. They are now being applied to address the problems of IT cost control, vendor management, and administration burdens surrounding today’s highly distributed business ecosystems.

Much like a robot vacuum learns the layout of your house, AI-powered analytics can be used to study the entire IT environment alongside its associated services and expenses, correlating this information with usage data. Tracking cloud infrastructure, network connections, mobile devices, and their services generates granular data intelligence, allowing AI engines to “understand” current IT spending trends and “see” how effectively a company uses its existing investments. Let’s look at one example.
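As a toy illustration of that correlation step, the following sketch flags months where a service’s billed cost diverges sharply from its own history. The billing data, columns, and two-sigma threshold are hypothetical:

```python
import pandas as pd

# Hypothetical monthly billing feed: one row per service per month
bills = pd.DataFrame({
    "service": ["isp-a"] * 6 + ["saas-b"] * 6,
    "month":   list(range(1, 7)) * 2,
    "cost":    [900, 920, 880, 910, 1900, 905,   # isp-a spikes in month 5
                200, 210, 205, 195, 200, 650],   # saas-b spikes in month 6
})

def flag_anomalies(df: pd.DataFrame, z_cut: float = 2.0) -> pd.DataFrame:
    """Flag charges more than z_cut standard deviations from a service's mean."""
    stats = df.groupby("service")["cost"].agg(["mean", "std"])
    df = df.join(stats, on="service")
    df["z"] = (df["cost"] - df["mean"]) / df["std"]
    return df[df["z"].abs() > z_cut][["service", "month", "cost", "z"]]

print(flag_anomalies(bills))
```

Production systems layer usage data, contracts, and forecasting on top, but spotting the spend that doesn’t match the trend is the starting point.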

IT service sprawl: championing vendor management with AI

AI is solving the problems of vendor management and provider sprawl, issues all too familiar to IT leaders handling an ever-expanding landscape of tools, services, and dashboards. In fact, it’s assisting with some of the downsides of software-defined wide area network (SD-WAN) and Secure Access Service Edge (SASE) investments, when companies suddenly find themselves with an overabundance of internet service providers (ISPs) to manage. AI works to eliminate the manual work of handling hundreds of ISPs and other sprawling IT service providers.

How does it work? Advanced analytics observe network services, connectivity usage, and the costs of global links across multiple vendors, allowing IT leaders to make quick sense of highly complex telecommunications environments. With a mountain of data crunched across ISPs, voice, and all fixed wireline services, companies gain contextual clarity into how they use their network services, all in one consolidated view.

AI-powered telecom expense optimization can:

- Eliminate the time-consuming need for administrators to collect, review, and correlate expansive datasets, including inventories of services, providers, contracts, and service level agreements.
- Evaluate the usage activities of all network services in one dashboard, identifying unused assets, pinpointing billing inaccuracies, and streamlining the process of chasing down credits when network service providers fail to fulfill their SLA commitments (one such check is sketched below).
- Prevent telecom service disruptions by automating complex invoice validation and approval processes to pay bills on time and accurately allocate IT costs across business units and departments.
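Here is a minimal, hypothetical sketch of the unused-asset check from the list above: join a circuit inventory against usage records and flag circuits with no recorded traffic. The fields and the zero-usage rule are illustrative assumptions:

```python
import pandas as pd

# Hypothetical inventory of contracted circuits and their monthly cost
inventory = pd.DataFrame({
    "circuit_id":   ["c1", "c2", "c3", "c4"],
    "provider":     ["isp-a", "isp-a", "isp-b", "isp-c"],
    "monthly_cost": [450.0, 450.0, 1200.0, 300.0],
})

# Hypothetical usage records exported from network monitoring
usage = pd.DataFrame({
    "circuit_id":     ["c1", "c3"],
    "gb_transferred": [512.4, 10881.0],
})

# Left-join inventory to usage; circuits with no usage rows are cancellation candidates
merged = inventory.merge(usage, on="circuit_id", how="left")
idle = merged[merged["gb_transferred"].isna()]

print("Idle circuits and recoverable spend:")
print(idle[["circuit_id", "provider", "monthly_cost"]])
print(f"Total recoverable per month: ${idle['monthly_cost'].sum():,.2f}")
```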

This is one way AI automates IT expense optimization. Let’s explore the others.

Cloud optimization: awakening the powers of AI and closed-loop automation

Everyone is migrating to the cloud, and AI engines designed to automate cloud cost savings have two unique capabilities worth highlighting.

The first important distinction is AI’s ability to recommend solutions for the problems it recognizes. Big data insights and problem identification are the advantages of yesterday—actionable recommendations and automated problem solving are today’s biggest AI benefits. For example, AI can observe your corporate cloud infrastructure services and cloud application investments, essentially guiding you in how to use what you already own more efficiently.

An AI engine might recommend how to:

- Optimize cloud service provider contracts, using long-term discounts to lower costs.
- More efficiently use the cloud storage and servers you have in place.
- Get more savings out of pausing features, turning off services when they aren’t needed.
- Reduce redundant applications, consolidating providers to lower IT expenses.
- Pinpoint unused application licenses, helping reallocate resources to other users.
- Identify security risks associated with unsanctioned cloud applications.

The second important distinction: AI’s ability to act automatically on its own recommendations. This is the signature of advanced AI capabilities known as “closed-loop automation.” Not only can AI recognize the problem alongside the solution, but it can also make that solution a reality with just the click of an approval button. Tight integration makes this possible: only when AI engines are connected to the cloud service delivery platform can they manipulate settings and make changes to the control panel on your behalf.

Closed-loop automation marks the moment when AI advances from a data intelligence service to a virtual assistant, doing the more meaningful work of actually solving the core problem.

Using the cloud cost optimization examples from above, here’s what closed-loop automation looks like in a real-world scenario:

1. AI engine: Recommends using cloud infrastructure pausing features for the IT development environment because resources are only used during business hours.
2. IT engineer: Clicks approve.
3. AI engine: Uses API calls to implement changes inside your cloud service dashboard (inside the AWS environment).
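In code, that final step might look like the following minimal sketch. The boto3 calls are standard EC2 APIs, but the env=dev tagging convention and the schedule are illustrative assumptions:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Find running instances tagged as development resources (hypothetical convention)
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:env", "Values": ["dev"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instance_ids = [
    inst["InstanceId"]
    for res in reservations
    for inst in res["Instances"]
]

# After the engineer clicks approve, pause the dev environment for the night
if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} dev instances: {instance_ids}")
else:
    print("No running dev instances found.")
```

A scheduler (a cron job or an EventBridge rule, say) would invoke this after approval; restarting the environment at the start of business hours follows the same pattern with start_instances.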

This is the type of automation that gets business leaders crowned heroes. Automated problem solving is the true digital advantage because it accelerates business outcomes. Let’s face it: every business hero knows that nothing stops innovation in its tracks like the moment when a computer-automated workflow gets handed back to a human, essentially asking the employee to take it from there.

Arriving at automated IT expense optimization  

After accelerated innovation, harnessing information across the IT ecosystem is harder than it was just three years ago, and AI is the best tool for smarter resource allocation and tighter cost control.

The first step for business heroes is to apply advanced analytics to cloud and network services, so AI engines can start to understand what’s happening inside the IT environment. The key is to align AI with your strategic cost-savings initiatives, knowing which data streams coincide. Once AI can quickly recognize spending patterns and discrepancies between service usage and costs, it becomes easier to advance into automated problem solving with closed-loop automation.

Worried about how to get started?

Start with any functional area plagued by a combination of complex data and manual administrative processes, and lean on IT expense management providers to usher in AI-powered platforms that simplify implementation through software and services. If you have a vast landscape of global IT services to cost-optimize, look for a partner that can integrate with hundreds of IT service providers across the globe. The best expense optimization teams bring a library of IT spending insights, understanding the latest pricing information as well as how companies should shift their IT investments in response to economic pressures, remote work, and new technology trends.

In the end, the business leaders crowned true heroes are the ones who save 15-40% of their IT costs by automating expense optimization. In doing so, they also help their companies spend less money on the tools needed simply to run the business and more on digital innovation.

To learn more about IT expense and asset management services, visit us here.  

Digital Transformation, Endpoint Protection, Master Data Management, Remote Access Security, Security Infrastructure

Enterprises worldwide are not tapping the potential of their data when making critical business decisions and navigating uncertain macroeconomic conditions, according to a Salesforce survey.

Some 67% of the 10,000 business leaders polled globally are not using data to set pricing in line with economic conditions such as inflation, according to the Untapped Data Research survey.

Only 29% of these leaders are using data to set strategy when launching products or services in new markets, and just 17% are using data to achieve their climate goals, according to the survey. Just 21% of the survey respondents said they are using data to make decisions about their company’s diversity goals.

The lack of data utilization is happening even though 80% of the leaders said that data is critical to decision making and 73% said that data reduces uncertainties.

The business leaders who were polled also believe that data, leveraged correctly, can help generate more efficiency and trust in their organizations, according to the survey. Some 72% of these leaders said that data keeps people focused on the things that matter and that are relevant to the business.

In addition, more than 66% of the executives surveyed said that they think data can help minimize the influence of personal opinions or egos in a business conversation.

Data deluge sparks operational challenges

The volume of data generated and the lack of knowledge to operationalize or utilize it in the most effective way are impediments to tapping the potential of enterprises’ data reserves, according to survey respondents.

“While 80% of business leaders say data is critical in decision-making, 41% cite a lack of understanding of data because it is too complex or not accessible enough. What’s more, one-third of leaders said they lack the ability to generate insights from data,” Francois Ajenstat, chief product officer at Tableau, wrote in a blog post.

Salesforce acquired visual analytics software provider Tableau in August 2019.

In addition to the impediments cited by Ajenstat, the volume of data generated globally is expected to more than double by 2026, adding to more complexities for enterprises, according to the study.

Investing in data literacy skills could be the solution

Enterprise leadership teams can work to eliminate these impediments by investing in data literacy programs for employees and weaving a data culture into the fabric of the enterprise, according to Ajenstat.

“If a company doesn’t yet have a data culture, then they need to invest in platforms that allow them to turn repeatable processes into core capabilities,” Ajenstat said, adding that data literacy programs should be offered to all employees.

The proliferation of generative AI and natural language processing will break down learning barriers for employees, Ajenstat said.

“These innovations are giving non-data people the confidence to make an informed decision and act on it,” Ajenstat wrote.

Data Management

Understanding the student lifecycle isn’t easy. With more higher education institutions attempting to embrace digital learning, there is a growing need for visibility throughout the student journey. By gathering data across every student, faculty and alumni touchpoint, institutions can optimise each stage of the admission and onboarding process. 

The appetite for insights among higher education institutions is such that the global big data analytics in education market is expected to grow from a value of $18.02 billion in 2022 to reach $36.12 billion by 2027.

Unfortunately, many institutions remain reliant on legacy solutions with siloed data, which introduce numerous ad hoc manual tasks that slow the process of attracting and nurturing prospects. 

Automation will play a key role in enabling providers to implement a data-first approach – and better support prospects and recruitment faculties to ensure the student lifecycle runs as smoothly as possible. 

The problem with legacy tools in higher education 

Most higher education institutions today rely on legacy middleware they are familiar with but that fails to offer visibility across the student lifecycle. These solutions make it difficult to access student records, accommodation and financial data, and third-party or cloud platforms. 

Data is also isolated and siloed in on-premises solutions, making it difficult to generate insights and optimise the student experience. 

In order to generate concrete insights, data needs to be collected at the edge of the network and across campus and fed into a centralised analytics solution. There it can be processed to develop insights into how to improve operations over the long term.

How Boomi addresses these challenges 

The answer for these organisations is to undergo digital transformation by migrating datasets to the cloud. Ultimately, this will generate concrete insights to enhance the experience for students and faculties. 

While this transition is already underway, with 54.3% of higher education institutions reporting they were cloud-based in 2021, there are many that still need to migrate to the cloud. 

Integration platform as a service (iPaaS) solutions like the Boomi AtomSphere Platform can help enable this transition by unifying application data to ensure insights are accessible throughout the environment via a single cloud platform. 

Essentially, Boomi offers organisations the ability to connect data from a variety of sources, helping with the process of migrating data to the cloud and connecting data sources wherever they may be.  

Connecting data allows decision makers to generate the insights needed to make faster admissions decisions and to streamline the onboarding experience for prospects and recruitment faculties. 

The easy way to move to the cloud 

Boomi has emerged as a key provider in enabling higher education institutions to move to the cloud. Boomi supports Amazon Web Services (AWS) data migration and application modernisation to link data, systems, applications, processes, and people together as part of a cohesive ecosystem. 

This approach enables higher education institutions to leverage a growing number of services through AWS, simplify data pipelines and improve transparency for decision makers. 

Ultimately, by providing decision makers with access to high-quality data, institutions will not only improve the student experience but also become more cost-efficient by maximising retention.

To find out more about Boomi click here.


Education and Training Software, Education Industry

The successful journey to cloud adoption for Banking, Financial Services, and Insurance (BFSI) enterprises cannot be completed without addressing the complexities of core business systems. Many businesses have been able to migrate corporate support systems, such as ERP and CRM, as well as IT security and infrastructure systems, to the public cloud. However, security concerns, legacy architecture, country-specific regulations, latency requirements, and transition challenges continue to keep core systems out of the cloud.

BFSI enterprises will be unable to realize the cloud’s full potential until their core business systems use cloud platforms and services. Firms are looking for solutions that allow them to continue operating out of their own data centers while also gaining access to shared cloud infrastructure delivered into those same data centers.

To address these challenges, leading cloud service providers have launched hybrid integrated solution offerings that allow enterprises to access cloud services from their own data centers via shared infrastructure provided by the cloud providers. These offerings allow enterprises to deploy applications either on the shared cloud infrastructure or in their own data centers without having to rewrite code.

Enterprises have two options: run applications directly on the cloud or run computing and storage on-premises using the same APIs. To provide a consistent experience across on-premises and cloud environments, the on-premises cloud solution is linked to the nearest cloud service provider region. Cloud infrastructure, services, and updates, like public cloud services, are managed by cloud service providers.
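As a minimal sketch of that consistency, assuming an existing VPC and a provisioned Outpost, the subnet below is created on the Outpost rack, yet the instance launch call is identical to one targeting a public AWS region. All ARNs and IDs are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")

# Create a subnet that lives on the on-premises Outpost rack (placeholder IDs)
subnet = ec2.create_subnet(
    VpcId="vpc-0123456789abcdef0",
    CidrBlock="10.0.128.0/24",
    OutpostArn="arn:aws:outposts:ap-south-1:111122223333:outpost/op-0123456789abcdef0",
    AvailabilityZone="ap-south-1a",  # the AZ the Outpost is anchored to
)["Subnet"]

# Launch an instance with the standard EC2 API; only the subnet placement differs
instance = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="m5.large",           # must be a type provisioned on the Outpost
    SubnetId=subnet["SubnetId"],
    MinCount=1,
    MaxCount=1,
)["Instances"][0]

print("Launched on Outpost:", instance["InstanceId"])
```

Because the APIs match, automation and tooling built for the public region carry over to the on-premises footprint unchanged.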

AWS Outposts is a leading hybrid integrated solution that provides enterprises with seamless public cloud services in their own data centers. Outposts is a managed AWS service that includes computing and storage. It gives enterprises the option to stay close to the data center, since many BFSI core systems require very low latency as well as proximity to the ecosystem of business applications residing in on-premises data centers.

AWS Outposts will deliver value to BFSI enterprises

Several BFSI enterprises use appliance-based databases for high-performance, high-availability computing. In the short and medium term, it is unlikely that these enterprises will migrate their appliance-based databases to the cloud; however, AWS provides an option to run these systems on Outposts while keeping the databases on the appliances. Outposts also assists in the migration of databases from expensive, proprietary operating systems and hardware to more cost-effective hardware options.

Other use cases, such as commercial off-the-shelf BFSI products that require high-end servers, can be easily moved to AWS Outposts, lowering the total cost of ownership. As a strategy, legacy monolithic core applications that require reengineering can be easily moved to AWS Outposts first and then modernized incrementally onto the public cloud.

A unified hybrid cloud system is the way forward for BFSI enterprises

AWS Outposts offers BFSI enterprises a solution that combines public and private infrastructure, consistent service APIs, and centralized management interfaces. The service can assist BFSI enterprises in dealing with the many expensive appliance-based core systems that run on proprietary vendor-provided operating systems and hardware.

AWS Outposts will allow BFSI enterprises to gradually migrate to the public cloud while maintaining core application dependencies. AWS Outposts enables a true hybrid cloud for BFSI enterprises.

Author Bio

TCS

Ph: +91 9841412619

E-mail: asim.kar@tcs.com

Asim Kar has 25+ years of IT experience spanning the execution of large-scale transformation programs and the running of technology organizations in the BFSI migration and reengineering space. He currently heads the cloud technology focus group in BFSI and leads complex transformation projects in traditional technologies across telecom, insurance, banking, and financial services programs.

To learn more, visit us here.

Hybrid Cloud

Ask anyone who’s lost an online auction or abandoned a shopping cart: in today’s hybrid and multicloud world, speed matters for transactions and business decisions, and nanoseconds count.

As cloud migration continues apace, accessing and using the data that runs and informs your applications has become a challenge for organizations of all sizes. Cloud search company Elastic takes the challenges of data availability, observability, and security head on.

Defining the challenge

For a typical data team, 80% of time is spent on data discovery, preparation, and protection, and only 20% of time is spent on actual analytics and getting to insight, says IDC. Data silos, legacy data management tools and skill sets, and the impact of the COVID-19 pandemic all have hobbled organizations’ efforts to unify, share and analyze data for forecasting and better business decisions.

“It’s astonishing how much inefficiency exists across the industry,” says Brian Bergholm, Senior Marketing Manager in the Cloud Product Marketing team at Elastic. “In fact, based on some recent survey work that Elastic conducted with Wakefield Research, we found that 81% of knowledge workers say they have a hard time finding documents under pressure.”[1] 

That inefficiency has some hard costs associated with it, Bergholm says, which are manifested in three trends noted by Elastic.

“One, information is becoming harder to find, and this inability to find information actually costs the average enterprise $2.5 million per year,” he says. “Second, enterprise IT is becoming harder to keep performant, and system downtime costs the average enterprise $1.5 million per hour. Third, cyber threats are becoming harder to prevent, and these data breaches cost the average enterprise $3.8 million per incident. Adding these all together accrues to millions of dollars of unnecessary costs.”

Components for data success

To tackle these costly inefficiencies, organizations are turning increasingly to an integrated approach as opposed to a suite of point solutions. Bergholm points to three advantages of an integrated solution: speed, scale, and relevance.

“From a speed standpoint, you can find matches in milliseconds within both structured and unstructured datasets,” says Bergholm. “You can scale massively and horizontally across literally hundreds of systems, and the most important aspect is that you can generate highly relevant results and actionable insights from your data.”
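As a small taste of what that looks like in practice, here is a minimal sketch using the official Elasticsearch Python client to mix full-text and structured criteria in one query. The endpoint, index, and fields are hypothetical:

```python
from elasticsearch import Elasticsearch

# Connect to an Elastic deployment (placeholder endpoint and API key)
es = Elasticsearch(
    "https://my-deployment.es.us-central1.gcp.cloud.es.io:443",
    api_key="PLACEHOLDER_API_KEY",
)

# Search unstructured and structured fields together (hypothetical index)
resp = es.search(
    index="orders",
    query={
        "bool": {
            "must": [{"match": {"notes": "late delivery"}}],   # free text
            "filter": [{"range": {"total": {"gte": 100}}}],    # structured
        }
    },
    size=5,
)

print(f"Matched {resp['hits']['total']['value']} docs in {resp['took']} ms")
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("order_id"))
```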

Integrated solutions also increasingly take advantage of technologies like artificial intelligence and machine learning, he says.

“We’ve also built machine learning in so it’s in the suite, and these capabilities can be leveraged across all three solution areas: search, observability, and security.”

The security mandate

When talking about data challenges, security is a prime consideration. There are two sides to the coin: data security, and leveraging data for security intelligence.

First, any search solution itself must be secure and compliant. “Elastic takes data sovereignty very seriously,” says Bergholm. “We’ve invested to ensure Elastic is operating in compliance with the principles of GDPR, and, in fact, Elastic Cloud is available in 17 Google Cloud regions. This allows you to place applications where the data lives and supports local data sovereignty and governance requirements.”

Second, an integrated search approach can be applied to the security data that’s collected routinely by organizations.

“By using advanced search analytics, you can leverage petabytes of data, enriched with threat intelligence, to glean the insights you need to protect your organization,” says Bergholm. “Search also helps mitigate cyber threats, exposing unfolding attacks by correlating diverse data. We use machine learning algorithms, natural language processing capabilities and other tools to better understand context and meaning from a wider array of data types and formats, and all of this helps your SecOps teams quickly identify issues.”

Google Cloud Platform, Managed Cloud Services

The meager supply and high salaries of data scientists have led many companies to a decision totally in keeping with artificial intelligence: automate whatever is possible. Case in point is machine learning. A Forrester study found that automated machine learning (AutoML) has been adopted by 61% of data and analytics decision makers in companies using AI, with another 25% of companies saying they’ll do so in the next year. 

Automated machine learning (AutoML) automates repetitive and manual machine learning tasks. That’s no small thing, especially when data scientists and data analysts now spend a majority of their time cleaning, sourcing, and preparing data. AutoML allows them to outsource these tasks to machines to more quickly develop and deploy AI models. 

If your company is still hesitating to adopt AutoML, here are some very good reasons to deploy it sooner rather than later.

1. AutoML Super Empowers Data Scientists

AutoML transfers data to a training algorithm. It then searches for the best neural network for each desired use case. Results can be generated within 15 minutes instead of hours. Deep neural networks in particular are notoriously difficult for a non-expert to tune properly. AutoML automates the process of training a large selection of deep learning and other types of candidate models. 

With AutoML, data scientists can say goodbye to repetitive, tedious, time-consuming tasks. They can iterate faster and explore new approaches to what they’re modeling. The ease of use of AutoML allows more non-programmers and senior executives to get involved in conceiving and executing projects and experiments.

2. AutoML Can Have Big Financial Benefits

With automation comes acceleration. Acceleration can be monetized. 

Companies using AutoML have experienced increased revenue and savings from their use of the technology. A healthcare organization saved $2 million per year from reducing nursing hours and $10 million from reduced patient stays. A financial services firm saw revenue climb 1.5-4% by using AutoML to handle pricing optimization.

3. AutoML Improves AI Development Efforts

AutoML simplifies the process of choosing and optimizing the best algorithm for each machine learning model. The technology selects from a wide array of choices (e.g., decision trees, logistic regression, gradient boosted trees) and automatically optimizes the model. It then transfers data to each training algorithm to help determine the optimal architecture. Automating ML modeling also reduces the risk of human error.
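In spirit, the selection step works like the following minimal sketch, which cross-validates the candidate families named above and keeps the best performer. Real AutoML systems search far larger spaces and tune hyperparameters too; the dataset here is synthetic:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Candidate model families, as named in the text above
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(max_depth=5),
    "gradient_boosted_trees": GradientBoostingClassifier(),
}

# Score every candidate with 5-fold cross-validation and keep the winner
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best = max(scores, key=scores.get)
print(scores)
print(f"Selected model: {best}")
```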

One company reduced time-to-deployment of ML models by a factor of 10 over past projects. Others boosted lead scoring and prediction accuracy and reduced engineering time. Using ML models created with AutoML, customers have reduced customer churn, reduced inventory carryovers, improved email opening rates, and generated more revenue.

4. AutoML is Great at Many Use Cases

Use cases where AutoML excels include risk assessment in banking, financial services, and insurance; cybersecurity monitoring and testing; chatbot sentiment analysis; predictive analytics in marketing; content suggestions by entertainment firms; and inventory optimization in retail. AutoML is also being put to work in healthcare and research environments to analyze and develop actionable insights from large data sets.

AutoML is being used effectively to improve the accuracy and precision of fraud detection models. One large payments company improved the accuracy of their fraud detection model from 89% to 94.7% and created and deployed fraud models 6 times faster than before. Another company that connects retailers with manufacturers reduced false positive rates by 55% and sped up deployment of models from 3-4 weeks to 8 hours. 

A Booming Market for AutoML

The global AutoML market is booming, with revenue of $270 million in 2019 and predictions that the market will approach $15 billion by 2030, a CAGR of 44%. A report by P&S Intelligence summed up the primary areas of growth for the automation technology: “The major factors driving the market are the burgeoning requirement for efficient fraud detection solutions, soaring demand for personalized product recommendations, and increasing need for predictive lead scoring.”

Experts caution that AutoML is not going to replace data scientists any time soon. It is merely a powerful tool that accelerates their work and allows them to develop, test, and fine-tune their strategies. With AutoML, more people can participate in AI and ML projects, utilizing their understanding of their data and business and letting automation do much of the drudgery. 

The Easy Button

Whether you’re just getting started or you’ve been doing AI, ML and DL for some time, Dell Technologies can help you capitalize on the latest technological advances, making AI simpler and speeding time to insight with proven Validated Designs for AI.

Validated Designs for AI are jointly engineered and validated to make it quick and easy to deploy a hardware-software stack optimized to accelerate AI initiatives. These integrated solutions leverage H2O.ai for Automatic Machine Learning. NVIDIA AI Enterprise software can increase data scientist productivity, while VMware® vSphere with Tanzu simplifies IT operations. Customers report that Validated Designs enable 18–20% faster configuration and integration, save 12 employee hours a week with automated reconciliation feeds, and reduce support requirements by 25%.
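For a sense of what the H2O.ai piece can look like in practice, here is a minimal sketch using H2O’s open-source AutoML Python API; the file name and target column are placeholders:

```python
import h2o
from h2o.automl import H2OAutoML

h2o.init()  # start or connect to a local H2O cluster

# Placeholder dataset; any CSV with a categorical target column works
df = h2o.import_file("churn.csv")
df["churned"] = df["churned"].asfactor()   # treat the target as a class label
train, test = df.split_frame(ratios=[0.8], seed=42)

# Train up to 20 candidate models within a 15-minute budget
aml = H2OAutoML(max_models=20, max_runtime_secs=900, seed=1)
aml.train(y="churned", training_frame=train)

print(aml.leaderboard.head())              # ranked candidate models
print(aml.leader.model_performance(test))  # best model on held-out data
```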

Validated Designs for AI speed time to insight with automatic machine learning, MLOps and a comprehensive set of AI tools. Dell PowerScale storage improves AI model training accuracy with fast access to larger data sets, enabling AI at scale to drive real‑time, actionable responses. VxRail enables 44% faster deployment of new VMs, while Validated Designs enable 18x faster AI models.

You can confidently deploy an engineering‑tested AI solution backed by world‑class Dell Technologies Services and support for Dell Technologies and VMware solutions. Our worldwide Customer Solution Centers with AI Experience Zones enable you to leverage engineering expertise to test and optimize solutions for your environments. Our expert consulting services for AI help you plan, implement and optimize AI solutions, while more than 35,000 services experts can meet you where you are on your AI journey. 

AI for AI is here, making it easier and faster than ever to scale AI success. For more information, visit Dell Artificial Intelligence Solutions.  

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.

IT Leadership

Whether the cause is a disability, long illness, psychological challenges, or parental choice, many children find themselves facing the daunting task of acquiring an education at home.

During the global pandemic, homeschooling issues became a particular concern, as teachers and parents attempted to use their time and energy most productively and vulnerable students worried about falling behind, among other anxieties.

On a practical level, in some cases homeschooling can eat up 10% of a particular school’s budget even when fewer than 1% of the pupils require the service.

As the world shut down, international technology company NTT DATA Business Solutions attempted to remedy apprehensions by creating a digital teaching engine to help children learn, teachers teach, and parents homeschool.

Not only would the Artificial Intelligence (AI) Learning Helper assist students in improving reading and other skills, but the new platform would also meet the emotional needs of each child.

Human avatars

From its headquarters in Bielefeld, Germany, NTT DATA assists a variety of industries, including chemical, pharmaceutical, wholesale, and consumer products, always searching for new places to innovate.

The company also works closely with the Media Lab at the Massachusetts Institute of Technology (MIT), drawing research from such disciplines as technology, media, science, art, and design.

“The possibilities of artificial intelligence fascinate us,” noted Thomas Nørmark, NTT DATA’s global head of AI and robotics.

Previously, the company’s “digital human platform” allowed NTT DATA to develop avatars for receptionists, sales associates, shop floor assistants, and car vendors, among others.

Those experiences would prove invaluable in the development of the avatars that would both teach children and interact with them in a personalized way.

Emotional fluency

Turning to enterprise resource planning software leader SAP for its foundation, NTT DATA used a number of SAP solutions to bring the AI Learning Helper to life: SAP Data Intelligence for both AI and Machine Learning (ML), SAP Analytics Cloud to amass data about individual students’ learning progress, and SAP Conversational AI to manage the conversations between the Learning Helper and the students. 

These allowed the platform to utilize AI specialties like body language detection, emotional feedback, micro expression detection (which registers facial expressions that sometimes last only 1/30th of a second), and summarization algorithms to create new phrases that relay information in language every student could grasp.

Each screen would be the equivalent of a virtual buddy who could understand a pupil’s changing emotions, detecting whether he or she was frustrated or unmotivated, and patiently adjust to the situation.

At times, the Learning Helper would conclude, a child simply needed to engage in small talk for a few minutes before turning back to the lesson.

The innovation would provide much needed relief for parents and teachers who are not always able to exhibit the same type of sensitivity when they are dealing with so many other obligations.

Tracking for success

The app was deployed in January 2021, with the Danish municipality of Toender’s school district and a British hospital school becoming the first places to use the AI Learning Helper.

Students discovered that they could access the platform at any time on any device, and there was no limit on how long a session could last.

In addition to assisting pupils with vocabulary, pronunciation, and story comprehension, the avatar generated and answered questions.

Through classwork, as opposed to testing, the solution could track each child’s progress, communicating that information to parents and teachers, and come up with lessons tailored to areas where the student could improve.

Participating schools noted that estimated homeschooling costs decreased by 50%, leading to a 5% reduction in the overall budget.

For creating a personalized virtual helper to bring out student strengths and alleviate both loneliness and frustration, NTT DATA Business Solutions received a 2022 SAP Innovation Award – part of an annual ceremony rewarding organizations using SAP technologies to improve the world.

You can read all about what NTT DATA did to win this coveted award, and how, in their Innovation Awards pitch deck.

As developing nations gain greater access to the Internet and education becomes more democratized, the company plans to use the AI Learning Helper to teach thousands more.

Artificial Intelligence, Machine Learning