With headquarters in Boston and over 2,700 employees worldwide, Novanta is an $800 million global supplier of laser photonics, precision motion control, and vision technologies. CIO Sarah Betadam, who joined in 2019 as VP of business applications and became global CIO in January 2021, is tasked with the strategic direction, leadership, and implementation of the company’s digital transformation, juggling several initiatives simultaneously, many of which center on becoming a fully functional data-driven enterprise.

“My team and I are very proud of our transformation that started in 2019,” she says. “When I joined, there was a lot of siloed data throughout the organization, and everyone was doing their own reporting. So in monthly or quarterly combined meetings, apples weren’t being compared to apples. It was also a lot of churn for the different groups to come up with that data on a weekly, monthly, and quarterly basis.”

So from the business side, there was a lot of inefficiency in getting data to the point where it was presentable to different audiences, which in its own right was a big business problem for Betadam. But where to begin?

“We started from a focused business case by partnering with three different groups to showcase how centralization of data can be efficient, helpful, and a good roadmap for the company,” she says. “You have to build trust with stakeholders, and really prove you can help them help themselves. That’s the first level of a cultural shift. It took us about six months to do the proof of concept for three different business units, but it was highly successful. In fact, the ROI was so high, we gained the trust of our executives to invest in a platform to begin centralizing data.”

CIO contributing editor Julia King recently spoke with Betadam about Novanta’s unified shift from its fractured reporting culture to a more efficient data-driven organization. Here are some edited excerpts of that conversation. Watch the full video below for more insights.

On investing in capabilities: We’ve set up something called a BI Center of Excellence, where we run monthly workshops and seminars that team members across Novanta can join to learn how they can leverage data marts and data sources to build their own reporting. We also have a visualization layer that we teach different groups within our organization to use. It’s evolved over the past four years from having nothing but siloed spreadsheets, with everyone doing their own thing, to being centralized around KPIs and trust in the data they receive. They’re learning how to visualize data on their own, so they don’t really need IT, other than the data marts, to build their own dashboards.

On a positive mentality: Transformations aren’t just technology driven; they’re people and process driven. And change doesn’t come easy no matter which organization you’re at. So when you talk about data, there’s a lot of change that can happen, from the way people work to how they manage data, and how you report and make decisions based on that data. That’s integral for any business. When you’re making proposals, if the answer is no, don’t be discouraged; go back and try to understand why it was a no. Keep at it, because I’ve heard no throughout my career. If you’re a firm believer in something that will have a huge impact on your business, you just have to have the tenacity to go after it, understand it, and try to explain it and educate people on it to create momentum. Once you prove that, the rest is history.

On BI maturity: When it comes to reporting and analytics or BI, in order to gain the trust of team members, you have to educate them and let them know that simply reporting data and having access to it from a centralized view doesn’t mean your data is accurate, because if you don’t input the data correctly, you get garbage in, garbage out. Part of educating team members, when we’re doing proofs of concept, is about not expecting a miracle. We have a multitude of ERP systems to map, as well as data sources. We can do all that mapping and validation with you, but if the underlying data isn’t accurate, that has nothing to do with the mechanism that provides it. It’s the clean-up effort. It’s about being transparent and educating your business on what the BI tool can be expected to deliver.

On data governance: We have 17 different ERP systems, and Novanta is a very acquisitive company, so it’s an ongoing challenge. My team is familiar with the back-end technologies of the mainstream ERPs. But if we come across an ERP that’s not mainstream, they’ll have challenges getting into the back end, integrating it, and understanding the relational data to connect it to our central data lake. That’s going to be an ongoing technical risk we’ll have to overcome. What we understood in 2019 was that when people don’t see what they’re inputting, they introduce different entry variations; just think of how many ways there are to say “United States.” But through some key business cases, and now across the entire group, discrepancies from data duplication are visible, and that visibility has spun off a data governance movement across Novanta, along with the BI platform and reporting. It’s a work in progress. We’re discovering more, but it definitely helps to have the visibility and data governance to clean the data and integrate the data mapping, which helps the BI team publish data marts.
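The “United States” example points to a simple, common remedy: normalize free-text entries before they land in a central data mart so duplicates become visible. Below is a minimal sketch of that idea in Python; the alias table, field names, and records are purely hypothetical and not Novanta’s actual pipeline.

```python
# Minimal sketch: normalize free-text country entries so duplicate variants become visible.
# The alias map and sample records are hypothetical, not Novanta's actual data.

COUNTRY_ALIASES = {
    "usa": "United States",
    "us": "United States",
    "u.s.": "United States",
    "u.s.a.": "United States",
    "united states of america": "United States",
    "united states": "United States",
}

def normalize_country(raw: str) -> str:
    """Return the canonical country name for a known variant, else the trimmed input."""
    key = raw.strip().lower()
    return COUNTRY_ALIASES.get(key, raw.strip())

records = [
    {"customer": "Acme", "country": "USA"},
    {"customer": "Acme", "country": "United States of America"},
]

for record in records:
    record["country"] = normalize_country(record["country"])

# Both rows now carry the same country value, so the duplication shows up
# in governance reviews instead of hiding behind spelling differences.
print({r["country"] for r in records})  # {'United States'}
```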


In today’s challenging economy, customer expectations are high, patience is low, and attention is at a premium. Your customers demand a seamless experience with your products and services, with easy access to detailed, helpful self-service support options. So how do you stay ahead of ever-increasing customer demands? Data. Harnessing numerous customer data points, often scattered across multiple departments, is the key to unlocking a proactive approach to customer satisfaction (and growth).

So your customer success organization is more integral to your brand than ever. And its job is significantly more complex, too. Your customer success team is tasked with ensuring that your customers have everything they need when they need it. And they must also offer a personalized experience that leads to increased product or service adoption and revenue growth.

Essentially, they need to act as a growth engine for your organization. So, instead of simply responding to customer requests, your teams should be proactive and prescriptive. Anticipate your customers’ needs, impressing and delighting them at every turn. The key to this transformation lies in intelligently using the data you’re already collecting.

Predictive insights. Self-service experiences. Highly satisfied customers.

Unifying data in the cloud to visualize it, analyze it, and apply tools like machine learning allows you to unlock new customer insights. Predict when they’ll need support. Better understand when they’re most likely to drop out of your lifecycle. Recognize when they’re most apt to increase their investment. And, of course, do all this while carefully respecting privacy and adhering to laws regulating the use of data.

Armed with this information, and the right tech platform to glean insights from it, your teams can digitally engage with your customers at the right time with relevant content.

Like any business initiative, scalability is critical. You likely don’t have the workforce to connect with every customer personally. In an already overcrowded digital communication landscape, you’ll achieve greater success by putting the power back in your customers’ hands. Offer the self-service options they want, powered by elegant search experiences that deliver fast access to the information they need.

3 key customer experience drivers

There are three initiatives your customer success organization can implement now to ensure it proactively engages with your customers, offers a self-service experience, and generates continued and repeat business:

1. Ensure that a customer-first approach is baked into your organization’s DNA.

To position your customer success team as a growth engine, you must have alignment with sales, marketing, product, and other parts of your business.

Make sure everyone in your organization is on the same page about your data collection efforts. And, most importantly, evangelize how all your teams can use that data to set customers up for success and help them grow long-term relationships with your organization.

2. Identify and fill gaps.

What KPIs are important to your customer success team? Are you collecting the right data to report on them?

Ask the right questions of your data based on your KPIs, and you’re likely to uncover gaps or attrition points and identify ways to resolve them. Maybe your customers aren’t receiving enough training or information. Or your team is reaching out to them at the wrong times. Or not at all. When you understand the critical gaps, you can fill them to ensure a smooth road to customer loyalty.

3. Invest in documentation and metadata.

Your customers need to be able to search for, and quickly and easily find, tools and resources. Your metadata tagging strategy is vital to ensuring they can.

Many companies simply tag their content with internal or company-driven terms, but incorporating the language your customers use to search for information will help them find it faster. Continue to analyze your data over time to see if you’re missing additional content your customers need.
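One lightweight way to put that advice into practice is to keep a mapping from internal terms to the phrases customers actually type, and expand content tags with it before indexing. The sketch below is illustrative only; the synonym map, tags, and document are hypothetical and not tied to any particular search product.

```python
# Minimal sketch: expand internal content tags with customer-language synonyms
# so self-service search matches the words customers actually use.
# The synonym map and document below are hypothetical examples.

CUSTOMER_SYNONYMS = {
    "authentication": ["login", "sign in", "can't log in"],
    "provisioning": ["setup", "getting started"],
}

def expand_tags(internal_tags: list[str]) -> list[str]:
    """Return the internal tags plus any customer-language synonyms mapped to them."""
    expanded = list(internal_tags)
    for tag in internal_tags:
        expanded.extend(CUSTOMER_SYNONYMS.get(tag, []))
    return expanded

doc = {"title": "Configuring SSO", "tags": ["authentication", "provisioning"]}
doc["tags"] = expand_tags(doc["tags"])
print(doc["tags"])
# ['authentication', 'provisioning', 'login', 'sign in', "can't log in", 'setup', 'getting started']
```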

Your customers are at the heart of your organization’s success. And your data is what keeps it beating. When you leverage it strategically to delight your customers, you cultivate loyal customers who are eager to increase their investment in your products or services — a real win-win!

See how Elasticsearch helps foster a culture of customer success.

Rick Laner is the Chief Customer Officer at Elastic.


In today’s data-driven world, many organizations face major hurdles as they navigate a transformation journey that eliminates silos, unifies data, and transforms it into value. For many, building a culture of innovation remains elusive.

IDC’s Future of Intelligence predictions for 2023 show what’s possible when businesses get it right. Top-quartile enterprise intelligence performers are 2.7 times more likely to have experienced strong revenue growth over 2020–2022 and 3.6 times more likely to have accelerated their time to market for new products, services, experiences, and other initiatives.

But businesses must avoid trying to solve every problem and instead narrow focus to areas like customer experience or productivity, says Vrinda Khurjekar, Senior Director, AMER business at Searce.

Companies that commit to clear and focused action now will achieve a competitive advantage. Searce’s work in a range of industries indicates that successful companies embrace specific strategies and tools based on an intelligent data environment as they focus on building a culture of innovation. With data at the core of business strategy, operations become more seamless.

“If you don’t embrace data capabilities to help drive your culture, you’re definitely going to get disrupted,” says Khurjekar. “We’re seeing it across every industry. It’s no longer a ‘good to have’.”

Khurjekar cites several barriers that enterprises continue to face. They often have patchwork and duplicative software and solutions. They want a better customer experience and to increase productivity, but there’s a lack of clarity as to what that means.

Finally, companies need talent to execute; however, traditional enterprises continue to struggle with attracting top talent in a challenging job market.

To bring about culture change, Khurjekar says, organizations should pick one or two use cases to start with.

Customer lens

“If you’re trying to do too many things at the same time, the overall initiative gets lost,” she says.

Choose initiatives by looking through a customer lens, she adds, and collaborate “ruthlessly” with multiple departments. Employees, too, need to see success, and analytics need to be used to help them, not just customers.

Leaders that want to create a culture of innovation should start by looking at their current processes with a critical eye. If processes are not helping the company to get ahead, then a change is needed.

“Even before we get to the technological challenges, it’s about the willingness to really experiment and to disrupt yourself,” says Khurjekar. “You need to have that courage as an organization to do that.”

Next, it’s crucial to embrace data capabilities to help drive that innovative culture, including an intelligent data cloud that’s agile, discoverable, intelligent, trusted, and open. An intelligent data environment gives enterprises capability and scalability from an infrastructure standpoint to solve business problems more seamlessly.

“Cloud offers a good home to be able to experiment fast and also to put use cases into production quickly,” Khurjekar says. For example, a transportation company that needs to better secure its warehouses can’t wait several years to do so. It needs to be able to quickly connect to cameras already installed for better monitoring and alerting.

Cloud enables intelligent data-driven insights and allows businesses to do quick proofs of concepts and move into production fast, she says.

She adds: “As an organization, if you say, no, I’m going to build on my own, you’re missing out.”

Cloud removes the overhead so the business can focus on solving the problem at hand.

With its data and analytics services, applied AI service, and secure API product Recognic.AI, Searce helps support clients across the entire spectrum of their data journey. Organizations that take steps toward developing a data-driven culture appreciate that this is an ongoing, long-term process.

“It is truly a journey and not a one-time thing,” says Khurjekar.

To learn more about how Searce can help, click here.


Data is what drives digital business. Consider how strategically important it has become for companies to leverage advanced analytics to uncover trends that can help them gain decisive insights they might not otherwise possess.

But data-driven projects are not always easy to launch, let alone complete. In fact, enterprises face several challenges as they look to leverage their information resources to gain a competitive advantage.

Foundry’s recent Data & Analytics Study looked into why organizations have difficulty making good on the promise of data-driven projects, and revealed several key roadblocks to success. Here are the top six reasons data initiatives fail to materialize and deliver, as revealed by the research, along with tips from IT leaders and data experts on how to overcome them.

1. Lack of funding for data initiatives

Funding can be hard to come by for any technology initiatives, particularly in an uncertain economy. This certainly applies to data projects. These undertakings might be competing with a host of other initiatives in need of financing, so it’s important for IT leaders and their data teams to present a strong business case for each project, and to not make them overly complex.

“While budget is always tricky, this is a question of priorities and right-sizing the body of work,” says Craig Susen, CTO and technology enablement lead at management consulting firm Unify Consulting. “Looking for obvious outcomes [does not] always require reworking the entire infrastructure.”

Being data-driven is as much a cultural pursuit as it is anything else, Susen says. “It requires designing/rethinking key performance indicators, capturing data in a smart timely manner, landing it in common areas quickly,” he says. “Then it can be evaluated and aggregated, either applying advanced visualization technologies or working it against machine learning algorithms. It’s all a complicated bit of science. Having said that, many companies overcomplicate this process by trying to do too much all at once or over-indexing in places that don’t drive true value to their businesses and customers.”

CIOs and other technology leaders need to develop strong working relationships with fellow C-suite members, particularly CFOs. In many cases it’s the finance executive who makes the decision on budget approvals, so to improve the likelihood of getting the needed funding, technology chiefs need to be able to demonstrate why data-driven projects are important to the bottom line.

2. Lack of a clearly articulated data strategy

Lacking a complete data strategy to guide data-driven projects “is like not having an outline to guide a thesis,” says Charles Link, senior director of data and analytics at Covanta, a provider of sustainable materials management and environmental solutions.

“Every project should contribute some paving stones to the road leading to the desired destination,” Link says. “A data strategy identifies how to align information and technology to help you get there. Your business should be able to travel down the road as you deliver value.”

To be successful, a data strategy should have both a data management component — generally IT tools, technologies, and methods — and a data use strategy, Link says.

Oftentimes there isn’t a clear understanding within enterprises of what data is available, how the data is defined, how frequently it changes, and how it is being used, says Mike Clifton, executive vice president and chief information and digital officer at Alorica, a global customer service outsourcing firm.

Companies need to create a common language among stakeholders in advance of establishing any data-driven projects, Clifton says. “If you don’t have a solid foundation, budget and funding are too unpredictable and often get cut first due to a lack of clear scope and achievable outcome,” he says.

3. Technology to implement data projects is too costly

Making the challenge of getting sufficient funding for data projects even more daunting is the fact that they can be expensive endeavors. Data-driven projects require a substantial investment of resources and budget from inception, Clifton says.

“They are generally long-term projects that can’t be applied as a quick fix to address urgent priorities,” Clifton says. “Many decision makers don’t fully understand how they work or deliver for the business. The complex nature of gathering data to use it efficiently to deliver clear [return on investment] is often intimidating to businesses because one mistake can exponentially drive costs.”

When done correctly, however, these projects can streamline and save the organization time and money over the long haul, Clifton says. “That’s why it is essential to have a clear strategy for maximizing data and then ensuring that key stakeholders understand the plan and execution,” he says.

In addition to investing in the tools needed to support data-driven projects, organizations need to recruit and retain professionals such as data scientists. These in-demand positions typically command high levels of compensation.

4. Other digital transformation initiatives took priority

Digital transformations are under way at organizations in virtually every industry, and it’s easy to see how projects related to these efforts could be given a high priority. That doesn’t mean data-driven projects should be put on the back burner.

“If digital transformation efforts are taking priority over data initiatives, then you need to re-evaluate,” Link says. “All digital transformation initiatives should envelop data initiatives. You cannot have one without the other.”

Ignoring the data aspects of transformation could invite failure of other initiatives. “I would be concerned to pursue digital transformation without a solid data strategy, as the results, iterations, and pivots needed to be successful should all be data-driven decisions,” says David Smith, vice president and CIO at moving and logistics company Atlas Van Lines.

“If this is an organizational roadblock, I would recommend using the digital transformation initiative as the genesis of a data strategy execution,” Smith says.

5. Lack of executive buy-in or advocacy for data initiatives

If senior executives are not sold on data-driven projects, their chance of success will likely diminish because of lack of adequate funding and resources.

“Lack of buy-in from the top can kill a data-driven project before it starts,” says Scott duFour, global CIO at Fleetcor, a provider of business payments services. “I am fortunate that isn’t a problem at Fleetcor, as I get buy-in for projects from our CEO by partnering with leadership running lines of business to validate the importance of big data for company growth and success.”

To get executive buy-in, technology leaders must be able to articulate from the beginning what the outcomes of data projects will be and align them to business priorities or pain points, Clifton says. Ironically, all digital-related deployments depend heavily on data to achieve benefits, “so whether or not the executives realize it, they are funding data initiatives,” he says.

The organization’s data strategy should inform executives about how data projects can support the goals of the business. “The data initiatives should focus on the accomplishment of those objectives through actionable intelligence and automation,” Link says.

In some cases, the lack of support might stem from the fact that business leaders do not really know what they want from data projects, and therefore do not understand the value, Smith says. “If they cannot see the value, then they won’t support it,” he says.

It’s a good practice to use small proof of concept opportunities to show the value through operational dashboards or the automation of manual tasks, Smith says. “This will create interest from the executive team,” he says.

6. Lack of appropriate skill sets

The technology skills shortage is affecting nearly every area of IT, including data-driven projects.

“Without enough IT talent and people with the right skill sets, it’s tough to get data-driven projects done,” duFour says. “And the IT employee shortage is real in several areas of IT.” To attract technology workers, Fleetcor offers flexible working arrangements and provides training so employees can improve their skills.

“We have also cast a wider net in the talent search,” duFour says. “Although a four-year degree or more is ideal, companies should look for potential employees with associate degrees, IT-type certifications, and other pertinent skills that can help move data-driven projects forward.”

Hiring talent with the specific technical experience needed to lead and manage data-driven projects “is a challenge in this competitive job market, but it’s key in ensuring you have the right skills in place to successfully implement the projects,” Clifton says. “Without the right skills and expertise up front, companies can start a project and then run into issues where the team is unable to quickly and effectively identify and resolve the problem.”

Data scientists, data stewards, and data forensics experts are becoming mainstay roles, Clifton says, whereas data architects were the higher-end skills most needed in prior years.

“Affordable talent has been my biggest challenge,” Link says. “There is no one right answer. I have brought in fresh talent from recent graduates and invested time, only to have them poached at crazy salaries. In my experience, there is a lot of value in having people co-located for faster learning and collaboration. My latest approach is to work with organizations like Workforce Opportunity Services to build my own team from high-caliber workers. It will take time to get there but we are focused on the long-term results.”


Every organization pursuing digital transformation needs to optimize IT from edge to cloud to move faster and speed time to innovation. But the devil’s in the details. Each proposed IT infrastructure purchase presents decision-makers with difficult questions. What’s the right infrastructure configuration to meet our service level agreements (SLAs)? Where should we modernize — on-premises or in the cloud? And how do we demonstrate ROI in order to proceed?

There are no easy, straightforward answers. Every organization is at a different stage in the transformation journey, and each one faces unique challenges. The conventional approach to IT purchasing decisions has been overwhelmingly manual: looking through spreadsheets, applying heuristics, and trying to understand all the complex dependencies of workloads on underlying infrastructure.

Partners and sellers are similarly constrained. They must provide a unique solution for each customer with little to no visibility into a prospect’s IT environment. This has created an IT infrastructure planning and buying process that is inaccurate, time-consuming, wasteful, and inherently risky from the perspective of meeting SLAs.

Smarter solutions make for smarter IT decisions

It’s time to discard legacy processes and reinvent IT procurement with a new approach that leverages the power of data-driven insights. For IT decision makers and their partners and sellers, a modern approach involves three essential steps to optimize procurement — and accelerate digital transformation:

1. Understand your VM needs

Before investing in infrastructure modernization, it’s critical to get a handle on your current workloads. After all, you must have a clear understanding of what you already have before deciding on what you need. To reach that understanding, enterprises, partners, and sellers should be able to collect and analyze fine-grained resource utilization data per virtual machine (VM) — and then leverage those insights to precisely determine the resources each VM needs to perform its job.

Why is this so important? VM admins often select from a menu of different sized VM templates when they provision a workload. They typically do so without access to data — which can lead to slowed performance due to under-provisioning, or oversubscribed VMs if they choose an oversized template. It’s essential to right-size your infrastructure plan before proceeding.
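To make the right-sizing idea concrete, here is a minimal sketch that derives a recommendation from observed utilization samples. The percentile choice, headroom factor, and sample numbers are illustrative assumptions, not how HPE CloudPhysics actually computes its recommendations.

```python
# Minimal sketch: recommend VM resources from observed utilization instead of template guesswork.
# The 95th-percentile rule, 20% headroom, and sample data are illustrative assumptions.
import math

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile of a list of samples."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def right_size(cpu_util_pct: list[float], mem_used_gb: list[float],
               provisioned_vcpus: int, headroom: float = 1.2) -> dict:
    """Size vCPUs and memory to the 95th-percentile demand plus headroom."""
    p95_cpu_cores = percentile(cpu_util_pct, 95) / 100 * provisioned_vcpus
    p95_mem = percentile(mem_used_gb, 95)
    return {
        "vcpus": max(1, math.ceil(p95_cpu_cores * headroom)),
        "memory_gb": math.ceil(p95_mem * headroom),
    }

# A VM provisioned with 8 vCPUs that rarely exceeds 30% CPU and 6 GB of memory
# gets recommended a much smaller template: {'vcpus': 3, 'memory_gb': 8}.
print(right_size(cpu_util_pct=[12, 25, 30, 28, 18],
                 mem_used_gb=[4.5, 5.8, 6.0, 5.2, 4.9],
                 provisioned_vcpus=8))
```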

2. Model and price infrastructure with accuracy

Any infrastructure purchase requires a budget, or at least an understanding of how much money you intend to spend. To build that budget, an ideal IT procurement solution provides an overview of your inventory, including aggregate information on storage, compute, virtual resource allocation, and configuration details. It would also provide a simulator for on-premises IT that includes the ability to input your actual costs of storage, hosts, and memory. Bonus points for the ability to customize your estimate with depreciation term, as well as options for third-party licensing and hypervisor and environmental costs.

Taken together, these capabilities will tell you how much money you’re spending to meet your needs — and help you to avoid overpaying for infrastructure.

3. Optimize workloads across public and private clouds

Many IT decision makers wonder about the true cost of running particular applications in the public cloud versus keeping them on-premises. Public cloud costs often start out attractively low but can increase precipitously as usage and data volumes grow. As a result, it’s vital to have a clear understanding of cost before deciding where workloads will live. A complete cost estimate involves identifying the ideal configurations for compute, memory, storage, and network when moving apps and data to the cloud.

To do this, your organization and your partners and sellers need a procurement solution that can map their entire infrastructure against current pricing and configuration options from leading cloud providers. This enables you to make quick, easy, data-driven decisions about the costs of running applications in the cloud based on the actual resource needs of your VMs.

And, since you’ve already right-sized your infrastructure (step 1), you won’t have to worry about moving idle resources to the cloud and paying for capacity you don’t need.
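As a rough sketch of that mapping step, the snippet below takes the right-sized needs from step 1 and picks the cheapest catalog entry that covers them. The instance names and hourly prices are made-up placeholders, not real provider pricing or HPE’s actual model.

```python
# Minimal sketch: map a right-sized VM to the cheapest cloud instance that fits it.
# Instance names and prices are hypothetical placeholders, not real provider pricing.

CLOUD_CATALOG = [
    {"name": "small-2x8",   "vcpus": 2, "memory_gb": 8,  "usd_per_hour": 0.10},
    {"name": "medium-4x16", "vcpus": 4, "memory_gb": 16, "usd_per_hour": 0.19},
    {"name": "large-8x32",  "vcpus": 8, "memory_gb": 32, "usd_per_hour": 0.38},
]

def cheapest_fit(vcpus_needed: int, memory_gb_needed: int) -> dict:
    """Return the lowest-cost catalog entry that satisfies the workload's resource needs."""
    candidates = [inst for inst in CLOUD_CATALOG
                  if inst["vcpus"] >= vcpus_needed and inst["memory_gb"] >= memory_gb_needed]
    if not candidates:
        raise ValueError("No catalog entry is large enough for this workload")
    return min(candidates, key=lambda inst: inst["usd_per_hour"])

choice = cheapest_fit(vcpus_needed=3, memory_gb_needed=8)
monthly_cloud_cost = choice["usd_per_hour"] * 730  # roughly 730 hours per month
print(choice["name"], round(monthly_cloud_cost, 2))  # medium-4x16 138.7
```

Comparing that monthly figure against the per-VM share of on-premises costs from step 2 gives a data-driven basis for the placement decision.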

HPE leads the way in modern IT procurement

HPE has transformed the IT purchasing experience with a simple procurement solution delivered as a service: HPE CloudPhysics. Part of the HPE GreenLake edge-to-cloud platform, HPE CloudPhysics continuously monitors and analyzes your IT infrastructure, models that infrastructure as a virtual environment, and provides cost estimates of cloud migrations. Since it’s SaaS, there’s no hardware or software to deal with — and no future maintenance.

HPE CloudPhysics is powered by some of the most granular data capture in the industry, with over 200 metrics for VMs, hosts, data stores, and networks. With insights and visibility from HPE CloudPhysics, you and your sellers and partners can seamlessly collaborate to right-size infrastructure, optimize application workload placement, and lower costs. Installation takes just minutes, with insights generated in as little as 15 minutes.

Across industries, HPE CloudPhysics has already collected more than 200 trillion data samples from more than one million VM instances worldwide. With well over 4,500 infrastructure assessments completed, HPE CloudPhysics already has a proven record of significantly increasing the ROI of infrastructure investments.

This is the kind of game-changing solution you’re going to need to transform your planning and purchasing experience — and power your digital transformation.

____________________________________

About Jenna Colleran

HPE

Jenna Colleran is a Worldwide Product Marketing Manager at HPE. With over six years in the storage industry, Jenna has worked in primary storage and cloud storage, most recently in cloud data and infrastructure services. She holds a Bachelor of Arts degree from the University of Connecticut.


Data intelligence helps organizations create new customer experiences, accelerate operations and capitalize on new market opportunities. It also gives them the agility to pivot when the unexpected strikes. 

While becoming data-driven makes great sense, some organizations struggle to put it into practice. Adopting a company-wide data culture can be particularly challenging, requiring companies to overhaul how they operate from the board room to the shop floor. On the technical side, IT teams must liberate and unify data long segregated in departmental silos so IT can provide broad accessibility and comprehensive analytics. 

Some businesses have stumbled in their data transformation initiatives, causing them to question whether the effort is worth it. Recent research would suggest that it is worth it: IDC estimates that organizations that excel at leveraging data in decision-making enjoy more than three times greater revenue and nearly two-and-a-half-times greater profit than those who don’t.

Success Factors

Generally, organizations that succeed with data-driven initiatives have built and executed a cohesive data strategy. That strategy is reflected in how the business operates, which requires buy-in at executive levels and buy-in across departments and business units. The goal now is to have more data to base decisions on for greater accuracy, rather than relying solely on collective experience.

Executing the strategy necessitates aggregating data and expanding access company-wide for analytics to expose bigger-picture insights and trends. The combined data serves as a single source of truth for creating corporate value that runs the gamut from gauging customer sentiment to troubleshooting IT snafus to averting supply-chain delays. 

The Role of the Data Lakehouse

A data lakehouse is one solution that organizations can use to help break down data silos and use as a foundation for an intelligent data ecosystem. It combines a data lake’s ability to store data in any format—structured, semi-structured, and unstructured—with the performance, security, and governance strengths of a traditional data warehouse. 

The lakehouse provides central access to data as-is, independent of format. Organizations can run a variety of analytics on it to improve decision-making, from dashboards and visualizations to real-time analytics and machine learning. Open interfaces enable data scientists, business analysts, and others to use their favorite analytics tools to access and analyze lakehouse data.
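As a rough illustration of analyzing data as-is, independent of format, the sketch below reads a structured CSV file and a semi-structured JSON Lines file from the same storage location and analyzes them together. The file paths and fields are hypothetical, and a production lakehouse layers a table format, governance, and a query engine on top of this idea.

```python
# Minimal sketch: analyze structured (CSV) and semi-structured (JSON Lines) data side by side
# from one storage location. Paths and fields are hypothetical placeholders.
import json
import pandas as pd

# Structured data: order records.
orders = pd.read_csv("lake/orders.csv")  # columns: order_id, customer_id, amount

# Semi-structured data: support events, one JSON object per line.
with open("lake/support_events.jsonl") as f:
    events = pd.DataFrame([json.loads(line) for line in f])  # keys: customer_id, severity

# One analysis over both: revenue next to high-severity tickets per customer.
revenue = orders.groupby("customer_id")["amount"].sum().rename("revenue").to_frame()
tickets = (events[events["severity"] == "high"]
           .groupby("customer_id").size().rename("high_severity_tickets"))
print(revenue.join(tickets, how="left").fillna(0))
```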

Building a Modern Data Ecosystem

While becoming more data-driven is a formidable undertaking, it’s becoming a competitive requirement in the digital economy. The same IDC report found that more than 84% of teams that excel at using data in decision-making get answers in minutes or hours, compared to only 3% of companies that don’t.

Culturally, companies need to commit to using data to drive decisions. At the IT level, they require unified data infrastructures, built around a data lakehouse, with the following characteristics:

Affordable, scalable, and reliable storage
Security and governance
A holistic, integrated view of company-wide data
Broad analytics capabilities, including the ability to process streaming data in real time and machine learning
Automation, including continuous learning and adaptability
An online data catalog

With the Dell Validated Design for Analytics – Data Lakehouse, organizations can stop chasing data and start using it to create value for the organization instead. 

Want to learn more about data lakehouses and how to succeed with your data-driven initiative? Read the Dell Validated Design for Analytics – Data Lakehouse Solution Brief

***

Intel® Technologies Move Analytics Forward

Data analytics is the key to unlocking the most value you can extract from data across your organization. To create a productive, cost-effective analytics strategy that gets results, you need high performance hardware that’s optimized to work with the software you use.

Modern data analytics spans a range of technologies, from dedicated analytics platforms and databases to deep learning and artificial intelligence (AI). Just starting out with analytics? Ready to evolve your analytics strategy or improve your data quality? There’s always room to grow, and Intel is ready to help. With a deep ecosystem of analytics technologies and partners, Intel accelerates the efforts of data scientists, analysts, and developers in every industry. Find out more about Intel advanced analytics.


The Keys to Becoming a Data-Driven Organization

Only 26.5% of organizations say they’ve reached their goals of becoming a data-driven organization, according to NewVantage Partners’ Data and AI Leadership Executive Survey 2022. Astonishingly, this leaves three-quarters of those surveyed indicating they’ve not met their goals in this area.

Fortunately, there are some bright points on the horizon. A recent Gartner survey says 78% of CFOs will increase or maintain enterprise digital investments. And Gartner forecasts worldwide IT spending will grow 3% in 2022. Further, a Gartner CDO survey indicated top-performing CDOs are significantly more likely to have projects with the CEO, and they engage in value delivery rather than enablement.

While this is great progress, perhaps one of the most important points of agreement is how to balance business value creation with risk and compliance mandates.

Business value creation vs. risk, security, privacy, and compliance

For many organizations, the conversation about balancing business value and security is cast through industry regulations. But it remains important for organizations to truly understand and agree on how and where they define their stance.

The crux is there’s no one size that fits all. But there are universal ways to mitigate risks and meet compliance mandates. For instance, if the use of PII data in certain analytical scenarios isn’t allowed, that doesn’t imply you should scrap the analytical project. You can mask or remove PII-related information and continue with your analytical projects.
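As an illustration, a minimal sketch of that masking step might look like the following, where PII columns are replaced with salted one-way hashes before the data reaches the analytics team. The column names, salt handling, and hashing choice are simplifying assumptions for illustration, not a specific compliance prescription.

```python
# Minimal sketch: mask PII columns so an analytics project can proceed without exposing them.
# Column names, the hard-coded salt, and the truncation length are illustrative assumptions.
import hashlib

PII_COLUMNS = {"email", "ssn", "phone"}

def mask_value(value: str, salt: str = "rotate-me") -> str:
    """Replace a PII value with a salted one-way hash; joins still work, identity doesn't leak."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with PII columns masked and all other columns untouched."""
    return {k: mask_value(v) if k in PII_COLUMNS else v for k, v in record.items()}

row = {"customer_id": 42, "email": "pat@example.com", "region": "EMEA", "lifetime_value": 1875.0}
print(mask_record(row))
# {'customer_id': 42, 'email': '<12-char hash>', 'region': 'EMEA', 'lifetime_value': 1875.0}
```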

Defining the value created from data is fairly nuanced. Many companies struggle to agree on a unified lens through which to view their data’s value. To simplify, your organization can view your data through the filter of four categories:

Direct attributed revenue
Indirect attributed revenue
Cost savings and optimization
Risk and compliance failure avoidance

Balance data democratization and security at scale

Once your organization has defined guidelines and policies on how to treat regulated data, the biggest challenge is to enforce those policies at scale. A comprehensive data security and access governance framework can go a long way to help you frame your approach.

Perimeter-based security: In an on-premises world, your network is the gateway to the kingdom. If you lock that down, you may have the pretense of safety from the outside world. Internally, though, there’s still full access. The challenge is even larger in the cloud.

Application security: The next level of defense is to provide authentication for accessing applications. In this model, getting access to the network only gets you so far unless your credentials allow you access to the application you’re trying to use.

Data security: The last mile of your defense is data security. Even if someone gets through the other security layers, you can still control access here. Privacy is defined at the data level, so only authorized data is visible. Making sure fine-grained data access policies, as well as data masking and encryption, are applied down to the file, column, row, and cell is one of the most powerful ways to strengthen your security posture.

Enforcing these security protocols across your entire data estate is a massive challenge. And you can’t scale if enforcement happens in a siloed, piecemeal fashion.

Universal data security platform

One of the emerging patterns for modern data infrastructure is that data governance and security processes need to become a horizontal competency across your entire data estate. This requires one of the most important C-suite dialogues between the CISO, CDAO, and CIO, since co-ownership across these groups is essential.

Universal data security platforms provide a central policy control plane that natively synchronizes policies into each distinct data service. Policies are created once and deployed everywhere. In addition, it provides a single view into your data estate, sensitive data locations, policies applied, and access events. Privacera works with Fortune 100 and 500 companies to reach their data security goals, including federal agencies and myriad types of enterprises across sectors. For example, Sun Life Financial teamed up with Privacera to secure and streamline their cloud-migration process, while seamlessly leveraging existing investments thanks to the open-standards framework. For more information on how to start or continue your data security journey, contact Privacera’s Center of Excellence.


To be a truly data-driven enterprise, organizations today must go beyond merely analyzing data. Rather, business experts and IT leaders must transform relevant data into compelling stories that key stakeholders can readily comprehend — and leverage to make better business decisions.

This vital skill is known as data storytelling, and it is a key factor for organizations looking to surface actionable information from their data, without getting lost in the sea of charts and numbers typical of traditional data reporting.

Following is a look at what data storytelling entails and how IT and analytics leaders can put it to work to make good on data’s decision-making potential.

What is data storytelling?

Data storytelling is a method for conveying data-driven insights using narratives and visualizations that engage audiences and help them better understand key conclusions and trends.

But that’s often easier said than done.

“Telling stories with data can be difficult,” says Kathy Rudy, chief data and analytics officer at global technology research and advisory firm ISG.

For Rudy, data storytelling begins with knowing your audience.

“Remember to start with who your main characters are, that is, the audience for your data story. What information is most important to them? Structure your data story so you anticipate the next question the audience will have by thinking like the reader of the story,” says Rudy, adding that, in her 20 years in benchmarking and data analytics, she has had to learn to tell a clear and concise story using data to validate ISG’s recommendations.

The first hurdle most data storytellers face is gaining acceptance for the validity of the data they present, she says. The best way to do this is to hold data validation and understanding sessions to get the question of data validity out of the way.

The goal of the data storyteller is to clear up all questions as to the source of the data, the age of the data, and so on, so that in subsequent views of the data, the storyteller isn’t continually defending the data, Rudy says.

“Don’t get overly technical or you will lose the audience,” she advises. “In the case of IT benchmarking, they don’t want to know about the technology stack, just that the data is relevant, secure, current, comparable, and accurate.”

Elements of data storytelling

Data storytelling consists of data visualization, narrative, and context, says Peter Krensky, a director and analyst on the business analytics and data science team at Gartner.

“With visualization, a picture is worth a thousand words,” he says. “How are you making the story visually engaging? Are you using a graphic or iconography? That doesn’t mean it can’t be a table or very dry information, but you’d better have a visual component.”

The narrative is the story itself — the who, what, where, why. It’s the emotional arc, Krensky says. “If it’s about sales forecasting for the quarter, are we doing great, or are people going to lose their jobs?”

Context is what the people hearing this story need to know. Why one sales representative is always outperforming all the other sales reps is an example of the context for a data story, Krensky says.

Grace Lee, chief data and analytics officer at The Bank of Nova Scotia (known as Scotiabank), says blending context and narrative requires a keen understanding of what makes a story compelling.

“The way that we think about stories, if we remove the data term, it needs a plot that you care about, it needs characters that you root for, and it requires a destination or an outcome that you believe in and aspire to,” she says.

Being able to put the data into context in the form of a narrative allows people to care more and to understand what the action is that comes out of it, Lee says. In addition to focusing on storytelling as a discipline, Lee’s team is also working to create more storytellers across the organization.

“The way we’re educating people around storytelling is really around action orientation, helping people create those narratives, providing more of the context, and allowing people to see the clear line between the data, the insight, and the action to come,” she says.

Lee sees the role of Scotiabank’s data and analytics organization as the storyteller for the enterprise because it’s only in the data that some of the insights about what customers need and want appear.

Key steps in data storytelling

Lars Sudmann, owner of Sudmann & Co., a Belgium-based consulting and management training network, offers insight into the steps that go into data storytelling.

Identify the ‘aha’ insights: One of the greatest pitfalls of data-based presentations is the “data dump.” Rather than overwhelm the audience with data and visualizations, CIOs and data analytics officers should identify one to three key “aha” insights from the data and focus on these. What are the surprising, absolutely key things one needs to know? Identify them and build your presentation around them, Sudmann says.

Share the genesis story of the data: To tell a good story with data, a good starting point is the genesis, i.e., the origin of the data. Where does it come from? This is especially important when storytellers present data sets for the first time.

Transform surprising turning points into engaging transitions: When storytellers present data and facts, they should share where the data, graphs, or trendlines make “surprising” moves. Is there a jump? Is there a turning point? Doing so can provide compelling transitions to deeper analysis, for example: “Normally we would think the data does X, but here we see that it declined. Let’s explore why this happened.”

Develop your data: One of the biggest issues in giving presentations today is that people throw heavy data on the screen and then play “catch-up” with words such as “This is a crowded slide, but let me explain” or “This might be difficult, but…” Instead, storytellers should develop their data step by step. “I am not a fan of fancy animations, but for instance in PowerPoint there is one animation that I recommend: the ‘appear’ animation,” Sudmann says. “With it one can harmonize what one sees and what one says, and with that a data story can be built step by step.”

Emphasize and highlight to bring your story to life: Once storytellers have identified the flow and key aspects of their data stories, it’s important to emphasize and highlight key points with their voices and body language. Show the data, point to it on screen, walk to it, circle it; then it comes to life, Sudmann says.

Have a ‘hero’ and a ‘villain’: To make stories more engaging, data storytellers should also consider developing a hero, e.g., the “good tickets,” and a villain, e.g., “the bad tickets raised because of not reading the FAQs,” and then show their development over time and in different departments, as well as the “hero’s journey” to success, Sudmann advises.

Data storytelling tips for success

Rudy is a firm believer in letting the data unfold by telling a story so that when the storyteller finally gets to the punch line or the “so what, do what” there is full alignment on their message.

As such, storytellers should start at the top and set the stage with the “what.” For example, in the case of an IT benchmark, the storyteller might start off saying that the total IT spend is $X million per year (remember, the data has already been validated, so everyone is nodding).

The storyteller should then break it down into five buckets: people, hardware, software, services, other (more nodding), Rudy says. Then further break it down into these technology areas: cloud, security, data center, network, and so on (more nodding).

Next the storyteller reveals that based on the company’s current volume of usage, the unit cost is $X for each technology area and explains that compared to competitors of similar size and complexity, the storyteller’s organization spends more in certain areas, for example, security (now everyone is really paying attention), Rudy says.

“You have thus led your audience to the ‘so what’ part of the story, namely, that there are areas for improvement,” she says. “The next question in your audience’s mind is most likely, ‘Why?’ And finally, ‘So what do we do about it?’”

The rest of the story leverages a common understanding of the validity of the data to make recommendations for change and the actions necessary to make those changes, according to Rudy. Data in this story created the credibility necessary to establish a call to arms, a reason to change that is indisputable.

And taking the old adage “if a tree falls in a forest and no one is around to hear it, does it make a sound?” into consideration, it’s crucial for data storytellers to consider the medium various individuals are using to consume information and what times they’re accessing this information.

“The pandemic has definitely helped in the shift of allowing thought workers to work from home,” says Kim Herrington, senior analyst for data leadership, organization, and culture at Forrester Research. “And a lot of times you’re communicating with thought workers that are across the globe. So it’s important to think about the communication software that you’re using and the communication norms that you have with your team.”


Companies have learned to thrive — and in some cases survive — by leveraging data for competitive advantage. But how many organizations are truly data-driven enterprises?

“Data is becoming increasingly valuable, especially from a business perspective,” says Lakshmanan Chidambaram, president of Americas strategic verticals at global IT consulting firm Tech Mahindra. “After all, data can tell us a lot about a company’s processes and activities. It shows whether one is moving in the right direction, identifies areas of improvement, and suggests an appropriate process to make those improvements.”

Here are some key traits of a data-driven enterprise, according to experts.

They operate with an organization-wide data strategy

To be a data-driven enterprise requires having a cohesive, comprehensive data strategy that applies across the organization. This encompasses technology and automation, including the use of artificial intelligence (AI). But it also includes culture, governance, cybersecurity, data privacy, skills, and other components.

“The market for data governance, storage, and analytical tools has grown considerably, yet enterprises are still struggling to wrap their arms around the scope of the challenge,” Chidambaram says. “CIOs, CTOs, and [chief administrative officers] must step back and establish an enterprise-wide strategy to harness the value of data for their enterprise and integrate AI to enable sales, marketing, and operational excellence.”

This includes ensuring that the data architecture provides both data professionals and non-technical decision-makers with the tools they need to move beyond instinctual and anecdotal decisions, Chidambaram says.

“Many corporate and government enterprises are leveraging data-driven insights for improving customer service, reducing operating expenses, creating new business streams, and achieving overall business efficiency,” Chidambaram says.

Getting an organization’s leadership and workforce to commit to a data-driven approach is key to determining success, Chidambaram says. “Organizations must ensure that they [address] the following question to call themselves a truly data-driven organization: Is everybody willing to embrace data as part of the business culture?” he says.

They optimize resource allocation

It’s one thing to develop a data-driven strategy; it’s another thing entirely to effectively execute on the plan. That’s where having the right resources in place and updating them as needed is important.

“Once the strategy is defined, the people, process, and tools to support the strategy are critical for a data-driven organization,” says Kathy Rudy, partner and chief data and analytics officer at global technology research and advisory firm ISG.

For example, organizations need to have a process for building data catalogs; procedures and tools for data cleansing and data quality; defined data use cases and the right tools to support the use cases; effective and secure access to data for internal and external users; overall security to support the use cases; and a data center of excellence to support complex data requests.

From a people perspective, being a data-driven organization means having a solid team of data analysts, data scientists, data engineers, and other professionals in place, and providing the necessary training when skills need to be updated.

They emphasize data governance

Data governance is another component of the overall data strategy that warrants extra attention. Governance encompasses data security, privacy, reliability, integrity, accuracy, and other areas. It’s essential to maintaining a data-driven operation.

“Without data governance, you cannot trust that the data you are using is of high quality, is synced across data sets by a common taxonomy, or is secure,” Rudy says. “Data governance also provides the foundation for access to the data.”

ISG is often faced with disparate databases with differing taxonomies and ways of maintaining the datasets, Rudy says. “Once we established a centralized data governance methodology — with people, processes, and tools — we were able to develop new ways to use our data internally and externally for client delivery, products, and data monetization.”

The centralized approach also established proper security protocols for data access inside the business, Rudy says. “Many people think data should be democratized, though I’m not convinced of that,” she says. “Unless you truly understand the source for the data, how it was collected, the context of the data, and [how to] analyze data, improper use can lead to bad decisions.”

For example, when the ISG sales teams asked for account information, the data team began pulling reports and discovered there were multiple names for the same client. “This made it very hard to pull together a snapshot of business over time, what was selling, by whom, etc.,” Rudy says. “Lack of governance over our data led to dirty data in our system, and an incomplete picture of a client that might have led us to incorrectly design an account strategy.”
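A minimal sketch of the kind of cleanup that surfaces those duplicate client names appears below; the account names and the simple normalization rule are hypothetical, and real master-data tooling is considerably more robust.

```python
# Minimal sketch: spot multiple spellings of the same client before rolling up account history.
# Account names and the normalization rule are hypothetical examples.
from collections import defaultdict
import re

def normalize_name(name: str) -> str:
    """Lowercase, strip punctuation, and drop common suffixes so variants compare equal."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower())
    cleaned = re.sub(r"\b(inc|llc|ltd|corp|corporation)\b", "", cleaned)
    return " ".join(cleaned.split())

crm_accounts = ["Acme Corp.", "ACME Corporation", "Acme, Inc.", "Globex LLC", "Globex"]

groups = defaultdict(list)
for account in crm_accounts:
    groups[normalize_name(account)].append(account)

# Any group with more than one raw spelling is a likely duplicate to reconcile.
duplicates = {canonical: raw for canonical, raw in groups.items() if len(raw) > 1}
print(duplicates)
# {'acme': ['Acme Corp.', 'ACME Corporation', 'Acme, Inc.'], 'globex': ['Globex LLC', 'Globex']}
```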

Responsible data use is paramount for data-driven organizations, says Deepika Duggirala, senior vice president of global technology platforms at TransUnion, a provider of financial services, healthcare, and insurance technologies.

“This means securing all data within an enterprise’s data ecosystem — both in motion and at rest — while maintaining the privacy of associates and consumers,” Duggirala says. “An enterprise must be able to evolve alongside growing data protection regulations, doing so by educating all associates on US and international data privacy and protection regulations, and building security and compliance into the initial design of all data storage and consumption. This mindset is how TransUnion makes trust possible and protects our data ecosystem and its compliance.” 

They establish a broad data mindset

Building a data culture and mindset is part of the overall data strategy, but it bears special mention because it truly helps bring the strategy to life.

“All aspects of decision-making are influenced by data,” Duggirala says. “Associates are fluent in its interpretation to better understand the market and make sound decisions. This is the core of TransUnion’s product development process — product managers, customer experience designers, and developers all leverage a different facet of our data to identify solutions that solve specific needs, define launch timelines, and ensure simple, intuitive features.”

At companies that are data-driven, “there is an organization-wide acknowledgement that data is at the heart of decision-making,” Rudy says. “So, when challenges are posed, questions are asked or strategy is designed, people automatically reach for data to support decision-making.”

At ISG, “from marketing and sales materials that describe our credentials, to client deliverables where data is used to substantiate recommendations, industry briefings where we back up our knowledge and expertise with data and facts, data truly is at the heart of everything we do,” Rudy says. “Data gives businesses a competitive advantage. We view data as circular. We are continuously in the process of collecting, validating, managing, curating, and analyzing data to drive insights for all our stakeholders.”

Data-driven organizations have many drivers, says Theresa Kushner, head of the Innovation Center for North America at consulting firm NTT Data. “This means that no matter where you sit in the organization, you can have access to the data you need to do your job,” she says. “Non-data-driven organizations are usually siloed in their approach to data management.”

NTT Data research shows that a minority of organizations say data is shared seamlessly across the enterprise. “In a data-driven enterprise this is not the case,” Kushner says. “Because these groups are directed by their leadership to make decisions based on data and because they have teams that pay special attention to key data sets, they can move quickly and drive their businesses using accurate, readily available data.”

Regular collaboration is key to having a data mindset. “Data is nothing without people sharing and using it,” Kushner says. “Effective data-driven cultures depend on efficient collaboration and open communication between owners of the data and its users. This trait of a data-driven organization supersedes all others, such as training, certification, data governance, and regular process updates.”

They make data collection a primary concern

Many AI projects are shelved in short order because data scientists cannot find the data that is needed for a proposed model, Kushner says. “Oftentimes this is because the data was never collected,” she says. “Data-driven organizations do not have this problem. They know which data domains are important and necessary to the running of the business, and they ensure that these datasets are protected and curated.”

For example, most companies have customer relationship management (CRM) systems that are used by sales to record and track opportunities, Kushner says. But the data in these systems is often incomplete for customers and their transactions, especially if data entry is the responsibility of the salesperson, she says.

“This means that when data scientists want to create a customer model that identifies those customers who will buy at a particular time or from a specific channel, the data they need might not be available or complete enough to support the model,” Kushner says. “Data-driven organizations, however, understand that this data is primary to running the business and as a result ensure that data management practices are thorough for key areas.”

In many cases, to ensure that data is entered appropriately, these organizations automate sales entry processes to free sales from tedious entry tasks. “Depending on the business type or industry, key areas may change,” Kushner says. “For example, manufacturers may find that managing the information on their suppliers more closely is their key data domain. No matter what industry, data-driven organizations have a plan for collecting, managing, and using key data.”

They foster strong collaboration between IT and the business

Data-driven enterprises tend to feature good working relationships between IT and business leaders. For example, when the CIO works closely with the finance department, a company can maximize the value of financial data.

“Delivering the right information, at the right time, in the right format to executives and managers requires a close partnership between the CFO and CIO,” says Lynn Calhoun, CFO at professional services firm BDO.

“This includes getting the finance and IT teams together to define information requirements, collaborating on setting up the right IT systems and architecture to meet those requirements, and working closely to implement and support agile systems and processes that can keep pace with today’s rapidly changing business environment,” Calhoun says.

In BDO’s case, “we work closely together to understand what the business ‘needs,’ not just what they ask for, which is usually constrained by what they know,” Calhoun says. That constraint limits the ability of the business leaders to achieve their goals, he says.
