Six out of ten organizations today are using a mix of infrastructures, including private cloud, public cloud, multi-cloud, on-premises, and hosted data centers, according to the 5th Annual Nutanix Enterprise Cloud Index. Managing applications and data, especially when they’re moving across these environments, is extremely challenging. Only 40% of IT decision-makers said that they have complete visibility into where their data resides, and 85% have issues managing cloud costs. Addressing these challenges will require simplification, so it’s no surprise that essentially everyone (94%) wants a single, unified place to manage data and applications in mixed environments.

In particular, there are three big challenges that rise to the top when it comes to managing data across multiple environments. The first is data protection.

“Because we can’t go faster than the speed of light, if you want to recover data, unless you already have the snapshots and copies where that recovered data is needed, it’ll take some time,” said Induprakas Keri, SVP of Engineering for Nutanix Cloud Infrastructure. “It’s much faster to spin up a backup where the data is rather than moving it, but that requires moving backups or snapshots ahead of time to where they will be spun up, and developers don’t want to think about things like that. IT needs an automated solution.”
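
A minimal sketch of the kind of automation Keri describes, assuming a hypothetical replicate() call rather than any real Nutanix API: a policy, not the developer, decides which sites must already hold a copy of each volume’s snapshots, so recovery is always local.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    volume: str
    site: str   # where this snapshot copy currently lives

# Hypothetical policy: each protected volume must already have a copy
# at every site where it might need to be recovered.
RECOVERY_SITES = {
    "vol-finance": ["on-prem-dc", "aws-us-east-1"],
    "vol-webapp": ["aws-us-east-1", "azure-eastus"],
}

def missing_copies(snapshots: list) -> list:
    """Return (volume, site) pairs that still lack a local snapshot copy."""
    have = {(s.volume, s.site) for s in snapshots}
    return [(vol, site)
            for vol, sites in RECOVERY_SITES.items()
            for site in sites
            if (vol, site) not in have]

def replicate(volume: str, site: str) -> None:
    print(f"replicating {volume} -> {site}")  # stand-in for a real replication call

if __name__ == "__main__":
    current = [Snapshot("vol-finance", "on-prem-dc")]
    for vol, site in missing_copies(current):
        replicate(vol, site)  # run ahead of time so recovery is always local
```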

Another huge problem is managing cost—so much so that 46% of organizations are thinking about repatriating cloud applications to on-premises, which would have been unthinkable just a few years ago.

“I’m familiar with a young company whose R&D spend was $18 million and the cloud spend was $23 million, with utilization of just 11%,” Keri said. “This wasn’t as much of a concern when money was free, but those days are over, and increasingly, organizations are looking to get their cloud spend under control.”
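
The arithmetic behind that concern is stark: at 11% utilization, roughly 89% of that $23 million paid for capacity that sat idle. A back-of-the-envelope check:

```python
cloud_spend = 23_000_000   # annual cloud bill from the example above
utilization = 0.11         # fraction of paid-for capacity actually used

idle_spend = cloud_spend * (1 - utilization)
print(f"Spend on unused capacity: ${idle_spend:,.0f}")  # -> $20,470,000
```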

Cloud data management is complex, and without keeping an eye on it, costs can quickly get out of control.

The final big problem is moving workloads between infrastructures. It’s especially hard to move legacy applications to the cloud because of all the refactoring required, and it’s easy for that effort to balloon beyond its original scope. Keri has experienced this issue firsthand many times in his career.

“What we often see with customers at Nutanix is that the journey of moving applications to the cloud, especially legacy applications, is one that many had underestimated,” Keri said. “For example, while at Intuit as CISO, I was part of the team that moved TurboTax onto AWS, which took us several years to complete and involved several hundred developers.”

Nutanix provides a unified infrastructure layer that enables IT to seamlessly run applications on a single underlying platform, whether it’s on-premises, in the cloud, or even a hybrid environment. And data protection and security are integral parts of the platform, so IT doesn’t have to worry about whether data will be local for recovery or whether data is secure—the platform takes care of it.

“Whether you’re moving apps which need to be run on a platform or whether you’re building net-new applications, Nutanix provides an easy way to move them back and forth,” Keri said. “If you start with a legacy application on prem, we provide the tools to move it into the public cloud. If you want to start in the cloud with containerized apps and then want to move them on-prem or to another cloud service provider, we provide the tools to do that. Plus, our underlying platform offers data protection and security, so you don’t have to worry about mundane things like where your data needs to be. We can take the pain away from developers.”

For more information on how Nutanix can help your organization control costs, gain agility, and simplify management of apps and data across multiple environments, visit Nutanix here.


Evaluating and managing billions of dollars in IT spending across 400 tech providers in 200 countries provides valuable experience in proven ways to cut costs and accelerate IT financial management tasks. Want to tap into the cost-cutting knowledge of 60 IT cost management consultants who are engaged in hundreds of cost-reduction projects each year, saving companies 20% or more on their IT spend? Here are the top lessons learned, according to Tangoe’s cost management consultants.

Companies Can Cut Costs by 10-40% Across Multiple IT Domains 

Creating an effective methodology for IT expense management and optimization is no longer a strategy used only by large enterprises in specific use cases. It’s an approach used widely by companies of all sizes and applied to the entire IT environment—cloud, mobile, network, and security. Although savings vary across each IT domain, an effective cost optimization program typically produces significant savings.  

At Tangoe, we commonly see companies: 

Save 20% on their IT costs overall
Save 10-15% in telecom costs through service optimization, and as much as 20-25% or more when combined with an effective contract negotiation consultancy
Cut mobility costs by 15-30% while improving both IT productivity and the end-user experience
Save 15-40% in cloud costs by eliminating unnecessary services and reallocating underutilized IaaS and SaaS resources
See triple-digit ROI, on average, within the first year of investing in an IT expense management platform
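
To see how those ranges compound, here is a small estimator applied to a hypothetical spend breakdown; the category amounts are invented for illustration, and the rates are the midpoints of the figures above.

```python
# Hypothetical annual spend by IT domain (illustrative numbers only).
spend = {"telecom": 2_000_000, "mobility": 1_000_000, "cloud": 4_000_000}

# Midpoints of the savings ranges cited above:
# telecom 10-15% -> 12.5%; mobility 15-30% -> 22.5%; cloud 15-40% -> 27.5%
savings_rate = {"telecom": 0.125, "mobility": 0.225, "cloud": 0.275}

total_savings = sum(spend[d] * savings_rate[d] for d in spend)
print(f"Estimated annual savings: ${total_savings:,.0f}")  # -> $1,575,000
```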

While cloud cost optimization and FinOps may seem like a post-pandemic trend, the IT expense management (ITEM) industry is a mature market with more than 20 years of historically proven results. Known results allow ITEM providers to offer clients the advantage of a savings guarantee, and with the market heating up, providers are actually making guarantees a contractual commitment. 

Acting on Cost Savings Is Harder Than Simply Identifying Them 

In today’s information age, AI-powered analytic tools make it easier to crunch data and pinpoint millions of dollars in potential IT cost efficiencies. But then what? Opportunities are worth nothing if you can’t capitalize on them quickly. Actioning identified opportunities is where the real work begins and where speed to savings is key, as every month that goes by is a lost opportunity that increases the savings you’ll never see.   

With staffing tight and other priorities taking precedence, all too often we see identified savings go unrealized for months if not years.  

Given the criticality of quick response, leveraging a firm to implement identified savings makes sense. Better yet, the firm should be able to automate the process to confirm those savings are actually realized and continue to be achieved on an ongoing basis. For these reasons, we recommend asking about professional services (staff augmentation) when your IT team is too overstretched. 

To cut costs faster you also need an automated IT expense management platform integrated with the service portals and dashboards of the technology service providers themselves. This way, modifications and service changes can be made faster and with less manual work.  
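
As a rough illustration of what “integrated with the service portals” means in practice, the sketch below submits a plan change through a purely hypothetical carrier API; real provider endpoints, payloads, and authentication will differ.

```python
import json
from urllib import request

# Hypothetical carrier endpoint; real provider APIs and auth differ.
CARRIER_API = "https://api.example-carrier.com/v1/lines"

def downgrade_plan(line_id: str, new_plan: str, token: str) -> None:
    """Submit a plan change directly, instead of re-keying it in a portal."""
    body = json.dumps({"plan": new_plan}).encode()
    req = request.Request(f"{CARRIER_API}/{line_id}", data=body, method="PATCH")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    with request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())
```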

Secret to Avoiding Waste: An Accurate Inventory of Services 

Rapidly changing times make for a rapidly changing corporate IT environment. It’s critical to ensure you know what IT assets you have now, understand how efficiently they’re being used, and then charge back those costs to the departments using the most resources.  

Cloud cost management is top of mind today because companies are wasting as much as 30% of their cloud resources and overspending in the cloud by as much as 50%, even 70%, according to Gartner. And it’s not just the cloud creating IT waste. Mergers, divestitures, corporate rightsizing, a return to travel, and hybrid work environments all contribute to misalignment between IT resources and business needs. Any time the company evolves, IT services need to come back into alignment, and when assets aren’t right-sized, waste, inefficiencies, and overpayments are the result.

To get rid of IT waste, you must first identify it. Knowing what you have, where you are paying too much, and where assets are going unused requires gleaning intelligence from an accurate inventory of all mobile, cloud, and network services. Visibility is only as good as your system for tracking and categorizing costs. A disciplined inventory and vendor management program establishes a corporate catalog of providers and uses automation to collect granular account information, invoices, and service data. Insights can shed light on current trends in usage and efficiency as well as serve as a launchpad for cost control, policy decision-making, and security risk reductions.  

Migration Mismanagement Slows Technology ROI 

Change is the new normal. Whether it involves moving services to the cloud, shifting employees to more secure corporate-owned mobile devices, or transitioning services to optimize and modernize a network, the management and administration of technology migrations is everything.  

At Tangoe, we see the ROI on digital transformation initiatives decline (and even turn negative) because companies underestimate the time and resources needed to carry out change. Designing, managing, and monitoring transitions becomes a full-time job that can distract internal teams from more meaningful work. In the end, it’s more efficient and less expensive to augment those internal teams with outside resources or to outsource enterprise-wide deployments altogether. Mismanaged technology migrations can significantly hinder a company’s digital innovation strategy.

Careful consideration is needed when it comes to deciding how corporate resources are allocated. We see network service transitions, SD-WAN implementations, and migrations to cloud-unified communications as areas that benefit from staff augmentation. While we all know outsourcing can help curb costs, this is where we see consultancies pay off in significant ways.

Insider Knowledge Provides a Level of Confidence That Is Priceless 

A highly dynamic mobile, cloud, and network environment highlights the importance of obtaining insider intelligence. When corporate service transformation is on the line, IT spending decisions shouldn’t be made in a vacuum. IT budgeting decisions are far easier when you can consult an authority on the latest pricing benchmarks for services or on tips for negotiating telecom contracts in your favor. These consultants evaluate how millions of dollars are spent (and misspent) every year, and they bring valuable insights into tech spending trends that can help you compare your corporate strategies against hundreds of other companies. Consultants are versed in helping tackle the big stuff:

Fiduciary responsibility when IT budgets and spending are rising despite slow economic growth
Reining in cloud sprawl and cloud costs while strengthening cloud security
Establishing governance after innovation has run amok

After all, it’s the insider intelligence that helps CIOs and CTOs sleep at night. That confidence is worth its weight in gold. 

To learn more about IT expense and asset management services, visit us here.   


Data governance definition

Data governance is a system for defining who within an organization has authority and control over data assets and how those data assets may be used. It encompasses the people, processes, and technologies required to manage and protect data assets.

The Data Governance Institute defines it as “a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

The Data Management Association (DAMA) International defines it as the “planning, oversight, and control over management of data and the use of data and data-related sources.”

Data governance framework

Data governance may best be thought of as a function that supports an organization’s overarching data management strategy. Such a framework provides your organization with a holistic approach to collecting, managing, securing, and storing data. To help understand what a framework should cover, DAMA envisions data management as a wheel, with data governance as the hub from which the following 10 data management knowledge areas radiate:

Data architecture: The overall structure of data and data-related resources as an integral part of the enterprise architecture
Data modeling and design: Analysis, design, building, testing, and maintenance
Data storage and operations: Structured physical data assets storage deployment and management
Data security: Ensuring privacy, confidentiality, and appropriate access
Data integration and interoperability: Acquisition, extraction, transformation, movement, delivery, replication, federation, virtualization, and operational support
Documents and content: Storing, protecting, indexing, and enabling access to data found in unstructured sources and making this data available for integration and interoperability with structured data
Reference and master data: Managing shared data to reduce redundancy and ensure better data quality through standardized definition and use of data values
Data warehousing and business intelligence (BI): Managing analytical data processing and enabling access to decision support data for reporting and analysis
Metadata: Collecting, categorizing, maintaining, integrating, controlling, managing, and delivering metadata
Data quality: Defining, monitoring, maintaining data integrity, and improving data quality

When establishing a strategy, each of the above facets of data collection, management, archiving, and use should be considered.

The Business Application Research Center (BARC) warns that data governance is a highly complex, ongoing program, not a “big bang initiative,” and it runs the risk of participants losing trust and interest over time. To counter that, BARC recommends starting with a manageable or application-specific prototype project and then expanding across the company based on lessons learned.

BARC recommends the following steps for implementation:

Define goals and understand benefits
Analyze current state and delta analysis
Derive a roadmap
Convince stakeholders and budget project
Develop and plan the data governance program
Implement the data governance program
Monitor and control

Data governance vs. data management

Data governance is just one part of the overall discipline of data management, though an important one. Whereas data governance is about the roles, responsibilities, and processes for ensuring accountability for and ownership of data assets, DAMA defines data management as “an overarching term that describes the processes used to plan, specify, enable, create, acquire, maintain, use, archive, retrieve, control, and purge data.”

While data management has become a common term for the discipline, it is sometimes referred to as data resource management or enterprise information management (EIM). Gartner describes EIM as “an integrative discipline for structuring, describing, and governing information assets across organizational and technical boundaries to improve efficiency, promote transparency, and enable business insight.”

Importance of data governance

Most companies already have some form of governance for individual applications, business units, or functions, even if the processes and responsibilities are informal. As a practice, it is about establishing systematic, formal control over these processes and responsibilities. Doing so can help companies remain responsive, especially as they grow to a size in which it is no longer efficient for individuals to perform cross-functional tasks. Several of the overall benefits of data management can only be realized after the enterprise has established systematic data governance. Some of these benefits include:

Better, more comprehensive decision support stemming from consistent, uniform data across the organization
Clear rules for changing processes and data that help the business and IT become more agile and scalable
Reduced costs in other areas of data management through the provision of central control mechanisms
Increased efficiency through the ability to reuse processes and data
Improved confidence in data quality and documentation of data processes
Improved compliance with data regulations

Goals of data governance

The goal is to establish the methods, set of responsibilities, and processes to standardize, integrate, protect, and store corporate data. According to BARC, an organization’s key goals should be to:

Minimize risks
Establish internal rules for data use
Implement compliance requirements
Improve internal and external communication
Increase the value of data
Facilitate the administration of the above
Reduce costs
Help to ensure the continued existence of the company through risk management and optimization

BARC notes that such programs always span the strategic, tactical, and operational levels in enterprises, and they must be treated as ongoing, iterative processes.

Data governance principles

According to the Data Governance Institute, eight principles are at the center of all successful data governance and stewardship programs:

All participants must have integrity in their dealings with each other. They must be truthful and forthcoming in discussing the drivers, constraints, options, and impacts for data-related decisions.
Data governance and stewardship processes require transparency. It must be clear to all participants and auditors how and when data-related decisions and controls were introduced into the processes.
Data-related decisions, processes, and controls subject to data governance must be auditable. They must be accompanied by documentation to support compliance-based and operational auditing requirements.
Programs must define who is accountable for cross-functional data-related decisions, processes, and controls.
Programs must define who is accountable for stewardship activities that are the responsibilities of individual contributors and groups of data stewards.
Programs must define accountabilities in a manner that introduces checks and balances between business and technology teams, and between those who create/collect information, those who manage it, those who use it, and those who introduce standards and compliance requirements.
Programs must introduce and support standardization of enterprise data.
Programs must support proactive and reactive change management activities for reference data values and the structure/use of master data and metadata.

Best practices of data governance

Data governance strategies must be adapted to best suit an organization’s processes, needs, and goals. Still, there are six core best practices worth following:

Identify critical data elements and treat data as a strategic resource.
Set policies and procedures for the entire data lifecycle.
Involve business users in the governance process.
Don’t neglect master data management.
Understand the value of information.
Don’t over-restrict data use.

For more on doing data governance right, see “6 best practices for good data governance.”

Challenges in data governance

Good data governance is no simple task. It requires teamwork, investment, and resources, as well as planning and monitoring. Some of the top challenges of a data governance program include:

Lack of data leadership: Like other business functions, data governance requires strong executive leadership. The leader needs to give the governance team direction, develop policies for everyone in the organization to follow, and communicate with other leaders across the company.
Lack of resources: Data governance initiatives can struggle for lack of investment in budget or staff. Data governance must be owned by and paid for by someone, but it rarely generates revenue on its own. Data governance and data management overall, however, are essential to leveraging data to generate revenue.
Siloed data: Data has a way of becoming siloed and segmented over time, especially as lines of business or other functions develop new data sources, apply new technologies, and the like. Your data governance program needs to continually break down new siloes.

For more on these difficulties and others, see “7 data governance mistakes to avoid.”

Data governance software and vendors

Data governance is an ongoing program rather than a technology solution, but there are tools with data governance features that can help support your program. The tool that suits your enterprise will depend on your needs, data volume, and budget. According to PeerSpot, some of the more popular solutions include:

Collibra Governance: Collibra is an enterprise-wide solution that automates many governance and stewardship tasks. It includes a policy manager, data helpdesk, data dictionary, and business glossary.
SAS Data Management: Built on the SAS platform, SAS Data Management provides a role-based GUI for managing processes and includes an integrated business glossary, SAS and third-party metadata management, and lineage visualization.
erwin Data Intelligence (DI) for Data Governance: erwin DI combines data catalog and data literacy capabilities to provide awareness of and access to available data assets. It provides guidance on the use of those data assets and ensures data policies and best practices are followed.
Informatica Axon: Informatica Axon is a collection hub and data marketplace for supporting programs. Key features include a collaborative business glossary and the ability to visualize data lineage and generate data quality measurements based on business definitions.
SAP Data Hub: SAP Data Hub is a data orchestration solution intended to help you discover, refine, enrich, and govern all types, varieties, and volumes of data across your data landscape. It helps organizations establish security settings and identity control policies for users, groups, and roles, and streamline best practices and processes for policy management and security logging.
Alation: Alation is an enterprise data catalog that automatically indexes data by source. One of its key capabilities, TrustCheck, provides real-time “guardrails” to workflows. Meant specifically to support self-service analytics, TrustCheck attaches guidelines and rules to data assets.
Varonis Data Governance Suite: Varonis’s solution automates data protection and management tasks, leveraging a scalable Metadata Framework that enables organizations to manage data access, view audit trails of every file and email event, identify data ownership across different business units, and find and classify sensitive data and documents.
IBM Data Governance: IBM Data Governance leverages machine learning to collect and curate data assets. The integrated data catalog helps enterprises find, curate, analyze, prepare, and share data.

Data governance certifications

Data governance is a system, but there are some certifications that can help your organization gain an edge, including the following:

DAMA Certified Data Management Professional (CDMP)
Data Governance and Stewardship Professional (DGSP)
edX Enterprise Data Management
SAP Certified Application Associate – SAP Master Data Governance

For related certifications, see “10 master data management certifications that will pay off.”

Data governance roles

Each enterprise structures its data governance program differently, but there are some commonalities.

Steering committee

Governance programs span the enterprise, generally starting with a steering committee comprising senior management, often C-level individuals or vice presidents accountable for lines of business. Morgan Templar, author of Get Governed: Building World Class Data Governance Programs, says steering committee members’ responsibilities include setting the overall governance strategy with specific outcomes, championing the work of data stewards, and holding the governance organization accountable to timelines and outcomes.

Data owner

Templar says data owners are individuals responsible for ensuring that information within a specific data domain is governed across systems and lines of business. They are generally members of the steering committee, though may not be voting members. Data owners are responsible for:

Approving data glossaries and other data definitions
Ensuring the accuracy of information across the enterprise
Directing data quality activities
Reviewing and approving master data management approaches, outcomes, and activities
Working with other data owners to resolve data issues
Providing second-level review for issues identified by data stewards
Providing the steering committee with input on software solutions, policies, or regulatory requirements of their data domain

Data steward

Data stewards are accountable for the day-to-day management of data. They are subject matter experts (SMEs) who understand and communicate the meaning and use of information, Templar says, and they work with other data stewards across the organization as the governing body for most data decisions. Data stewards are responsible for:

Being SMEs for their data domain
Identifying data issues and working with other data stewards to resolve them
Acting as a member of the data steward council
Proposing, discussing, and voting on data policies and committee activities
Reporting to the data owner and other stakeholders within a data domain
Working cross-functionally across lines of business to ensure their domain’s data is managed and understood

More on data governance:

7 data governance mistakes to avoid
6 best practices for good data governance
The secrets of highly successful data analytics teams
What is data architecture? A framework for managing data
10 master data management certifications that will pay off


The effective management of mobile devices is a high-stakes game. Every company depends on its devices to generate revenue, yet those same devices increase vulnerability to ransomware attacks that cost an average of $4.5 million and consume 34% of IT’s time and productivity. Keeping the corporate fleet securely up and running is top of mind for business leaders, and yet the job of managing it is becoming more difficult.

Maintaining wireless technologies is more costly and complicated than maintaining traditional computers because of the wide range of device types, operating systems, services, and applications unique to them. Consider cell phones and tablets, point-of-sale devices, wearable scanners in warehouses, diagnostic devices in healthcare facilities, and smart tags monitoring behavior and processes across a variety of industries. Most companies have thousands of wireless devices to manage, and that number can grow 10X for larger enterprises.

Trends in mobile-first strategies, remote work, artificial intelligence, and the Internet of Things (IoT) have more companies taking on responsibility for an ocean of devices and services.

Although these trends help companies digitally transform, the devices create a mountain of IT and security work, not to mention expense. That explains why businesses are looking for standards to help them lighten the load.

Here are five best practices for managing your mobile strategy, the fleet itself, and the costs.

5 best practices for mobile device management

1. BYOD or corporate-owned: stay flexible in your device strategy

As mobile-first strategies have come under the spotlight, so too have the corporate policies around them. Bring Your Own Device (BYOD) approaches have increased in popularity due to their perceived low cost and convenience, but corporate-owned approaches are still common. So, which is better? Both have pros and cons:

BYOD Pros & Cons: Companies can save capital, and employees value the freedom and convenience (and the monetary reimbursements that come with it), but security concerns prevail.
Corporate-Owned Pros & Cons: Corporate ownership brings more security and can streamline company-sanctioned applications, but it hampers remote work and creates an inflexible work environment that can make employees feel more controlled.

Today’s dominant approach is to use BYOD for mobile phones and corporate ownership for laptops, but there is little confidence in it. Vanson Bourne research shows 81% of companies are considering changing their mobile strategy in the future. With best practices in the throes of change, there’s no right or wrong decision here. Executives are trying to balance the demands of mobile security with the needs of remote work and employee satisfaction, and it’s not easy. The best advice: Determine how well your current approach is working. Think of it as a trial run and consider what it would take to shift your stance.

2. Security & management: make fleet inventory the cornerstone of your approach

Security is a high-priority concern when it comes to mobile, and security professionals are quick to tell you: “You can’t secure what you can’t see.” An accurate inventory makes observability and governance possible. Without a comprehensive list of connected devices, companies cannot ensure security coverage or apply the mobile device management and unified endpoint management technologies that push updates to devices and help IT teams monitor and respond to security threats.

Most companies have a hard time keeping up with all those devices, and when phones get tossed into drawers as employees leave the company, who can blame them? It comes down to data cleanliness, which requires the resources to obtain information, track granular details, and inform decision-making. When companies and their devices change daily, discipline and dedication are key to accuracy, as are integrations and APIs that help automate inventory updates.

Here are a few data fields every inventory should include:

Unique device identifier
Model number, service type, and operating system
Security requirements and unique applications installed
Status (active or inactive)
Current owner and their location
Any accessories (case, screen protector, headset, etc.)
Vendor, service contract, and account number
Average usage/cost per month or year
Associated cost center or department
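
One way to enforce those fields is to define the inventory record as an explicit schema that every integration must satisfy. A minimal sketch, with illustrative field names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceRecord:
    """One row in the fleet inventory, mirroring the fields listed above."""
    device_id: str              # unique device identifier (e.g., IMEI)
    model: str
    service_type: str
    os: str
    security_profile: str       # required security posture / installed apps
    active: bool                # status: active or inactive
    owner: str
    location: str
    accessories: list           # case, screen protector, headset, etc.
    vendor: str
    contract_id: str
    account_number: str
    avg_monthly_cost: float
    cost_center: Optional[str] = None   # for departmental chargeback
```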

3. Lifecycle management: don’t underestimate the duties of end-to-end administration

The range of responsibilities and ongoing work required for effective device management is grossly underestimated. Devices are often viewed as static assets, but management is not a set-it-and-forget-it activity. Instead, it’s a cyclical system. Successful mobile programs address the complete and repeated nature of the device lifecycle:

Planning: Needs assessment, contracts, procurement, configuration, deployment, activation
Managing: Inventory, compliance, reporting, expenses, help desk support
Recycling: Repair and replace, decommissioning, reassignment
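
Because the lifecycle is cyclical rather than linear, stage changes are easiest to keep honest as validated transitions. A toy model of the three stages above:

```python
from enum import Enum

class Stage(Enum):
    PLANNING = "planning"
    MANAGING = "managing"
    RECYCLING = "recycling"

# The cycle described above: plan -> manage -> recycle -> plan again,
# with recycled devices either redeployed or re-entering planning.
ALLOWED = {
    Stage.PLANNING: {Stage.MANAGING},
    Stage.MANAGING: {Stage.RECYCLING},
    Stage.RECYCLING: {Stage.PLANNING, Stage.MANAGING},
}

def advance(current: Stage, new: Stage) -> Stage:
    """Reject status edits that skip or reverse the lifecycle."""
    if new not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {new.value}")
    return new
```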

It’s easy to miscalculate the time investments necessary to address these ongoing needs. Analyst firm Nemertes finds that managing 500 devices requires three dedicated, full-time staff, and the skills are not as trivial as one might expect.

Challenges arise particularly when companies experience the high employee turnover rates typical in today’s world. Complexity can also be an issue, as IT teams must work across distributed dashboards and siloed systems. Cost optimization requires even more sophisticated information gathering and analysis. Standardized request forms and automated workflows can speed processes, as can mobile management services taking on all or part of the lifecycle.

4. Cost and ROI optimization: get to a fixed cost that makes budgeting easy

When it comes to maximizing return on investment for the mobile fleet, business leaders drive toward two important goals. The first is a low cost. The second is a fixed, predictable monthly cost that makes it easy to budget and forecast business needs. Getting there may warrant several of these key actions.

Lower Your Mobile Costs: Identifying cost savings requires a well-managed inventory that ensures assets never go unused, plus an expense management tool or service that lets you quickly:

Compare your costs against the best prices in the industry.
Compare invoices to usage, ensuring billing is accurate and late payments and fees are avoided; carrier-imposed surcharges continue to rise and have a material impact on costs.
Evaluate your usage across tiered service plans, so you’re not overpaying for an unlimited data plan when a lower tier will do.
Decouple the device hardware cost from the carrier’s monthly recurring charge, which is key in cost evaluation and contract negotiation.
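
The tiered-plan check in the list above is straightforward to automate. A toy version, with invented plan prices and usage records rather than any vendor’s actual logic:

```python
# Flag lines where the billed plan tier exceeds what usage justifies.
# Plan tiers, prices, and records are illustrative assumptions.
PLAN_COST = {"unlimited": 85.0, "10GB": 55.0, "5GB": 40.0}

def right_size(line):
    used_gb, plan = line["used_gb"], line["plan"]
    best = "unlimited" if used_gb > 10 else "10GB" if used_gb > 5 else "5GB"
    if PLAN_COST[best] < PLAN_COST[plan]:
        return {"line": line["id"], "move_to": best,
                "monthly_saving": PLAN_COST[plan] - PLAN_COST[best]}
    return None  # already on the cheapest adequate tier

lines = [{"id": "A1", "plan": "unlimited", "used_gb": 2.3},
         {"id": "B2", "plan": "10GB", "used_gb": 8.9}]
print([r for r in map(right_size, lines) if r])
# -> [{'line': 'A1', 'move_to': '5GB', 'monthly_saving': 45.0}]
```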

Move from Capex to Opex: Device as a service (DaaS) providers can transition your ownership models to service models, so you can standardize costs and reach a predictable monthly price.

Forecast with Accuracy: Use historical data and predictive analytics to make data-driven estimates about future expenses.
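
Even a simple trend line over past invoices qualifies as a data-driven estimate. A minimal sketch using only the standard library (Python 3.10+), with illustrative spend figures:

```python
from statistics import linear_regression

# Twelve months of historical mobile spend, in $ thousands (invented data).
months = list(range(1, 13))
spend = [41.2, 40.8, 42.5, 43.1, 44.0, 43.6,
         45.2, 45.9, 46.3, 47.0, 47.8, 48.1]

# Fit a trend line and project one month ahead.
slope, intercept = linear_regression(months, spend)
forecast = slope * 13 + intercept
print(f"Forecast for month 13: ${forecast:,.1f}k")
```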

5. Innovation: use emerging technology to automate mobile management 

Most companies recognize that they don’t have the ability to control, manage, and optimize their fleet without a set of advanced technologies. Leverage these innovations to eliminate much of the manual work required:  

Robotic process automation, bots, and workflow engines help accelerate processes in fulfilling orders, recycling assets, and paying invoices.
Advanced analytics and artificial intelligence are key in auditing and normalizing complex data so you can benchmark industry-leading pricing, recognize unused resources, identify cost savings, and predict needs.
Integration and a catalog of APIs with IT and financial applications as well as mobile providers and telecom carriers enable you to gain real-time insights, centralize data for accelerated decision making, and automate mobile services.
Electronic data exchange is key in capturing and ingesting invoices, orders, and the latest pricing data from providers.

When relying on mobile management platforms for innovation, providers should also bring a level of industry intelligence gleaned from AI and advanced analytics. Whether they track a database of technology providers and their pricing, evaluate the way companies spend billions of dollars on technology, or leverage data to help clients with contract negotiations, they should be able to turn that intelligence into business outcomes.

Keep Stepping Up Your Mobile Operational Excellence

An effort to simplify and optimize mobile management will drive operational excellence, and mature programs work first to maintain an accurate inventory, support security, and administer the complete device lifecycle. Taking programs to the next level means moving beyond asset management into expense management and innovation to further optimize mobile strategies.

If you’re just starting, focus on simplification by gaining visibility and streamlining manual processes. When you’re ready to advance, start leveraging emerging technologies to audit usage and align contract terms with payments. Reconciling usage and spend against terms is the secret to pinpointing inefficiencies and cost savings that make for a well-managed mobile fleet.

To learn more about mobility management services, visit us here.  


Every futurist and forecaster I have talked to is convinced the transformative technology of the next seven years is artificial intelligence. Everyone seems to be talking about AI. Unfortunately, most of these conversations do not lead to value creation or greater understanding. And, as an IT leader, you can bet these same conversations are reverberating throughout your organization — in particular, in the C-suite.

CIOs need to jump into the conversational maelstrom, figure out which stakeholders are talking about AI, inventory what they are saying, remediate toxic misconceptions, and guide the discussion toward value-creating projects and processes.

A brief history of AI hype and impact

AI has been part of the IT conversation since the term was coined by Stanford University computer scientist John McCarthy in 1956. Conversations around AI have generally tracked alongside multiple waves of enthusiasm and valleys of disappointment for the technology. In 1983 the prevalent conversation regarding AI was “It’s coming, it’s coming!” thanks in part to Edward Feigenbaum and Pamela McCorduck’s The Fifth Generation: Artificial Intelligence and Japan’s Computer Challenge to the World. And then just a year later, in 1984, a subset of AI startup companies in Silicon Valley collapsed, spectacularly ushering in a period known as “the AI winter.” At that point, AI conversations, when they occurred, typically concluded with the determination “not yet.”

Around the turn of the century we — most of us unknowingly — entered the age of artificial narrow intelligence (ANI), sometimes referred to as “weak AI.” ANI is AI that specializes in one area. John Zerilli, writing in A Citizen’s Guide to Artificial Intelligence, contends, “Every major AI in existence today is domain-specific” — i.e., ANI.

The general path forward for ANI has been that it moves into a given domain, and 7 to 10 years later it becomes impossible to compete in or perform that particular task or activity without AI. Executives need to have tactical conversations regarding which domains and activity areas (in AI-speak, which definable problems and measurable goals) should be targeted with which ANI resources.

By 2009 we were surrounded by invisible ANI, in the form of purchase, viewing, and listening recommendations; medical diagnostics; university admissions tasks; job placement; and more. Today ANI is ubiquitous, invisible, and fundamentally misunderstood. Ray Kurzweil, computer scientist, futurist, and director of engineering at Google, keeps telling people that if AI systems went on strike, “our civilization would be crippled.”

Today the general population is not talking substantively about AI, despite the fact that ahead-of-the-curve high performers have concluded that one can never outcompete those who use AI effectively.

In The Age of AI: And Our Human Future, Henry A. Kissinger, Eric Schmidt, and Daniel Huttenlocher tell us that “AI will usher in a world in which decisions are made in three primary ways: by humans [which is familiar]; by machines [which is becoming familiar], and by collaboration between humans and machines.” Organizations need to have conversations detailing how critical decisions will be made.

Taking practical steps

Organizations need to have conversations with every employee to determine their preferences regarding what kind of AI assistance they need to maximize their performance/engagement.

One of the most important conversations about AI that is not happening enough today is how it should be regulated. In his still-relevant mega-best-seller Future Shock, my former boss Alvin Toffler correctly prophesied a technology-intensive future and counseled the need for a technology ombudsman, “a public agency charged with receiving, investigating, and acting on complaints having to do with irresponsible application of technology.”

Fast forward to 2017 when legal scholar Andrew Tutt wrote “An FDA for Algorithms,” in Administrative Law Review, explaining the need for “critical thought about how best to prevent, deter, and compensate for the harms that they cause” and a government agency specifically tailored for that purpose.

One of the conversations each and every one of us has to have is with our elected representatives. What is their position, and what is their understanding of AI, its impacts, and its potential harms?

Demis Hassabis, CEO of DeepMind Technologies, the company acquired by Google that created AlphaGo, the program that beat the world Go champion in 2016, cautions that AI is now “on the cusp” of being able to make tools that could be deeply damaging to human civilization.

Elon Musk, Martin Rees — Astronomer Royal, astrophysicist, and author of On The Future: Prospects for Humanity — and the late Stephen Hawking have each warned about misusing, misunderstanding, mismanaging, and under-regulating AI.

John Brockman, who has served as literary agent for most of the seminal thinkers in the AI space and is editor of Possible Minds: Twenty-Five Ways of Looking at AI, argues that “AI is too big for any one perspective.” The best way to expand one’s understanding of this incredibly important topic is engaging in conversations. And that includes within the walls of your business.

Don’t let your organization lead itself astray with an overeager approach to AI.


Technology mergers and acquisitions are on the rise, and any one of them could throw a wrench into your IT operations.

After all, many of the software vendors you rely on for point solutions likely offer cross-platform or multiplatform products, linking into your chosen ERP and its main competitors, for example, or to your preferred hyperscaler, as well as other cloud services and components of your IT estate.

What’s going to happen, then, if that point solution is acquired by another vendor — perhaps not your preferred supplier — and integrated into its stack?

The question is topical: Hyperconverged infrastructure vendor Nutanix, used by many enterprises to unify their private and public clouds, has been the subject of takeover talk ever since Bain Capital invested $750 million in it in August 2020. Rumored buyers have included IBM, Cisco, and Bain itself, and in December 2022 reports named HPE as a potential acquirer of Nutanix.

We’ve already seen what happened when HPE bought hyperconverged infrastructure vendor SimpliVity back in January 2017. Buying another vendor in the same space isn’t out of the question, as Nutanix and SimpliVity target enterprises of different sizes.

Prior to its acquisition by HPE, SimpliVity supported its hardware accelerator and software on servers from a variety of vendors. It also offered a hardware appliance, OmniCube, built on OEM servers from Dell. Now, though, HPE only sells SimpliVity as an appliance, built on its own ProLiant servers.

Customers of Nutanix who aren’t customers of HPE might justifiably be concerned, but they could just as easily worry about the prospects of an acquisition by IBM, the focus of earlier Nutanix rumors. IBM no longer makes its own x86 servers, but it might focus on integrating the software with its Red Hat Virtualization platform and IBM Cloud, to the detriment of customers relying on other integrations.

What to ask

The question CIOs need to ask themselves is not who will buy Nutanix, but what to do if a key vendor is acquired or otherwise changes direction — a fundamental facet of any vendor management plan.

“If your software vendor is independent then the immediate question is: Is the company buying this one that I’m using? If that’s true, then you’re in a better position. If not, then you immediately have to start figuring out your exit strategy,” says Tony Harvey, a senior director and analyst at Gartner who advises on vendor selection.

A first step, he says, is to figure out the strategy of the acquirer: “Are they going to continue to support it as a pure-play piece of software that can be installed on any server, much like Dell did with VMware? Or is it going to be more like HPE with SimpliVity, where effectively all non-HPE hardware was shut down fairly rapidly?” CIOs should also be looking at what the support structure will be, and the likely timescale for any changes.

Harvey’s focus is on data center infrastructure but, he says, whether the acquirer is a server vendor, a hyperscaler, or a bigger software vendor, “It’s a similar calculation.” There’s more at stake if you’re not already a customer of the acquirer.

A hyperscaler buying a popular software package will most likely be looking to use it as an on-ramp to its infrastructure, moving the management plane to the cloud but allowing existing customers to continue running the software on premises on generic hardware for a while, he says: “You’ve got a few years of runway, but now you need to start thinking about your exit plan.”

It’s all in the timing

The best time to plant a tree, they say, is 20 years ago, and the second best is right now. You won’t want your vendor exit plans hanging around quite as long, but now is also a great time to make or refresh them.

“The first thing to do is look at your existing contract. Migrating off this stuff is not a short-term project, so if you’ve got a renewal coming up, the first thing is to get the renewal done before anything like this happens,” says Harvey. If you just renewed, you’ll already have plenty of runway.

Then, talk to the vendor to understand their product roadmap — and tell them you’re going to hold them to it. “If that roadmap meets your needs, maybe you stay with that vendor,” he says. If it doesn’t, “You know where you need to go.”

Harvey pointed to Broadcom’s acquisition of Symantec’s enterprise security business in 2019 — and the subsequent price hikes for Symantec products — as an example of why it’s helpful to get those contract terms locked in early. Customer backlash from those price changes also explains why Broadcom is so keen to talk about its plans for VMware following its May 2022 offer to buy the company from Dell.

The risks that could affect vendors go far beyond acquisitions or other strategic changes: There’s also their general financial health, their ability to deliver, how they manage cybersecurity, regulatory or legislative changes, and other geopolitical factors.

Weigh the benefits

“You need to be keeping an eye on these things, but obviously you can’t war-game every event, every single software vendor,” he says.

Rather than weigh yourself down with plans for every eventuality, rank the software you use according to how significant it is to your business, and how difficult it is to replace, and have a pre-planned procedure in case it is targeted for acquisition.

“You don’t need to do that for every piece of software, but moving from SAP HANA to Oracle ERP or vice versa is a major project, and you’d really want to think about that.”
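
One lightweight way to apply Harvey’s ranking advice is to score each vendor on business significance and replacement difficulty, and pre-plan only for the high-exposure ones. A toy scoring pass, with invented vendors, scores, and thresholds:

```python
# Rank vendors by (business significance x replacement difficulty), 1-5 each.
# Vendors and scores below are invented for illustration.
vendors = {
    "ERP suite":       {"significance": 5, "replace_difficulty": 5},
    "HCI platform":    {"significance": 4, "replace_difficulty": 4},
    "Point BI add-on": {"significance": 2, "replace_difficulty": 1},
}

def exposure(v: dict) -> int:
    return v["significance"] * v["replace_difficulty"]

for name, v in sorted(vendors.items(), key=lambda kv: -exposure(kv[1])):
    plan = "pre-plan exit strategy" if exposure(v) >= 12 else "monitor only"
    print(f"{name:16} exposure={exposure(v):2}  -> {plan}")
```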

There is one factor in CIOs’ favor when it comes to such important applications, he says, citing the example of Broadcom’s planned acquisition of VMware: “It’s the kind of acquisition that does get ramped up to the Federal Trade Commission and the European Commission, and gets delayed for six months as they go through all the legal obligations, so it really does give you some time to plan.”

It’s also important to avoid analysis paralysis, he says. If you’re using a particular application, it’s possible that the business value it delivers now outweighs the consequences of the vendor perhaps being acquired at some time in the future. Or perhaps the functionality it provides is really just a feature that will one day be rolled into the larger application it augments, in which case it can be treated as a short-term purchase.

“You certainly should look at your suppliers and how likely they are to be bought, but there’s always that trade off,” he concludes.


Risk management and mitigation is a high priority for CEOs and other senior executives worldwide — including CIOs and cybersecurity executives. The fact is, it’s impossible to separate risk from technology implementations and the potential cybersecurity vulnerabilities they present.

One of the biggest challenges of risk management, as it relates to IT, is the emergence of a growing number of government and industry regulations regarding data privacy and security. The difficulty of complying with all the regulations — particularly for heavily regulated organizations such as financial services firms, healthcare institutions and government agencies — is daunting.

Some of the regulations that address specific sectors have been in place for a number of years. For example, in financial services the Gramm–Leach–Bliley Act (GLBA) requires financial firms to protect customer data and disclose all of their data-sharing practices with customers.

In the healthcare sector, the Health Insurance Portability and Accountability Act (HIPAA) requires the protection of sensitive patient health information from being disclosed without the patient’s consent or knowledge. Risk management and technology leaders in the industry have been grappling with HIPAA compliance since the law was enacted in 1996.

In the US federal government, agencies have to deal with the Federal Risk and Authorization Management Program (FedRAMP), a government-wide initiative that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services.

And in retail and other sectors, companies need to be compliant with the Payment Card Industry Data Security Standard (PCI DSS), a cyber security standard for organizations that handle branded credit cards from the major card companies. The PCI Standard, mandated by the card brands and administered by the Payment Card Industry Security Standards Council, was created to increase controls around cardholder data to reduce credit card fraud.

More recently, the General Data Protection Regulation (GDPR) was enacted in the European Union (EU) in 2018 to protect the privacy of data about EU citizens. GDPR’s primary aim is to enhance individuals’ control and rights over their personal data. And the California Consumer Privacy Act (CCPA) was enacted in the state in 2018 to enhance privacy rights and consumer protection for residents of California.

Many other states have pending legislation related to data protection and privacy, and some of these might be enacted in the near future.

Then there’s the American Data Privacy and Protection Act (ADPPA), a proposed federal online privacy bill that would regulate how organizations keep and use consumer data. The bipartisan bill is the first American consumer privacy bill to pass committee markup. It rests on several main principles, including data minimization, individual ownership, and private right of action, and the burden of evaluating each organization’s programs would fall to the organization itself.

As the first federal user data privacy legislation, the ADPPA would largely supersede state laws such as the CCPA and the Colorado Privacy Act.

We’re in an environment in which governments, organizations, consumers, business partners, and regulators alike feel increased risk aversion and a desire for greater security consciousness, and that motivates regulatory change.

Regulators, in particular, want more transparency and increased controllability from organizations in virtually all industries regarding data and how it’s used.

How to manage the risks

With all of this data privacy regulatory activity going on, how can organizations ensure they remain in compliance?

One of the most important things is to be aware of any existing and emerging regulations that apply to the company. This goes without saying for regulated industries. But really, any business needs to devote resources to evaluating the regulatory scene, including keeping up on all the latest regulatory activities that apply to the organization.

Create a team that can assess and coordinate compliance activities. Whether this team is led by the head of risk management, compliance, audit, data governance or some other executive, the CIO and the CISO need to be involved because so much of data privacy involves the IT infrastructure. Other interested parties should include the legal and human resources departments.

Close and ongoing coordination among different facets of the organization is vital because data is such an all-encompassing entity within businesses today.

Another important organizational practice is to hire the necessary compliance experts. As with any technology-related skills today, it might be a challenge to find and retain people. If this proves to be impossible, there are countless consulting firms that handle data privacy issues for companies.

Of course, it’s also important to have access to the right tools and services to help ensure data privacy compliance. These tools should be capable of identifying vulnerability and compliance exposures within a very short period of time across widely distributed infrastructure components.

Some conduct vulnerability and compliance assessments against various operating systems, applications and security configurations and policies. They provide the data needed to help eliminate exposures, enhance overall security and simplify the preparation for audits.
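
Conceptually, such an assessment is a comparison of each endpoint’s reported configuration against a policy baseline. A toy illustration with an invented policy and host data, not any particular vendor’s checks:

```python
# Toy compliance check: compare each host's reported config against policy.
# Policy keys, thresholds, and host records are invented for illustration.
POLICY = {"disk_encryption": True, "os_min_version": 22.04, "firewall": True}

hosts = [
    {"name": "web-01", "disk_encryption": True,  "os_min_version": 22.04, "firewall": True},
    {"name": "db-02",  "disk_encryption": False, "os_min_version": 20.04, "firewall": True},
]

def exposures(host: dict) -> list:
    """Return the policy keys this host fails."""
    failed = []
    for key, required in POLICY.items():
        value = host.get(key)
        ok = value >= required if isinstance(required, float) else value == required
        if not ok:
            failed.append(key)
    return failed

for h in hosts:
    if bad := exposures(h):
        print(f"{h['name']}: non-compliant on {', '.join(bad)}")
# -> db-02: non-compliant on disk_encryption, os_min_version
```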

Compliance functions are maturing, moving from a reactive and advisory role to becoming a proactive partner with the business, according to IT consulting and services firm Accenture.

A study the firm released in May 2022 showed that there’s an increased commitment to establishing a culture of shared compliance responsibility across the enterprise. The firm surveyed 860 compliance leaders and found that nearly half planned to upskill their compliance staff to drive a culture of compliance across the enterprise, and about 40% planned to invest in new technology to achieve this goal.

More than half of the respondents said they are using leading technologies to strengthen their compliance function, and 93% said new technologies such as artificial intelligence and cloud make compliance easier by automating human tasks, standardizing processes, and making compliance more effective and efficient.

Assess the risk of your organization with the Tanium Risk Assessment. Your customized risk report will include your risk score, proposed implementation plan, how you compare to industry peers, and more.

Risk Management

Risk management and mitigation is a high priority for CEOs and other senior executives worldwide — including CIOs and cybersecurity executives. The fact is, it’s impossible to separate risk from technology implementations and the potential cybersecurity vulnerabilities they present.

One of the biggest challenges of risk management, as it relates to IT, is the emergence of a growing number of government and industry regulations regarding data privacy and security. The difficulty of complying with all the regulations — particularly for heavily regulated organizations such as financial services firms, healthcare institutions and government agencies — is daunting.

Some of the regulations that address specific sectors have been in place for a number of years. For example, in financial services the Gramm–Leach–Bliley Act (GLBA) requires financial firms to protect customer data and disclose all of their data-sharing practices with customers.

In the healthcare sector, the Health Insurance Portability and Accountability Act (HIPAA) requires the protection of sensitive patient health information from being disclosed without the patient’s consent or knowledge. Risk management and technology leaders in the industry have been grappling with HIPAA compliance since the law was enacted in 1996.

In the US federal government, agencies have to deal with the Federal Risk and Authorization Management Program (FedRAMP), a government-wide initiative that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services.

And in retail and other sectors, companies need to be compliant with the Payment Card Industry Data Security Standard (PCI DSS), a cyber security standard for organizations that handle branded credit cards from the major card companies. The PCI Standard, mandated by the card brands and administered by the Payment Card Industry Security Standards Council, was created to increase controls around cardholder data to reduce credit card fraud.

More recently, the General Data Protection Regulation (GDPR) was enacted in the European Union (EU) in 2018 to protect the privacy of data about EU citizens. GDPR’s primary aim is to enhance individuals’ control and rights over their personal data. And the California Consumer Privacy Act (CCPA) was enacted in the state in 2018 to enhance privacy rights and consumer protection for residents of California.

Many other states have pending legislation related to data protection and privacy, and some of these might be enacted in the near future.

Then there’s the American Data Privacy and Protection Act (ADPPA) a proposed federal online privacy bill that would regulate how organizations keep and use consumer data. The bipartisan bill is the first American consumer privacy bill to pass committee markup. ADPPA would regulate how organizations keep and use consumer data. It has several main principles, including data minimization, individual ownership, and private right of action. The burden of evaluating each organization’s programs would fall to the organization.

If passed, the ADPPA would be the first federal consumer data privacy law, and it would largely supersede state laws such as the CCPA and the Colorado Privacy Act.

Governments, organizations, consumers, business partners and regulators alike are growing more risk-averse and more security-conscious, and that shift is driving regulatory change.

Regulators, in particular, want more transparency from organizations in virtually all industries about what data they hold and how they use it, along with stronger controls over that data.

How to manage the risks

With all of this data privacy regulatory activity going on, how can organizations ensure they remain in compliance?

One of the most important steps is staying aware of existing and emerging regulations that apply to the company. That goes without saying in regulated industries, but any business needs to devote resources to evaluating the regulatory scene and keeping up with the latest regulatory activities that affect the organization.

Create a team that can assess and coordinate compliance activities. Whether this team is led by the head of risk management, compliance, audit, data governance or some other executive, the CIO and the CISO need to be involved because so much of data privacy involves the IT infrastructure. Other interested parties should include the legal and human resources departments.

Close, ongoing coordination across these functions is vital because data touches nearly every part of the business today.

Another important organizational practice is to hire the necessary compliance experts. As with any technology-related skills today, finding and retaining these people can be a challenge. If hiring proves impossible, many consulting firms handle data privacy compliance on companies’ behalf.

Of course, it’s also important to have access to the right tools and services to help ensure data privacy compliance. These tools should be able to identify vulnerability and compliance exposures quickly, even across widely distributed infrastructure.

Some of these tools run vulnerability and compliance assessments against operating systems, applications, and security configurations and policies. They provide the data needed to eliminate exposures, enhance overall security and simplify audit preparation.
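
To make that concrete, here is a minimal sketch in Python of the kind of check such an assessment performs. The HostConfig record, the three controls and the sample fleet are hypothetical stand-ins, not any vendor’s actual schema; a real tool would pull live configuration data from each managed endpoint and evaluate far richer policy sets.

```python
from dataclasses import dataclass

@dataclass
class HostConfig:
    """Hypothetical snapshot of one endpoint's security-relevant settings."""
    hostname: str
    os_patch_current: bool       # OS is at the latest approved patch level
    disk_encrypted: bool         # full-disk encryption is enabled
    password_min_length: int     # enforced minimum password length

# Each control maps a human-readable name to a pass/fail predicate.
CONTROLS = {
    "OS patches current": lambda h: h.os_patch_current,
    "Disk encryption enabled": lambda h: h.disk_encrypted,
    "Password length >= 12": lambda h: h.password_min_length >= 12,
}

def assess(hosts):
    """Return (hostname, failed control) pairs for audit preparation."""
    return [
        (h.hostname, name)
        for h in hosts
        for name, check in CONTROLS.items()
        if not check(h)
    ]

if __name__ == "__main__":
    fleet = [
        HostConfig("hr-laptop-01", True, True, 14),
        HostConfig("fin-server-02", False, True, 8),
    ]
    for hostname, control in assess(fleet):
        print(f"{hostname}: FAILED {control}")
```

Run against the sample fleet, the script flags fin-server-02 for missing patches and a weak password policy, exactly the kind of exposure list an auditor asks for.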

Compliance functions are maturing, moving from a reactive and advisory role to becoming a proactive partner with the business, according to IT consulting and services firm Accenture.

A study the firm released in May 2022 showed that there’s an increased commitment to establishing a culture of shared compliance responsibility across the enterprise. The firm surveyed 860 compliance leaders and found that nearly half planned to upskill their compliance staff to drive a culture of compliance across the enterprise, and about 40% planned to invest in new technology to achieve this goal.

More than half of the respondents said they are using leading technologies to strengthen their compliance function, and 93% said new technologies such as artificial intelligence and cloud make compliance easier by automating manual tasks, standardizing processes and making compliance more effective and efficient.

Assess the risk of your organization with the Tanium Risk Assessment. Your customized risk report will include your risk score, proposed implementation plan, how you compare to industry peers, and more.

Risk Management

Merger and acquisition (M&A) activity hit a record high of more than $5 trillion in global volume in 2021. While the market has certainly slowed this year, it remains on par with pre-pandemic levels, quite a feat at a time of business uncertainty and inflation. But when it comes to corporate deal-making, risk lurks around every corner. The potential for overpaying, miscalculating synergies and missing serious deficiencies in a target company is high.

With so much at stake, information is power. But while plenty of attention goes to gathering financials, reviewing contracts, picking through insurance details and more, insight into IT risk can be harder to come by. Acquiring organizations need a rapid, accurate way to assess and map all of the endpoint assets in a target company, and then work quickly post-completion to assess and manage cyber risk.

The need for visibility

M&A deal volume may have fallen 12% year on year in early 2022, but the market remains bullish, driven by cash-rich private equity firms that are sitting on trillions of dollars, according to McKinsey. Still, security and IT operations are a growing concern for those with money to spend. It’s extremely rare for both sides of a deal to have similar standards for cybersecurity, asset management and key IT policies. That disconnect can cause major problems down the road.

Due diligence is therefore a critical step, enabling acquiring firms to spot potential cost savings and synergies while also understanding how risky a purchase may be. It benefits both sides. If an acquirer is unable to gain assurances around risk levels, it can call off the deal or lower the offer price. Should it press on regardless, the organization may run into significant unforeseen problems trying to merge IT systems. Or it might unwittingly take on risk that erodes deal value over time, such as an undiscovered security breach that leads to customer class action suits, regulatory fines and reputational damage.

These concerns are far from theoretical. After the discovery of historical data breaches at Yahoo, Verizon adjusted its purchase price for the internet pioneer down by $350m, or around 7% of the deal size, back in 2017. Marriott International was not so lucky when it bought hotel giant Starwood. It wasn’t until September 2018, two years after the acquisition and four years after the initial security breach, that an unauthorized intrusion was finally discovered. The breach turned out to be one of the biggest to date, impacting over 380 million customers, and led to an £18.4m ($21m) fine from the UK’s data protection regulator.

Getting due diligence right

In an ideal world, CIOs would be involved in M&A activity from the very start, asking the right questions and counseling the CEO and senior leadership team on whether to proceed with a target. The truth is that this isn’t always the case. Such is the secrecy of deal-making that negotiations are usually limited to a small handful of executives, leaving technology leaders on the outside.

The best way CIOs can rectify this is to proactively educate senior executives about the importance of information security due diligence during M&A. If they succeed in embedding a security-by-design culture at the very top of the organization, those executives should be able to ask the right questions of target companies and judge their risk exposure early on. They may even be inclined to invite the CIO in to help.

For most organizations, however, the first critical point at which due diligence can be applied is after an acquisition has been announced. This is where the acquiring company must gather as much information as possible to understand risk levels and opportunities for cost reduction and efficiencies. SOC 2 compliance at the target would make things run much more smoothly, providing useful insight into its level of security maturity. But more likely than not, the acquiring company’s CIO will need to rely on their own processes.

Visibility is everything. CIOs need accurate, current data on every single endpoint in the corporate environment, plus granular detail on what software is running on each asset and where there are unpatched vulnerabilities and misconfigurations. That’s easier said than done: most current tools on the market struggle to answer these questions across the virtual machines, containers, cloud servers, home-working laptops and office-based equipment that run the modern enterprise. Even those that can provide full coverage may take days or weeks to deliver results, by which time the information is out of date.
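
To illustrate the data model this kind of visibility implies, here is a short Python sketch. The Endpoint record, its fields and the seven-day staleness threshold are illustrative assumptions, not a real product’s API; the point is that due diligence needs two lists out of the target’s estate: endpoints whose inventory data is too old to trust, and endpoints carrying known unpatched vulnerabilities.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class Endpoint:
    """Hypothetical inventory record for one managed asset."""
    name: str
    last_seen: datetime                      # when the asset last reported in
    unpatched_cves: list = field(default_factory=list)

# Assumption: inventory data older than a week is too stale to trust.
STALE_AFTER = timedelta(days=7)

def triage(endpoints):
    """Split the estate into stale-data and known-vulnerable endpoints."""
    now = datetime.now(timezone.utc)
    stale = [e.name for e in endpoints if now - e.last_seen > STALE_AFTER]
    vulnerable = [e.name for e in endpoints if e.unpatched_cves]
    return stale, vulnerable

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    estate = [
        Endpoint("web-vm-01", last_seen=now),
        Endpoint("sales-laptop-07", last_seen=now - timedelta(days=30),
                 unpatched_cves=["CVE-2021-44228"]),
    ]
    stale, vulnerable = triage(estate)
    print("Stale inventory:", stale)        # ['sales-laptop-07']
    print("Known vulnerable:", vulnerable)  # ['sales-laptop-07']
```

Anything on the stale list is, in effect, unknown risk; anything on the vulnerable list is quantifiable risk that can feed directly into deal negotiations.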

Managing post-deal risk

The second opportunity for the CIO is once contracts are signed. Now it’s time to use a unified endpoint management platform to deliver a fast, accurate risk assessment of the acquired company’s IT environment. By inventorying all hardware and software assets, they can develop a machine and license consolidation strategy, eliminating redundant or duplicated software. The same tools should also enable CIOs to distribute new applications to the acquired company, scan for unmanaged endpoints, find and remediate any problems, and enhance IT hygiene across the board.
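
The consolidation step is, at its core, a set intersection over the two companies’ software inventories. A minimal sketch, assuming each inventory is simply a mapping of software title to licensed seat count (the titles and numbers below are invented):

```python
from collections import Counter

def consolidation_candidates(acquirer, target):
    """Titles both companies license, with combined seat counts."""
    overlap = acquirer.keys() & target.keys()
    return {title: acquirer[title] + target[title] for title in sorted(overlap)}

# Hypothetical per-company inventories: software title -> licensed seats.
acquirer = Counter({"CRM Suite": 1200, "VPN Client": 900, "BI Tool": 150})
target = Counter({"CRM Suite": 400, "Design App": 75, "BI Tool": 60})

for title, seats in consolidation_candidates(acquirer, target).items():
    print(f"{title}: {seats} combined seats -> single-contract candidate")
```

Each overlapping title is a candidate for renegotiating a single contract at the combined volume, usually at a better per-seat price, while non-overlapping titles go on the rationalization list.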

M&A is a high-risk, high-pressure world. By prioritizing endpoint visibility and control at every stage of a deal, organizations stand the best chance of preserving business value, reducing cyber risk and optimizing ROI.

Learn more about how Tanium can help manage risk and increase business value during mergers and acquisitions.

Risk Management