Antonio Taylor landed his first IT job in 1999, having decided to leave his pre-law studies at college and get into tech instead.

He earned a Novell certification, believing it was a quick, effective way to get into a well-paying field with growth potential. Plus, he liked technology, saying, “Computers were always easy to me.”

Taylor’s gambit paid off: He has worked in IT ever since, earning a dozen or so certifications and multiple promotions during his career.

Now a hiring manager himself, Taylor still believes candidates don’t need a four-year degree to enter and advance in the IT profession. He says he has removed “degree required” and even “degree preferred” from many job postings, noting that IT pros can — and do — often develop the needed skills through certification programs, bootcamps, and even self-directed studies.

“A degree may not mean you have the experience and expertise needed for the tech job,” he says, adding he looks for candidates who can demonstrate they have the technical capabilities required for the positions being filled. “My questions in the interview are strictly around their abilities to do their job.”

Taylor is among a growing number of managers and executives dropping degree requirements from job descriptions.

Figures from the 2022 study The Emerging Degree Reset from The Burning Glass Institute quantify the trend, reporting that 46% of middle-skill and 31% of high-skill occupations experienced material degree resets between 2017 and 2019.

Moreover, researchers calculated that 63% of those changes appear to be “‘structural resets’ representing a measured and potentially permanent shift in hiring practices” that could make an additional 1.4 million jobs open to workers without college degrees over the next five years.

Despite such statistics and testimony from Taylor and other IT leaders, the debate around whether a college education is needed in IT isn’t settled. Some say there’s no need for degrees; others say degrees are still preferred or required.

Bob Dutile, a former CIO now serving as chief commercial officer at digital transformation solutions company UST, sums it up: “There are some who still prefer a college degree, seeing it — rightly or wrongly — as shorthand for showing that candidates have conscientiousness and the capacity to learn. But we and others have not found that it’s necessary.”

The argument for dropping degree requirements

CIOs, other tech leaders, and hiring managers from multiple organizations and industries say a good proportion of IT positions require competency in specific skills and not the academic breadth provided by a baccalaureate.

They point to help desk, programmer, developer, designer, engineer, architect, analyst, and even some management positions as roles that rely on technical skills, not a degree.

That’s not to say those positions don’t need people who can communicate effectively or think critically, they add. But they believe those skills as well as the needed technical competencies can be developed in myriad ways.

Moreover, they say the move to skills-based hiring brings key benefits to both organizations and workers. For the company, a skills-based hiring approach increases the number of qualified candidates applying and boosts the gender, racial, ethnic, and economic diversity of the applicant pool. Meanwhile, workers gain access to opportunities they're qualified for but would otherwise be shut out of, giving them not just a job but a broader career path.

Broader talent pool

IBM is among the companies whose leaders have moved away from degree requirements; Big Blue is also one of the earliest, largest, and most prominent proponents of the move, introducing the term “new collar jobs” for the growing number of positions that require specific skills but not a bachelor’s degree.

Kelli Jordan, vice president of IBMer Growth and Development, says IBM has long taken that approach but has been more deliberate about the strategy since 2016, when the company started examining job descriptions and removing degree requirements deemed unnecessary.

“We really focus on skills,” Jordan explains, adding that company leaders recognize that professionals can build skills through certifications, massive open online courses (MOOCs) and other such avenues.

Now more than half of the job openings posted by IBM no longer require degrees, Jordan says. And company managers and human resources teams continue to review position requirements, so an increasing number of jobs are falling into that no-degrees-needed category.

The hiring of people without degrees has increased 35% since IBM started removing the four-year degree requirements.

Jordan says IBM’s approach helps the company compete for talent, a particularly important benefit while unemployment hovers around 4% and unemployment for technical occupations remains around 2%. She points out that by requiring a bachelor’s degree, companies shut out the 62% of Americans who don’t have that credential.

“You’re really limiting the opportunities for a large pool of people,” she adds.

Skills matter more than academics

Mike Calvo, CTO of Shipt, which operates an app-based shopping and delivery service, also brings that philosophy to hiring; he says the company has focused on hiring for skills since it was founded in 2014.

He says Shipt asks for either a four-year degree or relevant experience for most technology team roles. To ensure they get the right talent, Calvo says hiring managers “get pretty specific with skills, years of experience, technical capabilities, and experiences working in an environment similar to ours.”

Calvo, who oversees internal IT operations, says they’re looking for people who like to solve problems and are curious.

“That’s all way more important to us than where you got a degree, or if you have a degree, and we’ve hired a good number of people who have qualified from a knowledge perspective,” he says. “[College] has become so unimportant a factor in someone’s performance that people don’t talk about it; they don’t look at it. I can’t tell you the last time I looked at a candidate and said, ‘Oh, they have a degree.’”

The Birmingham, Ala.-based company partners with training programs to recruit professionals with the specific tech skills Shipt needs. For example, in 2021 it hired 25 graduates of a Pivot Technology School bootcamp and is onboarding another 17 from the school’s training program.

Real world versus theory

Others similarly stress that hands-on skills matter more than academics for many, if not most, technical positions.

Anant Adya, an executive vice president of cloud services company Infosys Cobalt, says he looks at attitude, testing candidates’ ability and willingness to learn when making hires. Adya’s company often recruits from community colleges and certification programs that focus on giving participants hands-on training in skills that align with existing market needs.

That’s not always the case with graduates from four-year academic programs, Adya says, adding that “we have found there was a gap between what’s required in the real world and what’s taught in [four-year] college programs.”

Tailoring talent to your needs

Proponents of skills-based hiring say the approach works best when it’s part of a comprehensive talent strategy — one that has a heavy emphasis on ongoing training and upskilling.

That’s what plays out at Thoughtworks.

Thoughtworks North America CEO Chris Murphy says the company has a long history of “hiring non-traditional talent for tech roles” including coding bootcamp graduates as well as people who have learned coding on the job or in their own time.

In 2005 the company started Thoughtworks University (TWU), a one-year training program designed to ensure such hires can succeed at the company.

The case for degrees

Not all are convinced that dropping degree requirements is the way to go, however.

Jane Zhu, CIO and senior vice president at Veritas Technologies, says she sees value in degrees, value that isn’t always replicated through other channels.

“Though we don’t necessarily require degrees for all IT roles here at Veritas, I believe that they do help candidates demonstrate a level of formal education and commitment to the field and provide a foundation in fundamental concepts and theories of IT-related fields that may not be easily gained through self-study or on-the-job training,” she says. “Through college education, candidates have usually acquired basic technical knowledge, problem-solving skills, the ability to collaborate with others, and ownership and accountability. They also often gain an understanding of the business and social impacts of their actions.”

Intangibles matter

Zhu, who has a bachelor’s and master’s in computer science and a doctorate in operations research, says her own academic achievements “have played a huge role in my success.”

She adds: “The knowledge I acquired from my degrees equips me with strong technical and problem-solving skills; enables me to easily internalize business strategies, initiatives, and challenges and to have productive architecture-level discussions with IT staff; and allows me to make sound business decisions faster.”

Vision and commitment

Josh Lazar, who recently served for three years as CIO for Florida’s 18th Judicial Circuit, has a similar take.

“A four-year degree shows that a candidate can make a commitment to a long-term goal and achieve it. Further, the training you get within that setting can be more demanding and a person can gain expertise in a particular area,” says Lazar, who left his CIO role in February and is now CEO of TechThinkTank.

Leadership and learning skills

Others share that perspective, saying a bachelor’s degree demonstrates that candidates can think critically, handle complex problems, and persuasively communicate ideas; that they have studied a range of topics that help them manage and lead; and that they’ve fine-tuned their learning skills so they can more easily develop and upskill throughout their careers.

Of course, not all those with a baccalaureate possess such qualities — something proponents of requiring or preferring degrees acknowledge.

However, they say someone with at least a bachelor’s is likely to have some or all of them — and that is reassuring.

“The CIO is placing a bet with everyone they hire, and they want a sure bet. And there’s still a sense of comfort when you have someone with a degree from a top-notch school or with an MBA,” says Mark Taylor, CEO of the Society for Information Management (SIM).

Not either/or but both

Dutile, from UST, says his company employs almost 30,000 engineers, most of whom work on software, with some 60% deployed to the IT departments of the company’s Global 1000 clients. Nearly all of those clients, he says, want those workers to have a bachelor’s degree.

However, two of those client companies do not have that requirement. “And their experience has been great,” he says.

That fits with evolving marketplace trends, where there’s more openness to skills-based hiring for many technical roles but a desire for a bachelor’s degree for certain positions, including leadership.

In fact, research shows that nearly all CIOs have college degrees. Online job site Zippia analyzed multiple sources and concluded that in the US 67% of CIOs have a bachelor’s, 20% have a master’s, and 2% have a doctorate. Only 8% have an associate’s degree, and 3% have what Zippia identified as “other degrees.”

Antonio Taylor’s own experience mirrors the ongoing discussion about whether and when a degree is needed in IT.

He believes certifications, bootcamps, and other such programs as well as work experience can give IT professionals the skills they need to succeed and advance in technical positions. But he also says, “Degrees do matter when looking for certain leadership roles in IT.”

Taylor returned to college, earning a bachelor’s degree in IT administration and management in 2017 and an MBA in 2018. He has held several management roles since earning his degrees and received his latest promotion on March 1, moving from a director position to vice president of infrastructure, security, and services at Transnetyx.


Italian insurer Reale Group found itself with four cloud providers running around 15% of its workloads, and no clear strategy to manage them. “It was not a result we were seeking, it was the result of reality,” said Marco Barioni, CEO of Reale ITES, the company’s internal IT engineering services unit.

Since then, Barioni has taken control of the situation, putting into action a multi-year plan to move over half of Reale Group’s core applications and services to just two public clouds in a quest for cost optimization and innovation.

Multicloud environments like Reale Group’s are already the norm for 98% of infrastructure-as-a-service or platform-as-a-service users — although not all of them are taking control of their situation the same way Barioni is.

That’s according to a new study of enterprise cloud usage by 451 Research, which also looked at what enterprises are running across multiple public clouds, and how they measure strategy success.

Two-thirds of those surveyed use services from two or three public cloud providers, while 31% are customers of four or more cloud providers. Only 2% use a single cloud provider.

Those enterprises’ cloud environments became even more complex when taking into account their use of software-as-a-service offerings. Half of those surveyed used two to four SaaS providers, one-third used five to nine providers, and one-eighth used 10 or more. Only 4% said they used a single SaaS solution, no mean feat given the prevalence of Salesforce, Zoom, and online productivity suites such as Microsoft 365 or Google Workspace.

The study, commissioned by Oracle, looked at the activities of 1,500 enterprises around the world using IaaS or PaaS offerings, or planning to do so within the next six months. The research was conducted between July and September 2022.

Three years on from the first COVID-19 lockdowns, it’s clear the pandemic was a significant driver of multicloud adoption for 91% of those surveyed. But now that the immediate necessity of the switch to remote operations and remote management has passed, enterprises are seeking other benefits as they build their multicloud environments.

Why build a multicloud infrastructure?

The two most frequently cited motivations for using multiple cloud providers were data sovereignty or locality (cited by 41% of respondents) and cost optimization (40%). Enterprises in financial services, insurance, and healthcare were most concerned about where their data is stored, while cost was the biggest factor for those in real estate, manufacturing, energy, and technology.

Next came three related concerns: business agility and innovation (30%); best-of-breed cloud services and applications (25%); and cloud vendor lock-in concerns (25%). Going with a single cloud provider could prevent enterprises from accessing new technology capabilities (such as the much-hyped ChatGPT, which Microsoft is using to draw customers to its Azure cloud services), leave them with a second-best service from a cloud provider less invested in a given technology, or allow the provider to hold them hostage and raise prices.

Traditional benefits of duplicating IT infrastructure were least important, with greater resiliency or performance cited by 23% of respondents, and redundancy or disaster recovery capabilities by just 21%.

But there are still many factors holding back multicloud adoption in the enterprise. Cloud provider management was the most frequently cited (by 34% of respondents), followed by interconnectivity (30%). It was a tie for third place, with data governance issues, workload and data portability, regulatory compliance, and ensuring security across public clouds all cited by 24%.

“The degree to which benefits outweigh challenges may depend on whether multicloud is part of a broader IT transformation strategy … or the extent to which it addresses particular cost, organizational or governance concerns,” wrote Melanie Posey, author of the study. Simply having multiple public cloud environments to meet different users’ needs may be good enough for risk mitigation and cost arbitrage for some enterprises, she wrote, while others will want integrated environments in which workloads and data can run across multiple public clouds.

Reality bytes

Reale Group is still straddling those two states as IT leader Barioni moves the company from relationships with four hyperscalers that just happened toward a greater reliance on two that he chose.

His choice of clouds — Oracle’s OCI and Microsoft’s Azure — was constrained by Reale’s reliance on Oracle’s Exadata platform. “Our core applications all run on Oracle databases,” he said.

While several cloud providers offered the packaged services for machine learning and advanced process management he was looking for, the choice of Microsoft to host the remaining business applications came down to latency, he said. Oracle and Microsoft have closely integrated their infrastructure in the regions most important to Reale, allowing the company to build high-speed interconnects between applications running in each cloud. Reale will move its first integrated applications to the cloud in March 2023, he said.

Multicloud management

Johnson Controls is further along in its multicloud journey. It makes control systems for managing industrial processes and smart buildings, some of which can be managed from the cloud-based OpenBlue Platform run by CTO Vijay Sankaran. He said that, while the company has a primary cloud provider, it has chosen to architect its platform to operate across multiple clouds so it can meet its customers where they are.

That multicloud move has meant extra work: connecting everything to a common observability platform, and ensuring all security events feed into a single, integrated virtual security operations center so that the various clouds can be monitored from a single pane of glass, he said. While the overhead of adding more cloud providers is to be expected, the same problem exists even when dealing with a single hyperscaler, as different regional instances may have specific controls that need to be put in place, he added.
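The idea of funneling security events from several clouds into one view can be sketched in miniature: normalize each provider's events into a common schema, then merge them into a single ordered stream. The schema, field names, and severity labels below are illustrative assumptions for the sketch, not Johnson Controls' actual platform or any vendor's real API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical unified event schema; field names are illustrative only.
@dataclass
class UnifiedEvent:
    cloud: str         # e.g. "aws", "azure"
    severity: str      # normalized to "low" / "medium" / "high"
    timestamp: datetime
    message: str

# Each cloud reports severity on its own (assumed) scale; map them all
# onto one shared vocabulary so events are comparable in a single view.
SEVERITY_MAPS = {
    "aws":   {"LOW": "low", "MEDIUM": "medium", "HIGH": "high", "CRITICAL": "high"},
    "azure": {"Informational": "low", "Low": "low", "Medium": "medium", "High": "high"},
}

def normalize(cloud: str, raw: dict) -> UnifiedEvent:
    """Translate a provider-specific event dict into the unified schema."""
    return UnifiedEvent(
        cloud=cloud,
        # Unknown severities fall back to "medium" rather than being dropped.
        severity=SEVERITY_MAPS[cloud].get(raw["severity"], "medium"),
        timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        message=raw["message"],
    )

# Merge events from two clouds into one time-ordered stream for the SOC view.
events = [
    normalize("aws", {"severity": "CRITICAL", "ts": 1672531200,
                      "message": "Root login detected"}),
    normalize("azure", {"severity": "Medium", "ts": 1672531260,
                        "message": "Unusual sign-in location"}),
]
for e in sorted(events, key=lambda e: e.timestamp):
    print(f"[{e.cloud}] {e.severity}: {e.message}")
```

The normalization step is what makes the "single pane of glass" possible: once every provider's events share one schema, ordering, filtering, and alerting logic only has to be written once.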

The study also asked enterprises what key outcomes they expected from a multicloud management platform. Only 22% cited the single pane of glass that Sankaran relies on. The top responses were cloud cost optimization (33%), a common governance policy across clouds and integration with on-premises infrastructure (both 27%), improved visibility and analytics (26%), and integration with existing toolsets (25%).

Cost control

Whether an enterprise chooses to spread its workloads across more public clouds or concentrate them on fewer, it all seems to come back to managing cost.

Reale Group’s Barioni has a plan for that involving a core team with a mix of competencies: some technology infrastructure experts, and some with a deep knowledge of accounting. Developers tend to aim for the best technical solution, which is often not the most cost-efficient one, he said.

When applications run on premises, computing capacity — and therefore cost — is limited by what the data center can hold, whereas there are few limits on the computing capacity of the cloud — or its cost. Bringing together the technically minded and financially minded will help Barioni balance cost and performance in this new, unconstrained environment. “Every day, you have to take decisions on prioritizing your workloads and deciding how to optimize the computing power you have,” he said. “It’s a completely new mindset.”


IT operations management (ITOM) – a framework that gives IT teams the tools to centrally monitor and manage applications and infrastructure across multi-premises environments – has been the foundation of enterprise IT infrastructure and applications for the last 30 years. It is the backbone that ensures technology stacks operate optimally, delivering timely business value and keeping employees engaged and productive by maintaining the availability of core applications. But the recent acceleration of digital transformation across global industries and the emergence of multi-cloud environments have introduced a new level of complexity.

While flexibility, elasticity, and ease of use are benefits that make starting with the cloud an enticing prospect, ongoing operation and management can quickly become difficult to oversee. As the business scales across infrastructure and applications deployments in a multi-cloud environment, so does the complexity inherent in diverse cloud operating models and tools, and distributed application architecture and deployment. Many IT professionals also lack technical or management skills in these areas.

According to a recent VMware survey of tech executives and their priorities for digital momentum, 73% reported a push to standardize multi-cloud architectures. For most global enterprises, the transition to multi-cloud environments that bring together the best technology stacks and unlock business value will be a journey over the coming years. The key is how to empower cloud-focused or cloud-native technology teams to realize the full potential of their transformational investments in multi-cloud environments.

This prompts a key question: is ITOM still valid when it comes to managing enterprise technology stacks that are increasingly categorized as multi-cloud environments?

IT operators for years had their favorite tools to manage infrastructure, applications, databases, networks, and more to support the needs of their business. But with the growing migration of workloads to multi-cloud environments, IT pros are now scrambling among siloed operations tools and the cloud-specific tools provided by the key hyperscalers – AWS, Azure, and Google – for specific use cases. This tool sprawl is often exacerbated by enterprises’ desire to pick the best cloud platform for the problem at hand. For a .NET-based application migrating to the cloud, for example, Azure might be the better choice, while an AI/ML large data lake analysis could be best suited to run on Google Cloud.

This is where ITOM is evolving to provide a holistic, comprehensive view across the multiple disciplines of multi-cloud infrastructure and applications, integrating best practices from cost, operations, and automation management along with connected data. ITOM has been leveraged for decades in enterprise on-premises environments to bring disparate, federated, and integrated tools together to operate securely. The key to accelerating digital transformation in the multi-cloud consumption era is complete visibility of the environment, automation of mundane tasks, and proactive operations that use connected data from multiple domains in near real time to drive preventive and proactive insights. This is achievable with an integrated platform that brings the different domains of operations, cost, and automation onto a single integrated data platform.

There are ITOM incumbents trying to “cloudify” their current solutions by tucking AIOps into the mix and integrating with application performance management (APM) offerings. There are also new market disrupters joining the race to provide a single integrated solution that unifies multiple domains and data to deliver visibility, operations, cost optimization, and automation across multi-cloud environments. All these scenarios suggest ITOM will become even more relevant to cloud and modern IT operations management now and in the future – but to maximize its value, IT teams must move from cloud chaos to cloud-smart management.

There are three primary characteristics of a cloud-smart IT operations management solution enterprises should look for:

1. Platform and API-based solutions: Look for solutions that bring a set of common, integrated services together to monitor, observe, manage, optimize, and automate across infrastructure and apps. Solutions that take a platform and API-first approach let IT teams future-proof their investments against the continuous invention and reinvention of the enterprise’s technology stacks. These offerings also help teams connect legacy technology with more modern solutions as they progress along their digital transformation journeys while maintaining and growing the business.

2. Integrated data-driven operations: A good ITOM solution should provide data-driven intelligence across multiple data domains to inform proactive decisions, leveraging AIOps 2.0 principles. AIOps must take a data-driven, automation-first, self-service approach to free resources for value-based development and delivery instead of chasing reactive problems. Global digital businesses operate in multi-cloud environments, at the edge, and everywhere in between, alongside the people, processes, and things that provide contextual customer experiences. CloudOps will further supply rich, diverse data to turn contextual connected data into business insights. This can’t be managed from an old-school, event-based command center; instead, the solution must provide context across distributed, connected layers of technology by processing diverse volumes and varieties of data, observed through business KPIs, to drive actions to resolution. This enables modern digital businesses to constantly optimize and make informed decisions, eliminating mundane manual tasks to improve productivity and innovation.

3. Continuous consumption, agility, and control: We are moving from static on-premises environments to dynamic cloud environments that leverage ephemeral workloads. This is where the right tools drive automation of repetitive mundane tasks and enable governance and controls on cost, usage, and policy for ever-changing business needs that demand elastic resources, data-driven processes, dynamic configurations, and consumption pricing.

Multi-cloud, edge, and on-premises environments are here to stay, driving the digital transformation journey of the enterprise. However, the pendulum of workloads moving among those discrete environments will continue to swing as business, compliance, and governance requirements change. ITOM and ITOps approaches are more relevant than ever in multi-cloud hybrid environments with distributed, ephemeral workloads, just as they once were in on-premises environments. Still, it’s imperative that these operations management frameworks evolve with the changing needs of the business to ensure they can simplify complex distributed technology stacks and cumbersome manual processes.

The goal is to drive contextual, observable insights that lead to optimized usage and a consumption-based culture by connecting business users, technology developers, and operators while enhancing complete end-to-end visibility of technology architectures. Only then can an organization benefit from modern ITOM that enables continuous change, compliance, and optimization to support a vibrant global business and its customers.

