The construction industry was one of the first affected by Sweden’s recent economic deterioration, and housing construction has also slowed down over the past year.

“We notice the macroeconomic effects with both cost inflation and higher interest rates,” says Peab group CIO Klas Antoni. “That means we generally have an increased cost focus now and are keen that investments in IT are made in the right places and in the right way so we get results within a reasonable payoff time.”

This means, among other things, an increased focus on benefit realization, and that projects and various development initiatives are prioritized with more scrutiny.

“We’re more short-term oriented,” he says. “The business has to adapt more quickly because we have a little less patience, and we have to push out new things in a little less time.”

Difficult processes to digitize

Looking at the industry over the longer term, it has long been singled out for a low degree of digitization because of its multitude of players and processes that aren’t easy to digitize, according to Antoni.

“All industries have different conditions, but the challenge in construction is it needs to be industrialized to a greater degree to get away from projects where things are being done for the first time,” he says. “A higher degree of standardization and process orientation makes it easier to digitize, thereby increasing productivity.”

But that doesn’t mean the construction industry and Peab have stood still. On the contrary, a lot has happened and digitization is reaching further into the business.

“Today we have good digital tools that support us in many different parts of our production,” he says. “From advanced map systems and aerial scanning with drones, to very good tools that support our employees in their daily work both in planning and production.”

Structure the data

More machines and measuring points are also being connected to the network, and Peab sees a great deal of work to structure all data as a means to be more data-driven in the long term. To cope with this, the company’s data warehouse is in overdrive.

“We consolidate, wash, harmonize and normalize large amounts of data that come from our various systems,” says Antoni. “The data is then analyzed and presented in reports for different needs so it also creates value for the users.”
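A minimal sketch of what that consolidation step can look like in practice, assuming two source systems with different field names. The mappings and fields here are illustrative, not Peab’s actual schemas:

```python
from datetime import date

# Hypothetical field mappings for two source systems; names are
# illustrative only.
FIELD_MAPS = {
    "erp": {"proj_no": "project_id", "cost_sek": "cost", "dt": "reported"},
    "site_app": {"projectId": "project_id", "amount": "cost", "date": "reported"},
}

def harmonize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the common schema and normalize types."""
    mapped = {FIELD_MAPS[source][k]: v for k, v in record.items()
              if k in FIELD_MAPS[source]}
    mapped["cost"] = round(float(mapped["cost"]), 2)          # normalize numerics
    mapped["reported"] = date.fromisoformat(str(mapped["reported"]))
    mapped["source"] = source                                  # keep lineage
    return mapped

def consolidate(batches: dict) -> list:
    """Merge records from all systems, dropping exact duplicates ('washing')."""
    seen, out = set(), []
    for source, records in batches.items():
        for r in records:
            row = harmonize(r, source)
            key = (row["project_id"], row["reported"], row["cost"])
            if key not in seen:
                seen.add(key)
                out.append(row)
    return out
```

The same record reported by both systems collapses to one row in the common schema, which is what makes the downstream reports trustworthy.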

And although Peab has come a long way, there’s still a lot to do to create well-functioning infrastructure around the area. Not least, it’s a challenge to maintain good quality of all data over time.

Reporting requirements

In addition to being able to use data proactively in one’s own operations, there will also be increased demand for external reporting, requiring the construction industry in general to get its data in order.

“Against a background of new regulations, the construction industry also needs to take big steps to be able to report, for example, environmental and climate data in a good and uniform way,” Antoni says. “Then the data needs to be standardized and structured from the source all the way until it’s reported. That’s challenging because we have very long subcontractor chains. So we need to help each other in order to work uniformly and efficiently. Not cooperating on these issues will drive unnecessary costs for all of us, so the need for common standards is great.”

A generational change is also underway at Peab, where old legacy systems have been gradually replaced for a number of years.

Recently, a new salary system for the entire Swedish operation has been put into operation—an extra complex task in an industry with many collective agreements and some special forms of compensation that are unique to the industry. And old financial and follow-up systems for the entire contracting operation have been replaced.

A time and place for investment

Some investments must be made, but they shouldn’t be done all at once, and preferably not while being held to ransom. “Instead, own the timetables,” says Antoni. “There are many dependencies between different solutions, which make the whole thing even more challenging.”

Changes to things like payroll systems and financial systems are often largely driven by necessity, when they’re so old they’re no longer supported, or don’t work at all. This often means significantly higher costs for the new systems compared to the consistently low operating costs of the old ones, since solutions sold today are often delivered only as subscription-based cloud services.

Going toward the cloud

At the same time, all signs point to the cloud. Peab essentially has a cloud-first strategy but will continue to live in a hybrid environment for the foreseeable future, says Antoni, adding that development is moving toward the cloud regardless of the underlying strategy.

“We’re moving more toward cloud deliveries,” he says. “Suppliers of course see advantages based on how solutions are developed and operated, but it’s in their interest for financial reasons as well. In some cases, there are good reasons for us as a customer to have a solution on-prem, whether for operational reasons, business-critical data, or third-country issues. But it’s less often as a customer that you have that choice.”

Of course, replacing some decades-old systems doesn’t always mean they’re bad; often they’re simply approaching end of life.

Provide concrete support

At the same time, new systems don’t always add much functionality beyond replacing the old ones. Older systems need to be managed, but Antoni would rather focus on new digital support that really creates value for employees and customers.

“We want to get good support for the employees,” he says. “It shouldn’t only be about new systems for reporting time, but that they receive more concrete support for planning, project management and project follow-up. Those parts have lagged behind with us, and that probably applies to many construction companies, not just Peab.”

The company also announced last year that it would start using solutions from Dalux, with which entire projects can be managed from beginning to end.

“It’s a good example of a tool that’s about to be rolled out to our employees—something that really makes their jobs easier, creates order, and contributes to our working more efficiently,” he says.

Automation and AI

Peab already uses a lot of automation in its administrative processes, where threshold costs are relatively low. AI, of course, is something Antoni follows with great interest, but it still needs proper oversight in order to process data accurately and efficiently.

“We’ve previously done some evaluations of tools for predictive analyses, but we didn’t achieve the results we wanted due to the data quality we had to work with,” he says. The fight for competence in these relatively new areas is fierce, he adds, and it’s about both raising competence internally and recruiting over time.

Consultant dependency can be broken

There may be another movement in the market where consultants look for more secure jobs when the economy fluctuates—something favorable for companies like Peab that would like to have greater in-house competence in certain areas.

“In recent years, many companies have been forced to build up a consultant dependency they didn’t want,” he says. “And although it may be easier to hire today, the economy as it stands means we’re waiting to hire.”

For the construction industry in general, it’s also important to broaden the base for recruitment by breaking the prejudices that exist and getting more people to apply there.

“It’s a matter of survival for us,” says Antoni. “We need to get access to the entire supply of available labor, which means we need to get more women into our industry. We still have some way to go in our production, although I’m happy we have a relatively good balance in our IT operations.”


Economic instability and uncertainty are the leading causes of technology budget decreases, according to the IDG/Foundry 2022 annual State of the CIO survey. Despite a desire to cut budgets, data remains the key factor to a business succeeding, especially during economic uncertainty. According to the Harvard Business Review, data-driven companies have better financial performance, are more likely to survive, and are more innovative.[1]

So how do companies find this balance and create a cost-effective data stack that can deliver real value to their business? A new survey from Databricks, Fivetran, and Foundry of more than 400 senior IT decision-makers in data analytics/AI roles at large global companies finds that 96% of respondents report negative business effects due to integration challenges. However, many IT and business leaders are discovering that modernizing their data stack overcomes those integration hurdles, providing the basis for a unified and cost-effective data architecture.

Building a performant & cost-effective data stack 

The Databricks, Fivetran, and Foundry report points to four investment priorities for data leaders:

1. Automated data movement. A data pipeline is critical to the modern data infrastructure. Data pipelines ingest and move data from popular enterprise SaaS applications, and operational and analytic workloads to cloud-based destinations like data lakehouses. As the volume, variety and velocity of data grow, businesses need fully managed, secure and scalable data pipelines that can automatically adapt as schemas and APIs change while continuously delivering high-quality, fresh data. Modernizing analytic environments with an automated data movement solution reduces operational risk, ensures high performance, and simplifies ongoing management of data integration. 
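The core of such a pipeline can be sketched in a few lines: load only rows newer than a saved cursor, and widen the destination schema automatically when a new column appears instead of failing the run. This is a simplified illustration of the pattern, not any vendor’s actual implementation:

```python
# Minimal sketch of an automated incremental sync, assuming a source
# that exposes rows with an "updated_at" change timestamp.

def sync(source_rows, destination: dict, state: dict) -> None:
    """Ingest rows newer than the saved cursor, adapting the destination
    schema automatically when new columns appear (schema drift)."""
    cursor = state.get("cursor", "")
    for row in source_rows:
        if row["updated_at"] <= cursor:
            continue  # already loaded on a previous run
        # Schema drift: register any unseen columns instead of failing.
        for col in row:
            destination["schema"].setdefault(col, "string")
        destination["rows"].append(row)
        cursor = max(cursor, row["updated_at"])
    state["cursor"] = cursor  # persist progress for the next run
```

A managed pipeline service adds retries, security, and API-change handling on top, but the cursor-plus-schema-drift loop is the part that keeps data fresh without manual intervention.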

2. A single system of insight. A data lakehouse incorporates integration tools that automate ELT to enable data movement to a central location in near real time. By combining both structured and unstructured data and eliminating separate silos, a single system of insight like the data lakehouse enables data teams to handle all data types and workloads. This unified approach dramatically simplifies the data architecture and combines the best features of a data warehouse and a data lake. It enables improved data management, security, and governance in a single data architecture to increase efficiency and innovation. Finally, it supports all major data and AI workloads, making data more accessible for decision-making.

A unified data architecture results in a data-driven organization that gains BI, analytics, and AI/ML insights at speeds comparable to those of a data warehouse, an important differentiator for tomorrow’s winning companies.

3. Designed for AI/ML from the ground up. AI/ML is gaining momentum, as more than 80% of organizations are using or exploring the use of AI to stay competitive. “AI remains a foundational investment in digital transformation projects and programs,” says Carl W. Olofson, research vice president with IDC, who predicts worldwide AI spending will exceed $221B by 2025.[2] Despite that commitment, becoming a data-driven company fueled by BI analytics and AI insights is proving to be beyond the reach of many organizations that find themselves stymied by integration and complexity challenges. The data lakehouse solves this by providing a single solution for all major data workloads from streaming analytics to BI, data science, and AI. It empowers data science and machine learning teams to access, prepare and explore data at scale.

4. Solving the data quality issue. Data quality tools (59%) stand out as the most important technology to modernize the data stack, according to IT leaders in the survey. Why is data quality so important? Traditionally, business intelligence (BI) systems enabled queries of structured data in data warehouses for insights. Data lakes, meanwhile, contained unstructured data that was retained for the purposes of AI and machine learning (ML). However, maintaining siloed systems, or attempting to integrate them through complex workarounds, is difficult and costly. In a data lakehouse, metadata layers on top of open file formats increase data quality, while advances in query engines improve speed and performance. This serves the needs of both BI analytics and AI/ML workloads in order to assure the accuracy, reliability, relevance, completeness, and consistency of data.
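In practice, enforcing quality comes down to running rules against each incoming row and quarantining what fails. A minimal sketch of that pattern, with illustrative rule names and fields:

```python
# Rule-based data quality checks of the kind a metadata layer can
# enforce; the rules and field names here are illustrative only.

RULES = {
    "completeness": lambda r: all(r.get(f) not in (None, "")
                                  for f in ("id", "amount")),
    "validity": lambda r: isinstance(r.get("amount"), (int, float))
                          and r["amount"] >= 0,
}

def check_quality(rows):
    """Split rows into accepted and quarantined, tagging each
    quarantined row with the names of the rules it failed."""
    accepted, quarantined = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        (quarantined if failed else accepted).append((row, failed))
    return accepted, quarantined
```

Keeping the failed-rule names alongside quarantined rows is what makes the results actionable: teams can see whether a feed is incomplete, invalid, or both.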

According to the Databricks, Fivetran, and Foundry report, nearly two-thirds of IT leaders are using a data lakehouse, and more than four out of five say they’re likely to consider implementing one. At a moment when cost pressure is calling into question open-ended investments in data warehouses and data lakes, savvy IT leaders are responding as they place a high priority on modernizing their data stack. 

Download the full report to discover exclusive insights from IT leaders into their data pain points, how they plan to address them, and what roles they expect cloud and data lakehouses to play in their data stack modernization.

[1] https://mitsloan.mit.edu/ideas-made-to-matter/why-data-driven-customers-are-future-competitive-strategy

[2] Source: IDC’s Worldwide Artificial Intelligence Spending Guide, Feb V1 2022.


Chris Mills, Head of Customer Success, EMEA at Slack

The roles of the CTO and CIO have grown enormously in recent years, proving fundamental in facilitating the rapid shift from traditional working to hybrid working during the pandemic. But this was no short-term shift—the value of the CTO and CIO continues to rise.

The next challenge on the horizon? Ensuring workers everywhere remain aligned, efficient and productive despite the economic turbulence organisations are buckling in for. Now, the spotlight is on tech leaders to once again steward businesses through another technological revolution—one in which the digital headquarters (HQ) is key.

Technology revolution 2.0

The majority of businesses are now adept at hybrid working, with many establishing policies to meet the needs of the workforce. This does not mean, however, that setups are without a few wrinkles; urgent issues include duplicate tools, bloating costs and unoptimised processes.

The digital HQ solves these challenges by uniting teams, partners, and the tools they use in a single digital space—making how work gets done simpler, more pleasant and more efficient. In fact, teams that use Slack as their digital HQ are 49% more productive.

In the digital HQ, bottomless email inboxes are replaced with Slack channels, a way of organising conversations based on topic, project, or initiative. Employees also find information-sharing no longer tethered to inflexible meetings; instead, it happens in Slack Huddles or Clips, free-flowing real-time or asynchronous audio and video messages that mean, on average, teams have 36% fewer meetings.

Another area where the digital HQ really shines is its ability to automate tasks, with Slack users launching 1.7 million automated workflows a day. For an example of this in action, businesses should look at Vodafone.

Automating for efficiency

Vodafone first started using the digital HQ as a foundation for modern engineering practices, but now uses it to enhance its collaboration worldwide. This has created opportunities for efficiency, in particular for the DevOps team, where release requests had the potential to be more streamlined.

With the digital HQ, the team is able to simplify release requests and use Slack’s Workflow Builder to automate a complex process. Developers now add the details of a release to a simple form, which is then used to populate a dedicated Slack channel so that the wider team has a real-time view of what’s going on.
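The shape of that automation can be sketched as follows: a submitted form becomes a structured Slack message payload posted to a shared channel. The field names and channel here are hypothetical illustrations, not Vodafone’s actual workflow; the actual posting step (Slack’s `chat.postMessage` Web API call) is omitted:

```python
# Hypothetical sketch: turn a release-request form into a Slack
# Block Kit message payload for a dedicated channel.

def release_message(form: dict, channel: str = "#releases") -> dict:
    """Build a Block Kit payload summarizing a release request."""
    summary = (f"*Release request:* {form['service']} "
               f"v{form['version']} by {form['requester']}")
    return {
        "channel": channel,
        "blocks": [
            # Main summary line, rendered with Slack's mrkdwn formatting.
            {"type": "section",
             "text": {"type": "mrkdwn", "text": summary}},
            # Smaller context line with the planned release window.
            {"type": "context", "elements": [
                {"type": "mrkdwn", "text": f"Window: {form['window']}"}]},
        ],
    }
```

Because the payload is built from the form rather than typed by hand, every request lands in the channel in the same shape, which is what gives the wider team its real-time view.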

Through the digital HQ, Vodafone has developed an efficient way of dealing with release requests that can number over 100 a month. They remain productive and focussed on the work, not the admin, while other teams retain visibility over an integral part of the business.


A HQ for challenging times

The pandemic demonstrated that, even in challenging times, productive and efficient ways of working are possible through technology. Vodafone is living proof that our technology evolution didn’t stop there, with the digital HQ providing a new foundation for the future of work.

The macroeconomic challenges we face will surely pass, and something else will undoubtedly follow close behind. Yet the digital HQ is no one-trick pony. By supercharging not just productivity, efficiency and collaboration but resilience too, the digital HQ lets businesses prepare for the future, whatever it looks like.



Government organizations have the longest average buying cycle for technology when compared to other sectors, with procurement laws injecting complexity into the procurement process, according to a Gartner report.

The report, based on a survey of 1,120 executives, including 79 public sector employees, across the US, Canada, France, Germany, the UK, Australia and Singapore, showed that the average buying cycle for government entities is 22 months.

This is in contrast to at least 48% of all respondents saying that their buying cycle for technology averaged around six to seven months.

“Technology acquisition brings challenges to the public sector that do not commonly exist in other industries,” said Dean Lacheca, vice president and analyst at Gartner.   

“Each jurisdiction has its own procurement laws and policies, and within that, each agency or department can have its own interpretation of them. A failure to conform to the rules can have serious consequences, from unwanted publicity to personal risk of prosecution,” Lacheca added.

Some of the other reasons behind the delay include changes in scope, research and evaluation along with reaching an agreement around budgeting.

Many respondents also said that these delays occur before the beginning of the procurement process, with at least 74% of public sector respondents claiming that developing a business case for purchases takes a long time.

More than 76% said that scope changes requiring additional research and evaluation were another major factor in delays, Gartner said, adding that 75% of respondents listed reaching agreement on budgeting as a major cause of delays in the buying decision.

“While government buying cycles can be long, it is important to note that these time frames are not set,” said Lacheca.

“Initial planned timelines can be delayed as a result of a combination of both controllable and uncontrollable factors, especially when no external deadlines exist.”

Government procurement teams are large

A typical public sector buying team has 12 participants with varying levels of involvement in the process, Gartner said. Government C-level executives tend to be less involved in the technology buying process than their private sector counterparts, in order to avoid association with the process and any perception of political influence over the outcome.

This also makes government C-level executives less willing to defend the process if challenged by unsuccessful vendors or the media, the research firm said.

Further, the survey shows that public sector buying teams are more likely to be composed of lower-level operational staff, who act as subject matter experts providing recommendations to their C-suite.

At least 68% of public sector respondents claim that another reason for delay is their inability to obtain specific product or implementation requirements details from the provider, Gartner said.

The research firm adds that public sector organizations are significantly more likely to value references from existing clients than non-public sector buyers, partly because public sector organizations are rarely in direct competition and often share common challenges.
