Digital solutions and data analytics are changing the world of sports entertainment at a rapid clip. From how players train, to how teams make strategic decisions during games, to how venues operate and fans engage, sports organizations are turning to software engineers and data scientists to help transform the sport experience.

In Toronto, Maple Leaf Sports & Entertainment (MLSE), the largest sports entertainment organization in Canada and one of the largest in North America, is facing off with the future with a new digital solutions research and development program.

Created in conjunction with Amazon Web Services (AWS) and unveiled in January, SportsX is an incubator rooted in research, applied sciences, and product development, charged with creating innovative digital solutions that give teams a winning edge, deliver extraordinary fan experiences, and generate positive social and environmental impact.

“If we’re able to look into technologies that are going to impact the game, earlier, and we have people dedicated to those and not pulled away for other tasks that are more day-to-day, then we will stay ahead of the curve that’s about to impact us as an organization,” says Christian Magsisi, vice president of venue and digital technology at MLSE. “We want to continue to be at the forefront of the fan experience. And if we have a positive social and environmental impact on our community, then largely the first two things will also come to pass.”

Sports enters the analytics era

MLSE was founded by legendary hockey coach and businessman Conn Smythe in 1927 after he organized a group of investors to buy his hometown hockey franchise, the Toronto St. Patricks. The organization now owns the Toronto Maple Leafs (NHL), Toronto Raptors (NBA), Toronto FC (MLS), Toronto Argonauts (CFL), and their minor league and farm teams. It also owns Scotiabank Arena (home of the Maple Leafs and Raptors) and OVO Athletic Centre, and has investments in a number of other sports facilities.

The organization may have 96 years of history behind it, but Magsisi says digital technologies and analytics have been changing the business of sports to an astonishing degree in just the past few years. Magsisi joined the organization five years ago, and it has changed considerably in that time. It now employs data engineers and data scientists, and is investing in cutting-edge technologies like quantum computing.

“In the early years here, it felt like we were a startup within MLSE because we didn’t operate, look, act like the rest of the organization,” he says. “We didn’t have software engineers, we didn’t have developers here at MLSE prior to us creating MLSE Digital Labs. Now we have a full-scale R&D program. Those were never concepts or job descriptions that were getting posted from MLSE. In a lot of ways, we felt like outsiders within our own organization, but we knew that was going to be the case, that we could usher in this new culture and organization.”

In the past several years, the amount of real-time data available to the organization has increased tremendously. Soccer, football, and basketball are all making use of computer vision for player and ball tracking that can be used to enhance the fan experience and provide actionable insights to coaches and players in-game. The NHL has gone a step further, embedding sensors in players’ sweaters and the puck itself.

“Getting live, real-time data that is actionable, that can provide insight to how we’re trying to execute our game strategy for that day, for that game, is more readily available to us now,” Magsisi says. “With hockey, we finally have tracking of the puck and the player, the XYZ coordinates of the players and the pucks. With that, there’s an almost infinite possibility of calculations that you can do in hockey that wasn’t available less than 18 months ago.”
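To make that kind of calculation concrete, here is a minimal sketch, assuming a simple list of timestamped rink-plane coordinates rather than the NHL's actual data format, that derives a player's average skating speed from tracking samples:

```python
from math import hypot

def skating_speed(track):
    """Average speed (m/s) over a list of (t, x, y) samples.

    `track` holds timestamped rink coordinates in seconds and metres.
    This is a simplified stand-in: the real NHL puck-and-player feed
    is 3D and uses its own schema.
    """
    distance = sum(
        hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(track, track[1:])
    )
    elapsed = track[-1][0] - track[0][0]
    return distance / elapsed if elapsed else 0.0

# A player covering 10 m along the boards in 2 seconds:
samples = [(0.0, 0.0, 0.0), (1.0, 5.0, 0.0), (2.0, 10.0, 0.0)]
print(skating_speed(samples))  # 5.0 m/s
```

The same sliding-window idea extends to acceleration, shot velocity, or zone-entry timing once the coordinate stream is available.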

This analytics advantage in hockey has yet to be fully realized in MLSE’s other major sports, he adds. “In soccer, football, and basketball, it’s computer vision. Latency has gotten a lot better over time, but the data is still challenging to go through.”

That said, advances in the field of biomechanics related to computer vision have Magsisi excited. Computer vision can currently be used to track the position of players and the ball, but new advances will enable computer vision to track the position of players’ limbs. For example, Magsisi says, the organization could track the trajectory of a ball as it’s released from a basketball player’s hands.
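As a toy illustration of the biomechanics angle, a release point and velocity recovered by limb tracking could feed a simple trajectory model. The function below is textbook projectile physics under idealized assumptions (no spin, no drag), not a production shot-analysis pipeline:

```python
from math import sin, radians

def shot_apex(release_speed, release_angle_deg, release_height):
    """Peak height (m) of a shot under ideal projectile motion.

    Inputs are the ball's speed (m/s), launch angle (degrees), and
    release height (m) as a computer vision system might estimate them.
    """
    g = 9.81  # gravitational acceleration, m/s^2
    vy = release_speed * sin(radians(release_angle_deg))
    # Vertical velocity is zero at the apex: h = h0 + vy^2 / (2g)
    return release_height + vy ** 2 / (2 * g)

# A jump shot released at 2.3 m, 7 m/s, 50 degrees:
print(round(shot_apex(7.0, 50.0, 2.3), 2))  # 3.77
```

Real systems would fit the observed arc rather than assume one, but the sketch shows why release data alone is analytically valuable.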

Betting big on the future

The idea behind SportsX is to capture, analyze, and build out the best ideas from key MLSE stakeholders, whether coaches, fans, partners, or employees, and the organization has built a dedicated SportsX web portal to support the effort. The solutions will support how teams play, how players stay healthy, how fans connect with teams and each other, and how franchises operate internally.

One of the first concepts piloted by the program was the NHL Extended Reality Stats Overlay, which uses extended reality to deliver broadcast and video game capabilities to people watching games in person. Another concept is the Immersive Basketball Experience, which uses optical data to provide fans with a life-size augmented reality experience.

SportsX is leveraging a portfolio of cloud services from AWS, including artificial intelligence (AI), machine learning (ML), and deep learning cloud services.

All this has required MLSE to build one of the largest technology engineering teams in all of sports.

“That’s not by mistake or without intention,” Magsisi says. “We know that the most successful organizations in the world are investing billions of dollars in R&D. It’s no mistake that these are the same organizations that are constantly coming out with new features and products and stay on top of the revenue line.”

Magsisi says that part of the secret sauce for MLSE has been a commitment to staying forward-looking.

“In the early years [of MLSE Digital Labs], we took 30% of our budget and our resources and focused on projects that were not going to impact our business within that season,” Magsisi says. “That was a large move. That was a big change for the organization because we are a seasonal business and our opportunity to generate revenue is limited in a window. A lot of our focus is three-quarters of the year. For us to take resources that would have been responsible for delivering revenue within that three-quarters and dedicating it to the future is a big risk.”

But that risk has come with a commensurate reward. It’s become a statement by the organization about its priorities. The organization can’t ignore tactical improvements — investing in data availability, reporting, dashboards, and the like — but dedicating staff and resources to examining the business and thinking about where it could be in several years has paid dividends in agility.

“We still have to invest in today; we still have to deliver today,” Magsisi says. “But I think the shift to be able to invest in the future allows you to take a look at your business and ask, ‘Where can we help our organization,’ whether it’s our restaurants, food and beverage teams, or retail team.”

Over the past five years, Magsisi says the organization has launched well over 50 digital products. It’s gone from quarterly or even biannual releases to daily releases.

“Our software engineering development teams and analytic teams now have the ability to make a change and deploy it right into production, whether it’s for a coaching staff, for our players, or fans,” he says. “Those things were long processes in the past with a lot of levels of approval.”

Data Management, Digital Transformation, Media and Entertainment Industry

Supply chain disruptions have impacted businesses across all industries this year. To help ease the transport portion of that equation, Danish shipping giant Maersk is undertaking a transformation that provides a prime example of the power of computing at the edge.

Gavin Laybourne, global CIO of Maersk’s APM Terminals business, is embracing cutting-edge technologies to accelerate and fortify the global supply chain, working with technology giants to implement edge computing, private 5G networks, and thousands of IoT devices at its terminals to elevate the efficiency, quality, and visibility of the container ships Maersk uses to transport cargo across the oceans.

Laybourne, who is based in The Hague, Netherlands, oversees 67 terminals, which collectively handle roughly 15 million containers shipped from thousands of ports. He joined Maersk three years ago from the oil and gas industry and since then has been overseeing public and private clouds, applying data analytics to all processes, and preparing for what he calls the next-generation “smartport” based on a switch to edge computing in real-time processing.

“Edge provides processing of real-time computation — computer vision and real-time computation of algorithms for decision making,” Laybourne says. “I send data back to the cloud where I can afford a 5-10 millisecond delay of processing.”

Bringing computing power to the edge enables data to be analyzed in near real-time — a necessity in the supply chain — and that is not possible with the cloud alone, he says.
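The split Laybourne describes (real-time work at the edge, delay-tolerant work in the cloud) can be caricatured as a latency-budget routing rule. The round-trip figures below are hypothetical placeholders, not Maersk's measured numbers:

```python
def route_workload(latency_budget_ms, edge_rtt_ms=2, cloud_rtt_ms=8):
    """Decide where a processing step should run given its latency budget.

    The default round-trip times are illustrative: the article quotes a
    5-10 ms cloud delay; a real deployment would measure these per site.
    """
    if latency_budget_ms < cloud_rtt_ms:
        return "edge"   # real-time control loops, computer vision
    return "cloud"      # analytics where a short delay is acceptable

print(route_workload(3))   # crane control -> "edge"
print(route_workload(50))  # reporting     -> "cloud"
```

In practice the decision also weighs bandwidth cost and resilience to connectivity loss, but the latency budget is the deciding factor Laybourne highlights.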

Laybourne has been working closely with Microsoft on the evolving edge infrastructure, which will be key in many industries requiring fast access to data, such as industrial and manufacturing sectors. Some in his company focus on moving the containers. Laybourne is one who moves the electrons.

Digitizing the port of the future

Maersk’s move to edge computing follows a major cloud migration performed just a few years ago. Most enterprises that shift to the cloud are likely to stay there, but Laybourne predicts many industrial conglomerates and manufacturers will follow Maersk to the edge.

“Two to three years ago, we put everything on the cloud, but what we’re doing now is different,” Laybourne says. “The cloud, for me, is not the North Star. We must have the edge. We need real-time instruction sets for machines [container handling equipment at container terminals in ports] and then we’ll use cloud technologies where the data is not time-sensitive.”

Laybourne’s IT team is working with Microsoft to move cloud data to the edge, where containers are removed from ships by automated cranes and transferred to predefined locations in the port. To date, Laybourne and his team have migrated about 40% of APM Terminals’ cloud data to the edge, with a target to hit 80% by the end of 2023 at all operated terminals.

As Laybourne sees it, the move positions Maersk to capitalize on a forthcoming sea change for the global supply chain, one that will be fueled by enhanced data analytics, improved connectivity via 5G/6G private networks, satellite connectivity, and industry standards that enable interoperability between ports. To date, Maersk controls about 19% of the overall capacity in its market.

As part of Maersk’s edge infrastructure, container contents can be examined by myriad IoT sensors immediately upon arrival at the terminals. RFIDs can also be checked in promptly and entered into the manifest before being moved robotically to their temporary locations. In some terminals, such operations are still performed by people, with cargo recorded on paper and data not accessible in the cloud for hours or longer, Laybourne says.

Cybersecurity, of course, is another major initiative for Maersk, as is data interoperability. Laybourne represents the company on the Digital Container Shipping Association committee, which is creating interoperability standards “because our customers don’t want to deal with paper. They want to have a digital experience,” he says.

The work to digitize is well under way. Maersk uses real-time digital tools such as Track & Trace and Container Status Notifications, APIs, and Terminal Alerts to keep customers informed about cargo. Automated cranes and robotics have removed most of the dangerous, manual work done in the past, and have improved the company’s sustainability and decarbonization efforts, Laybourne notes.

“Robotic automation has been in play in this industry for many years,” he says, adding that the pandemic has shifted the mindset of business-as-usual to upskilling laborers and making the supply chain far more efficient.

“We have automated assets such as cranes and berth and then there’s [the challenge of] how to make them more autonomous. After the pandemic, customers are now starting to reconfigure their supply chains,” he says, adding that autonomous, next-generation robotics is a key goal. “If you think of the energy crisis, the Ukraine situation, inflation, and more, companies are coming to a new view of business continuity and future sustainability compliance.”

Top vendors such as Microsoft and Amazon are looking at edge computing use cases for all industries, not just transport and logistics. According to IDC, more than 50% of new IT infrastructure will be deployed at the edge in 2023.

Gartner calls implementations like Maersk’s the “cloud-out edge” model. “It is not as much about moving from the cloud to edge as it is about bringing the cloud capabilities closer to the end users,” says Sid Nag, vice president and analyst at Gartner. “This also allows for a much more pervasive and distributed model.”

Next-gen connectivity and AI on deck

Aside from its partnership with Microsoft on edge computing, Maersk is collaborating with Nokia and Verizon on building private 5G networks at its terminals and recently demonstrated a blueprint of its plans at the Verizon Innovation Center in Boston. The ongoing work is among the first steps toward a breakthrough in connectivity and security, Laybourne maintains.

“It’s technology that opens up a lot more in terms of its connectivity, and in some of our terminals, where we have mission-critical systems platforms, the latency that 5G can offer is fantastic,” he says, noting that it will allow the cargo to “call home” data every 10 milliseconds as opposed to weeks. “But the real breakthrough on 5G and LTE is that I can secure my own spectrum. I own that port — nobody else. That’s the real breakthrough.”

Gartner’s Nag agrees that private 5G and edge computing provide meaningful synergies. “Private 5G can guarantee high-speed connectivity and low latencies needed in industries where use cases usually involve the deployment of hundreds of IoT devices, which then in turn require interconnectivity between each other,” Nag says.

For Maersk, installing IoT sensors and devices is also revolutionizing terminal operations. In the past, the cargo in containers had to be inspected and recorded on paper. Looking forward, Laybourne says, the process will all be automated and data will be digitized quickly.

His data science team, for example, has written algorithms for computer vision devices that are installed within the container to get around-the-clock electronic eyes on the cargo and identify and possibly prevent damage or spoilage.

Edge computing with IoT sensors that incorporate computer vision and AI will also give customers what they’ve long wanted, and most pointedly during the pandemic: almost instant access to cargo data upon arrival, as well as automated repairs or fixes.

“It can then decide whether there’s an intervention needed, such as maintenance or repair, and that information is released to the customer,” the CIO says, adding that cameras and data collection devices will be installed throughout terminals to monitor for anything, be it theft, lost cargo, or potentially unsafe conditions.

Maersk has also been working with AI pioneer Databricks to develop algorithms to make its IoT devices and automated processes smarter. The company’s data scientists have built machine learning models in-house to improve safety and identify cargo. Data scientists will someday up the ante with advanced models to make all processes autonomous.

And this, Laybourne maintains, is the holy grail: changing the character of the company and the industry.

“We’ve been a company with a culture of configurators. So now we’ve become a culture of builders,” the digital leader says. “We’re building a lot of the software ourselves. This is where the data scientists sit and work on machine learning algorithms.”

For example, his data scientists are working on advanced ML models to handle exceptions or variations in data. They are also working on advanced planning and forecasting algorithms that will have an unprecedented impact on efficiencies. “Traditionally, this industry thinks about the next day,” the CIO says. “What we’re looking at actually is the next week, or the next three weeks.”

The core mission won’t change. But everything else will, he notes.

“We’re still going to have the job of lifting a box from a vessel into something else. Are we going to have autonomous floating containers and underseas hyperloops? I don’t think so,” Laybourne says, claiming the container industry is well behind others in its digital transformation but that is changing at lightning-fast speed. “Loading and unloading will still be part of the operation. But the technologies we put around it and in it will change everything.”

Cloud Computing, Edge Computing, Internet of Things, Supply Chain

Agility may be the defining feature of today’s contact centers.

In the past, speed was the name of the game. How could a contact center be as efficient as possible, maximizing the call volume each agent could handle and minimizing average handle time? While these factors still play a role in contact center operations, customer experience (CX) now takes center stage. That CX is fluid and omnichannel, and that means your contact center must be as agile as possible.

It’s fitting, then, that DevOps, an operational approach that grew out of the Agile software development practices of the 1990s, is finally grabbing the attention of more contact center executives. As more contact center leaders have embraced cloud technology, a nimbler, more fleet-footed approach to CX development and delivery has become possible.

How the cloud clears the way for DevOps

Contact centers are rapidly shedding legacy, on-premises technology in favor of the cloud. The global market for Contact Center as a Service (CCaaS) offerings is expected to grow by 26.1% annually from 2022 to 2027, expanding from $17.1 billion to $54.6 billion.
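Those projections are internally consistent, as a quick compound-growth check shows:

```python
# Sanity-check the CCaaS forecast: $17.1B in 2022 growing 26.1% per year.
start_bn, cagr, years = 17.1, 0.261, 5
end_bn = start_bn * (1 + cagr) ** years
print(round(end_bn, 1))  # 54.5 — in line with the projected $54.6B, given rounded inputs
```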

While there are many reasons for this shift, it ultimately boils down to customer experience and cost management. The cloud allows contact centers to scale more efficiently, delivering more flexible and personalized CX solutions while driving down the cost of service. It enables a more seamless blend of human agents with artificial intelligence technology within an omnichannel service environment.

The cloud does more, though. It also clears the way for a DevOps approach in the contact center. Consider a few key changes that come with cloud migration:

Silos break down and communication opens up. When a contact center isn’t bound to a physical location, collaboration and communication become essential ingredients to delivering customer service.

Automation becomes easier. Cloud contact center technology is built on a modern architecture, making it more compatible with the small, frequent changes that are the hallmark of DevOps. Automation of key processes, from CI/CD to testing, becomes key to achieving the DevOps vision.

All these changes lay the groundwork for a fuller shift toward DevOps in the contact center. Because it is rooted in Agile software methodology, DevOps relies on the open, collaborative culture that’s inherent to an Agile approach. It’s built on breaking down the silos between development and operations so that these two departments can work together throughout the development cycle to deliver faster, better software releases.

In the contact center, it’s much easier to embrace this type of mindset shift when the overhead of legacy technology is no longer in the way. And when contact center leadership embraces this shift, it lays the groundwork for even more powerful results.

Continuous testing in DevOps: The perfect recipe for CX assurance

At the heart of the DevOps mindset is the commitment to continuous integration and continuous delivery (CI/CD). Rather than separating development and testing into distinct stages, DevOps intertwines them throughout the entire lifecycle of software development.

This shift from a develop-test-release approach to a more fluid interchange between the three relies on continuous testing. Instead of isolated checks at the end of development, which often lead to major delays or misses in discovering software issues, testing “shifts left” to happen much earlier in the cycle. This enables development teams to discover problems much earlier, solve them more efficiently, and reduce the costs associated with premature deployment.
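In a contact center context, shifting left might mean unit-level checks on dialog logic that run on every commit, long before a release candidate exists. The sketch below uses an invented IVR menu handler; the names and menu options are hypothetical, not any vendor's actual API:

```python
# A hypothetical IVR menu handler and the kind of early, automated
# check a shift-left pipeline would run on every commit.
def ivr_route(dtmf_digit):
    """Map a caller's keypad press to a queue; unknown input re-prompts."""
    menu = {"1": "billing", "2": "support", "0": "agent"}
    return menu.get(dtmf_digit, "reprompt")

def test_ivr_route():
    assert ivr_route("1") == "billing"
    assert ivr_route("0") == "agent"
    assert ivr_route("9") == "reprompt"  # undefined keys must re-prompt

test_ivr_route()  # in CI this would run under pytest on every push
```

Catching a broken menu mapping here costs minutes; catching it after deployment costs abandoned calls.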

When done well, the result is a massive improvement in execution. According to the 2021 Accelerate State of DevOps report by Google Cloud DevOps Research and Assessment (DORA), organizations with elite DevOps teams outperform those with low-end DevOps teams by leaps and bounds. Consider a few stark comparisons:

Elite groups are 6,570 times faster at making software changes and restoring service after outages.

They deploy code updates 973 times more often.

The changes they deploy are a third as likely to fail.

What do these kinds of performance improvements mean for the contact center? By deploying CI/CD, automated testing, and other DevOps approaches, contact centers can reduce outages, improve voice quality, enhance chatbot performance, and more. And while it’s possible, in theory, to do this without a wholesale embrace of the DevOps mentality, making a full mindset shift will drastically enhance what’s possible for contact center CX.

Changing the mindset in your contact center

Embracing a DevOps approach in your contact center requires a change in mindset. You can no longer think of development and upgrades to your system as separate from CX — they must become an integral part of it. At Cyara, we call it “DevCXOps.”

Our CX assurance platform helps the world’s most prestigious brands accelerate the shift toward DevOps because the entire system is built on CI/CD. Through powerful automation, our customers deploy functional, regression, and performance tests, along with live CX monitoring, to gain a comprehensive view of the customer experience.

This kind of automated continuous testing and monitoring allows development and operations teams to ensure that CX software — whether an existing version or a new update — performs at optimal levels. Instead of catching service issues in scattershot fashion or after they’ve already impacted CX, Cyara helps catch them proactively so they don’t become serious problems.

One of our customers, a major Canadian bank, uses Cyara to get a unified picture of its contact center operations and an end-to-end view of the customer journey, allowing it to catch issues in real time and resolve them more quickly.

That’s only one example — countless more contact centers have used Cyara to jumpstart their shift to a DevOps way of doing business. If you’re ready to deploy DevOps in your contact center, you don’t have to do it alone. Take a look at this whitepaper to learn more about how Cyara can help.

Digital Transformation

Elaborating on some points from my previous post on building innovation ecosystems, here’s a look at how digital twins, which serve as a bridge between the physical and digital domains, rely on historical and real-time data, as well as machine learning models, to provide a virtual representation of physical objects, processes, and systems.

Keith Bentley of software developer Bentley Systems describes digital twins as the biggest opportunity for IT value contribution to the physical infrastructure industry since the personal computer. They’re used in a wide variety of industries, giving enterprises insight into maintenance needs and ways to optimize manufacturing supply chains.

By 2026, the global digital twin market is expected to reach $48.2 billion, according to a report by MarketsAndMarkets.com, and the infrastructure and architectural engineering and construction (AEC) industries are integral to this growth. Everything from buildings, bridges, and parking structures to water and sewer lines, roadways, and entire cities is ripe for reaping the value of digital twins.

Here’s a look at how digital twins are disrupting the status quo in the infrastructure industry — and why IT and innovation leaders at infrastructure and AEC enterprises would be wise to capitalize on them.

Redrafting the business model

For decades in the AEC industry, work has been performed on a project-by-project basis using computer-aided design (CAD) and more recently building information modeling (BIM) software to create specific 2D and 3D deliverables. The industry is now moving toward integrated suites of tools and industry clouds, which open the door to new business models, industry ecosystems, and more collaborative ways of working.

As the use of digital twins advances, new possibilities are opening up for AEC firms to earn annuity revenues by managing and maintaining infrastructure digital twins for their clients.

These new business models are disrupting the infrastructure industry and reconfiguring opportunities as the industry adjusts to new ways of working. Digital twins will likely do for the infrastructure space what various platform models have already done for music, books, retail, and gig economy services.

Thanks to the cloud-based platform business model, possibilities will open up not only for operations and maintenance services around core digital twin models, but also for value-added digital services wrapped around these twins, such as visualization, collaboration, physical and cybersecurity, data analytics, and AI-enabled preventative maintenance.

Plus, infrastructure developers can partner with digital twin providers and the surrounding ecosystem of service providers to benefit from the sale of the physical asset as well as the provisioning of ongoing digital services via digital twin models. Over time, these subscription-based services could add a significant amount to the original sale price. For example, a real estate project of 100,000 square feet could net $1 million in add-on revenues over five years from digital twin-related services, and nearly 80% of an asset’s lifetime value is realized in operations.

Digital twin use cases and ROI

The full suite of digital twin use cases encompasses many areas, but one of the largest is in helping infrastructure become more efficient, resilient, and sustainable. With 70% of the world’s carbon emissions having some link to the way infrastructure is planned, designed, built, or operated, digital twins can help with visibility and insights for real-time decisions. Using our earlier example, if a 100,000 square foot building has $200,000 in annual maintenance costs, the digital twin may save 25% of that and add $160,000 in value in terms of environmental, security, and usability benefits like booking of meeting rooms, space utilization analytics, and process visibility.
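The arithmetic of that example works out as follows:

```python
# Annual digital twin value for the 100,000 sq ft building example.
maintenance = 200_000                       # annual maintenance costs, $
maintenance_savings = 0.25 * maintenance    # 25% reduction via the twin
added_value = 160_000                       # environmental, security, usability benefits, $
annual_benefit = maintenance_savings + added_value
print(annual_benefit)  # 210000.0 — roughly $210K per year for this building
```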

Another use case relates to worker safety. Bridge inspectors, for instance, often still suspend themselves from ropes, but with drone-based bridge inspections, such as those by Manam that capture photogrammetry used to assemble a 3D digital twin, they can now move much of the inspection process into the office. This saves time and greatly reduces injury risk. With each state in the US often having tens of thousands of bridges to inspect, the ROI for state Departments of Transportation becomes highly significant. Bridge inspectors still need to go into the field with tools, but the 3D model provides an additional technique for rapid visual inspection, detailed analysis, and even AI-based defect detection.

And from a security perspective, a digital twin for the Capital One Arena in Washington D.C., for instance, acts as a proving ground for the latest innovations in intelligent building sensor suites to help first responders rapidly prioritize search and rescue areas when emergencies occur.

A real-time system of record

By addressing the full lifecycle from construction to operations and maintenance, infrastructure digital twins provide a system of record and a single source of truth for all parties involved. The former BIM approach was the system of record during the plan, design, and build phases of a project, but it typically stopped once delivery was made to building operators.

As a living system of record, the digital twin merges the visual and geometric representation of the asset, process, or system with the engineering data, IT data, and operational data (such as IoT and SCADA) all in a real-time representation of the physical asset.
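Conceptually, such a living record can be pictured as the toy structure below. The field names are illustrative only; real twin platforms define far richer schemas:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Toy record merging the data layers the article describes."""
    asset_id: str
    geometry_uri: str                                 # link to the 3D/BIM model
    engineering: dict = field(default_factory=dict)   # design-time specs
    telemetry: dict = field(default_factory=dict)     # live IoT/SCADA readings

    def update_telemetry(self, sensor, value):
        # Keeps the operational layer current in real time.
        self.telemetry[sensor] = value

# A hypothetical bridge asset with one live strain reading:
twin = DigitalTwin("bridge-042", "s3://models/bridge-042.glb",
                   engineering={"span_m": 120})
twin.update_telemetry("strain_gauge_7", 0.0031)
```

The point of the structure is that geometry, engineering data, and live telemetry are addressable through one record rather than three disconnected systems.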

Without digital twins, architects often have no visibility into the operational side of their designs, something that could be valuable for feedback and continuous improvement in order to modify and refine designs over time.

For owners and operators, the digital twin provides an up-to-date virtual model they can view anytime from anywhere. They also have visibility into how these assets are performing, including past, present, and future indicators.

Visualization and the metaverse

For complex systems such as buildings, visualization — including renderings, videos, and AR/VR/XR — is an indispensable element to clearly unlock the benefits of digital twins by communicating plans and ideas. AR inspection in particular helps site managers immediately flag mistakes for time and cost savings. They can also scan QR codes onsite to inspect the digital twin data associated with any physical equipment in the facility, such as HVAC systems or mechanical, electrical, and plumbing (MEP) equipment. And in VR mode, they can perform remote inspections of all data layers built into the digital twin model via fly throughs.

“We’ve seen an uptake in live digital twins in recent months,” says Martin Rapos, CEO of 3D BIM developer Akular. “In addition to the master integration of building data to break IoT and other building systems silos, there’s increased need for advanced visualization, where the data needs to be geolocated and accurately tagged on 2D or 3D files. The use of VR, MR and mobile devices in working with the digital twin is on the rise as well, allowing builders and asset operators to bring the digital twin from the office to the site, which is what the industry has been trying to achieve for years.”

As also discussed in my previous post, integrating visualization tools and capabilities into digital twin solutions is key to the technology stack and overall ecosystem, so customers can better visualize and collaborate around design or operational decisions regarding their physical assets. Compared to other industries, infrastructure has been slow to digitally transform. But over the next two years, the shift to digital twins will likely move into the early mainstream and propel the industry forward, so CIOs and executives working in the industry should watch these developments closely and structure their own digital twin strategies to unlock the technology’s full potential.

Digital Transformation, Infrastructure Management