It’s more important than ever for IT and business teams to be on the same page. Yet, many organizations struggle with a communication gap between these two groups. Why is this such a common issue? And how can dashboards be the bridge to solve it? Let’s break it down.

The Communication Gap Between IT and Business

1. Misaligned Objectives – One of the biggest reasons IT and business teams struggle to communicate is that they often have very different goals. IT tends to focus on metrics like system uptime, network performance, and cybersecurity. Meanwhile, the business side is more interested in things like revenue growth, customer satisfaction, and market share. With both teams speaking different “languages,” it’s no surprise that they often end up talking past each other, leading to misaligned priorities.

2. Too Much Jargon – Let’s be honest: IT can be a jargon-heavy world. Terms like “network latency” or “data redundancy” are second nature to IT professionals but can sound like gibberish to someone on the business side. Business leaders need information that ties directly into business outcomes—not a lesson in technical terminology. Serving up jargon-ridden reporting reinforces the business’s perception of IT as a back-office function, disconnected from any leadership role in the organization.

3. Too Long; Didn’t Read (TLDR) – I recently reviewed a client dashboard where the CIO had 10 minutes to report on IT in the monthly management review meeting. The deck was ten very detailed slides. Realistically, 10 minutes wasn’t enough to cover even two or three of them, and the key business concern – are we on track to deliver the business strategy? – was never addressed succinctly. That answer had to be pieced together from multiple slides.

4. Why Should I Care – IT reports are often dense with technical data but lack business context. A report on server downtime, for instance, may not explain how that downtime is affecting customer experience or revenue. Without that essential context, it’s hard for business leaders to grasp the true impact of what’s going on in IT, which leads to less informed decision-making. I recently saw a client report that cited “94% deflection” – a business user would not understand what that meant, let alone care!

The Role of Effective Dashboards

So, what’s the fix? One key solution lies in effective dashboards that can translate complex IT data into business-relevant insights. When done right, dashboards can serve as a powerful tool to bridge the communication gap.

How Dashboards Help

Bridging the Gap: Dashboards translate technical metrics into business insights. Instead of presenting a laundry list of technical KPIs, dashboards present data in a way that aligns with business goals—making it easier for business leaders to understand how IT performance affects them.

Key Features of an Effective Dashboard

  • Clarity and Simplicity: A dashboard should be clean and easy to understand. Focus on the most critical metrics and strip away unnecessary data clutter. Less is more when it comes to business-facing dashboards.

In my previous example with the 10-slide presentation, I asked the team a straightforward question about the four strategic initiatives: are they on schedule or behind? The response was that three were on track, and one was delayed. However, I only found this out by asking directly—there was no clear indicator, like a simple green, amber, or red flag, to make this status immediately visible. This was the base-level information that the business wanted to know at a glance.

  • Contextual Information: Data is useful, but not sufficient. People need to understand why the data matters. How does what is being reported impact them?
  • Visual Appeal: “People eat with their eyes, not just their mouths” – in other words, it’s got to look good too. Charts, graphs, and colors can make complex data easier to digest. A well-designed dashboard helps non-technical users quickly understand key information.
  • Know Your Audience: A dashboard should be tailored to the specific needs of its audience. Different stakeholders—executives, department heads, or IT managers—need differing levels of detail and focus. For example, a high-level executive might want a quick overview of strategic KPIs, while a department manager may need more granular data on operational performance. Understanding who will be using the dashboard helps ensure it presents the right information in the right format.

Often, I see IT departments presenting what THEY feel is important at the expense of what their audience actually cares about.

  • Understand What Outcome You Are Trying to Get: A dashboard is not just a tool for presenting data—it’s a form of marketing from which IT hopes to achieve a beneficial outcome. Whether the goal is to gain executive buy-in, influence decisions, or highlight the value IT brings to the business, the dashboard should be designed with this in mind. It’s about showcasing IT’s impact in a way that drives action, whether that means securing more resources, aligning priorities, or improving collaboration. By understanding what outcome you’re trying to achieve, you can ensure the dashboard tells the right story and promotes the desired business result.

Any opportunity to communicate with the business is an opportunity to reinforce that IT is a strategic business partner and not simply bits and bytes.
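To make the status-at-a-glance idea concrete, here is a minimal sketch of a RAG (red/amber/green) rollup for strategic initiatives, the kind of single-glance view the 10-slide deck was missing. The initiative names and statuses below are invented for illustration:

```python
# Hypothetical RAG rollup: one flag per strategic initiative, so an
# executive can see schedule health at a glance instead of reading
# it out of ten detailed slides.
FLAGS = {"green": "On track", "amber": "At risk", "red": "Behind"}

initiatives = [
    ("Cloud migration",     "green"),
    ("ERP consolidation",   "green"),
    ("Zero-trust rollout",  "green"),
    ("Data platform build", "amber"),
]

for name, rag in initiatives:
    # One line per initiative: name, flag, and plain-language status
    print(f"{name:<22} [{rag.upper():^5}] {FLAGS[rag]}")
```

However the rollup is rendered—a table, traffic-light icons, or a single summary row—the point is that the "three on track, one delayed" answer is visible without anyone having to ask.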

Types of Dashboards

1. Strategic Dashboards: Designed for executives, these dashboards present high-level metrics directly tied to business objectives, priorities, and strategies—and don’t forget to call out successes.

2. Tactical Dashboards: These are meant for middle managers who are overseeing specific projects or departments. They offer a more detailed look at operations but still focus on IT performance or OKRs.

3. Operational Dashboards: Used by IT teams, operational dashboards monitor day-to-day technical metrics like system health and security. While these dashboards are more technical, they can be connected to business goals when integrated into the larger dashboard framework.

Conclusion

IT and business alignment is critical for an organization’s success, yet it’s an area where many organizations fail. By addressing the root causes of the communication gap—complexity, jargon, lack of context, TLDR, etc.—and implementing dashboards that speak directly to business needs, organizations can change the perception of IT from wire-pullers and propeller-heads to strategic partner.

In short, if IT is going to speak to the business effectively, the conversation needs to change—and dashboards are a powerful way to make that happen.


Related Resources:

Daniel Saroff - GVP, Consulting and Research Services - IDC

Daniel Saroff is Group Vice President of Consulting and Research at IDC, where he is a senior practitioner in the end-user consulting practice. This practice provides support to boards, business leaders, and technology executives in their efforts to architect, benchmark, and optimize their organization's information technology. IDC's end-user consulting practice utilizes our extensive international IT data library, robust research base, and tailored consulting solutions to deliver unique business value through IT acceleration, performance management, cost optimization, and contextualized benchmarking capabilities.

Cloud computing has been heralded as the panacea for modern IT challenges, promising scalability, flexibility, and cost savings. However, as the cloud landscape matures, many organizations are finding that the reality of cloud adoption does not always align with their expectations. This has led to a growing trend of repatriating workloads back to on-premises or private cloud environments. In this article, we will explore the reasons behind these missed expectations and why some businesses are choosing to bring their cloud workloads back home.

The Promise vs. Reality of Cloud Computing

Cost Overruns

One of the primary drivers for cloud adoption has been the promise of cost savings. However, many organizations are finding that their cloud spending is exceeding initial estimates. According to IDC’s Cloud Pulse 4Q 2023 survey, close to half of cloud buyers spent more on cloud than they expected in 2023, with 59% anticipating similar overruns in 2024. The complexities of cloud environments, coupled with unforeseen external influences, make it challenging to forecast costs accurately. Factors such as the increasing cost of third-party services, energy costs, and the financial implications of new technologies like GenAI are contributing to these budget blowouts.

Performance and Latency Issues

While cloud providers offer robust infrastructure, not all workloads are suited for the cloud. Performance and latency issues are common complaints, particularly for applications that require real-time processing or have stringent performance requirements. For instance, technical and AI-related workloads often experience performance bottlenecks in public cloud environments, prompting organizations to consider repatriation.

Security and Compliance Concerns

Data security and regulatory compliance are critical considerations for any organization. Despite the advanced security measures offered by cloud providers, many businesses remain concerned about data breaches and compliance with industry regulations. This is particularly true for sectors like finance and healthcare, where data privacy is paramount. As a result, production data and backup/disaster recovery processes are among the most repatriated elements of workloads.

Complexity in Management

Managing a multi-cloud or hybrid cloud environment can be incredibly complex. Organizations often struggle with integrating different cloud services, managing data across multiple platforms, and ensuring consistent security policies. This complexity can negate the perceived benefits of cloud adoption, leading some businesses to reconsider their cloud strategies.

The Repatriation Trend

What is Repatriation?

Repatriation refers to the process of moving workloads from public cloud environments back to on-premises or private cloud infrastructure. This trend is part of a broader industry movement towards hybrid multi-cloud IT strategies, where organizations seek to optimize their workload placement across various environments.

Drivers for Repatriation

Several factors drive the decision to repatriate workloads:

  1. Cost Management: As mentioned earlier, unexpected cost overruns in the cloud can make on-premises solutions more attractive. By repatriating workloads, organizations can gain better control over their IT spending.
  2. Performance Optimization: For workloads that require high performance and low latency, on-premises infrastructure can offer superior performance compared to public cloud environments.
  3. Security and Compliance: Repatriating sensitive data and critical applications can help organizations meet stringent security and compliance requirements more effectively.
  4. Operational Control: Having workloads on-premises allows for greater control over IT operations, enabling organizations to tailor their infrastructure to specific needs and optimize resource utilization.

The Scale of Repatriation

While repatriation is a growing trend, it is not a wholesale migration. According to IDC’s Server and Storage Workloads Survey, only 8-9% of companies plan full workload repatriation. Instead, most organizations repatriate specific elements of their workloads, such as production data, backup processes, and compute resources.

Larger Organizations Leading the Way

Larger organizations are more active in repatriating workloads compared to smaller businesses. This is due to their greater resources, larger workloads, and more complex IT environments. Economic factors and comprehensive workload strategies also play a role in driving repatriation activities among large enterprises.

Conclusion

The initial promise of cloud computing has not been fully realized for many organizations, leading to missed expectations and a growing trend of workload repatriation. Cost overruns, performance issues, security concerns, and management complexities are some of the key factors driving this shift. While the cloud remains a vital component of modern IT strategies, businesses are increasingly adopting a hybrid approach, optimizing their workload placement across public cloud, private cloud, and on-premises environments.

As the cloud landscape continues to evolve, organizations must carefully assess their cloud strategies, balancing the benefits of cloud adoption with the realities of their specific needs and challenges. By doing so, they can make informed decisions about where to best deploy their workloads, ensuring they achieve the desired outcomes without compromising on cost, performance, or security.


References:

Daniel Saroff - GVP, Consulting and Research Services - IDC


As part of our Smart Cities research, we have been documenting the expanding role of architecture, engineering, and construction (AEC) firms, commercial real estate (CRE) companies, and developers as key orchestrators of these initiatives.

The Ellinikon project in Greece provides a compelling example of this burgeoning ecosystem. The Ellinikon is set to become Europe’s largest urban redevelopment initiative, transforming Athens’ former international airport into a green Smart City district on the Athens Riviera.

The project is not spearheaded by the municipal government but by Lamda Development. Although the government has provided coordination for this megaproject, it has not provided any financial backing.

Covering over six million square meters, the multibillion-euro project aims to set new global standards for Smart Cities.

We sat down with Manthos Papamatthaiou, Lamda Development’s business development director for Smart City and ICT, Dimos Panagiotis, business development senior manager, and Paraskevi Panagopoulou, business development associate, to learn more about the project and the organization’s Smart City expertise.

[The responses below are some of the highlights of the interview. The full interview is available here for subscribers to Worldwide Smart Sustainable Cities, States, and Spaces: AI, Cloud, and Edge Strategies.]

What is your vision for The Ellinikon project?

Our overarching goal, as outlined in The Smart Ellinikon Vision, is to create “a state-of-the-art smart district that pioneers the future of home, work, and entertainment; utilizing technology to deliver sustainability and serve the people of tomorrow.” We are following an integrated approach where solutions merge seamlessly into daily life without causing disruption. These solutions are designed to be outcome-focused rather than technology-driven, ensuring that each bit of technology either adds value and enhances the experience of residents, tourists, and employees or serves our sustainability and environmental protection targets. As part of this initiative, we have already completed the technology master plan and are entering the build phase of digital infrastructure and smart use cases across various domains such as smart infrastructure, mobility, energy, and waste management.

At the heart of our project lies sustainability, guided by The Ellinikon Sustainable Development Strategy and Lamda Development’s ESG goals and commitments. This approach aligns with the expectations of future residents and visitors as well as EU regulations.

Which cities are you looking to for examples of best practice?

We’ve thoroughly studied all major Smart City initiatives worldwide, focusing on both the success stories and the lessons learned. The 15-minute city idea from Paris, for instance, significantly influenced the masterplan developed by Foster + Partners. The location perfectly suits the “city within a city” concept, with the mountains behind, the sea in front, and excellent connectivity to downtown Athens. One takeaway from other Smart City projects is the importance of having the right internal skills to ensure seamless operations.

How did you build the business case and determine the ROI of integrating smart technologies into the urban redevelopment project?

To establish the business case, we cooperated with international consultants like Deloitte and AFRY and conducted a feasibility study to define the sizing, costing, and benefit of each solution. We leveraged a large pool of relevant data from comparable projects to shape a solid set of assumptions. The feasibility study successfully quantified both the direct and indirect benefits of building a Smart City. The indirect value was reflected as a tangible premium in the real estate value prices, attributed to the appeal of having a residence or a business located within a smart district. This value of living in one of the leading smart districts in Europe ends up with a very interesting figure that significantly supported our decision-making processes.

When did Lamda Development establish its Smart City team?

The department was established over three years ago, at the urban planning phase and prior to any construction works. We started by defining the principles and vision for Smart Ellinikon, and then moved on to the identification of opportunities. What began as just a high-level concept has now evolved into an ambitious, approved, large-scale project. We are currently refining the design and initiating the implementation of the city’s “digital layer.” Given the breadth of available solutions and the potential for value creation, it’s clear that investing in dedicated Smart City teams early on is a wise move for all large-scale developers.

 

The Ellinikon’s Smart City team is ambitious. The project represents both a major advance in urban planning and a testing ground for the latest innovations in Smart City technologies. By incorporating digital technologies across every layer of the development — from energy management and transportation to waste management — the initiative aims to set new global standards for Smart Cities.

As demonstrated by Lamda Development, the role of developers, AEC firms, and CRE companies is shifting within the urban innovation ecosystem. IDC has found that 40% of AEC and 50% of CRE companies have established dedicated technology and innovation departments — numbers that are set to increase significantly over the next two years. These organizations are also increasingly partnering with technology companies in arrangements such as preferred partners in support of urban innovation initiatives.

As exemplified by Papamatthaiou and his team, building in-house Smart City expertise is becoming more common among developers. These organizations should be seen as key players in the Smart City ecosystem.

We expect these companies to have a noticeably large presence at this November’s Barcelona Smart City Expo, which we will attend.

 

Read the full interview here

Further reading: IDC Government Insights: Worldwide Smart Sustainable Cities, States and Spaces: AI, Cloud and Edge Strategies

Louisa Barker - Senior Research Manager, IDC Government Insights, Europe - IDC

Louisa Barker is a senior research manager in the European IDC Government Insights team, leading research on smart, sustainable, and resilient cities and communities. She has international experience providing analysis, policy advice, and consultancy to the public sector on disaster risk management, urban building and planning regulation, and smart cities. Previous roles have included Urban Resilience Consultant at the World Bank, focused on projects in the Caribbean and East Africa, and as a researcher at technology and innovation accelerators such as the Future Cities Catapult and the University College London City Leadership Laboratory. She is also a Specialist Advisor to the International Building Quality Centre.

AI is the latest focus of the corporate world’s pursuit of innovation. Executives are understandably eager to harness AI’s potential for efficiency, cost-cutting, and a competitive edge. But here’s a radical notion: Perhaps we shouldn’t approach AI projects with the haste of a start-up chasing its first unicorn valuation.

The “move fast and break things” ethos, once Silicon Valley’s battle cry, is about as appropriate for AI implementation as using a sledgehammer for neurosurgery. You might make an impact, but the collateral damage could be catastrophic.

Let’s be clear: AI isn’t just another IT project you can cobble together with clever coding and optimistic projections. It’s a sophisticated, data-dependent set of technologies that demands respect, thorough preparation, and patience. However, while meticulous preparation is essential, it should not paralyze organizations from embarking on their AI journey. Finding a balance is key.

The Data Foundation: Quality Over Quantity

Imagine your company has invested heavily in AI technology, assembled a crack team of data scientists, and your board is salivating for results. There’s just one snag — your data is a mess. It’s like building a Formula One car and fueling it with crude oil.

AI’s effectiveness is directly proportional to the quality of the data used in its implementation. If your company’s information is fragmented across incompatible systems, riddled with errors, and as organized as a toddler’s playroom, your AI project is doomed from the start.

Building a robust data foundation isn’t glamorous. It doesn’t generate exciting headlines or impressive slides. But it’s the bedrock of successful AI initiatives. This means time and resources must be dedicated to data cleaning, integration, and governance. It means creating a unified, reliable data source for your AI. This preparatory work may delay your AI launch, but it ultimately delivers value across your entire organization.

Still, organizations shouldn’t wait indefinitely before launching AI initiatives. Many successful companies have begun with targeted use cases while simultaneously improving their data quality. This dual approach allows them to learn and adapt as they go.

Knowledge: The Critical Superpower

Ask yourself: Does your organization truly understand AI? I’m not talking about buzzword-laden superficiality; I mean a deep, nuanced comprehension of AI’s capabilities, limitations, and pitfalls. Without this understanding, you’re navigating treacherous waters blindfolded.

Building AI literacy isn’t just about sending your tech team to conferences; it involves fostering company-wide understanding. Educate everyone from the C-suite to frontline staff on AI’s real-world applications and limitations. Tackle ethical implications head-on and establish robust governance.

It also involves ensuring compliance with regulations such as Article 4 of the EU’s AI Act. This article states that providers and deployers of AI systems shall take measures to ensure a sufficient level of AI literacy of their staff. This highlights the importance of tailoring education to the technical knowledge and experience of staff involved in operating these systems.

This educational journey takes time and resources, but it shouldn’t deter organizations from initiating AI projects. A phased approach enables companies to build knowledge while actively engaging in practical applications of AI.

Preparing Your Workforce: Beyond Technical Skills

Here’s where many companies falter: They focus solely on technical AI skills, neglecting the broader organizational and cultural shifts necessary for successful AI adoption.

Effective AI integration requires more than just data scientists and machine learning engineers. It demands a workforce that can collaborate with AI systems, interpret their outputs, and make informed decisions based on AI-generated insights.

This means cultivating a range of “AI-adjacent” skills:

  1. Critical Thinking: Employees must be able to question AI outputs and understand their limitations.
  2. Data Literacy: A basic understanding of data analysis and statistics is crucial across roles.
  3. Ethical Reasoning: Staff need to recognize and address potential biases or ethical issues in AI systems.
  4. Adaptability: As AI reshapes job roles, employees must be willing to evolve and learn continuously.

Truly strategic AI implementation may require organizational restructuring. Traditional hierarchies may need to flatten, allowing for more rapid decision-making based on AI insights. Cross-functional teams become essential, breaking down silos between IT, data science, and business units.

Cultural shifts are equally critical. Foster a culture of experimentation and learning from failure — this is essential when working with evolving technologies. Encourage transparency about AI’s capabilities and limitations to build trust. Address fears of job displacement directly, emphasizing AI as a tool to augment human capabilities, not replace them.

Importantly, these changes can’t be afterthoughts; they should be integral to your AI strategy from day one. Involve HR, change management specialists, and department heads in planning.

In a World of Tortoises and Hares, Be a…

Imagine two companies: The hare races to implement AI everywhere without proper preparation. The tortoise methodically builds its data foundation, educates its workforce, and carefully plans its strategy.

Initially, the hare makes headlines with rapid implementations. However, over time it grapples with inconsistent results due to poor foundational work. Meanwhile, the tortoise rolls out its first meticulously planned project after thorough preparation.

Fast forward a few years. The hare has scaled back its ambitions due to high-profile failures. But the tortoise enjoys consistent improvements in efficiency driven by well-implemented solutions.

What if neither the tortoise nor the hare resonates with your organization?

Enter the bat — a creature that thrives in darkness and is adept at navigating complex environments using echolocation.

Just as bats use their acute senses to adapt quickly and effectively to their surroundings, organizations should embrace a flexible approach to AI implementation. This means being agile enough to pivot based on real-time feedback while ensuring a solid foundation is in place. Bats can fly swiftly when needed — but they also take time to explore and understand their environment.

The moral? In AI, being Batman, aka Bruce Wayne, is often the winning strategy.

The Virtue of Thoughtful Progress

In a business world obsessed with speed, advocating for patience might seem naïve. But with AI, it’s essential for long-term success.

Effective AI implementation often isn’t about being first. It’s about building the strongest foundation while understanding technology deeply and integrating it effectively into business processes and culture. It’s about creating sustainable solutions that deliver real value — not just flashy demos.

To companies feeling pressured to jump into AI: Resist the urge to rush blindly forward or become paralyzed by over-preparation. Focus on getting your data right while simultaneously exploring use cases that allow you to learn iteratively. Plan carefully; execute methodically; prepare for a marathon, not a sprint.

The winners in this race won’t be those who move fastest but those who skillfully navigate between thoughtful preparation and timely execution.

Ewa Zborowska - Research Director, AI, Europe - IDC

Ewa Zborowska is an experienced technology professional with 25 years of expertise in the European IT industry. Since 2003, she has been a member of the IDC team, based in Warsaw, researching IT services markets. In 2018, she joined the European team with a specific emphasis on cloud and AI. Ewa is currently the lead analyst for IDC’s European Artificial Intelligence Innovations and Strategies CIS.

As many casual tech observers might have recently seen, Samsung released a new device type at their latest Unpacked event (July 10th): the “Samsung Galaxy Ring”. This signals a shift in the Smart Ring space.

The Smart Ring space is currently dominated by specialists like Oura and Ultrahuman, but the larger brands are now beginning to compete by incorporating rings into their existing ecosystems of devices. The launch also potentially marks the shift of Smart Rings from a small segment of the wearables landscape to a mainstay of the space.

In the last full year of data, 2023, IDC recorded global Smart Ring sales of 880,000 units, with Oura representing 80% and Ultrahuman second with 12%. We forecast this to rise to 1.7 million units in 2024 and 3.2 million in 2028, equating to an average year-over-year growth rate of 29.5%. For comparison, total global Smartwatch sales in 2023 were roughly 161 million devices, forecast to rise to 175 million by 2028, an average year-over-year growth rate of 1.7%.
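The growth rates quoted above are compound annual figures. As a quick sanity check (a simple back-of-the-envelope calculation, not IDC's forecasting methodology), they can be reproduced from the 2023 and 2028 unit figures:

```python
def cagr(start_units: float, end_units: float, years: int) -> float:
    """Compound annual growth rate between two shipment figures."""
    return (end_units / start_units) ** (1 / years) - 1

# Smart Rings: 0.88M units (2023) -> 3.2M forecast (2028)
rings = cagr(0.88, 3.2, 5)
# Smartwatches: 161M units (2023) -> 175M forecast (2028)
watches = cagr(161, 175, 5)

print(f"Smart Ring CAGR, 2023-2028:  {rings:.1%}")   # ~29.5%
print(f"Smartwatch CAGR, 2023-2028: {watches:.1%}")  # ~1.7%
```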

This reflects the greater maturity of the Smartwatch market and the lengthening replacement cycles of Smartwatches as the upgrades become more iterative. So, for now, Smart Rings will be a small but fast-growing part of the wearable device market. Get access to the latest performance data for the wearables market—including Smart Rings and Smartwatches—with IDC’s Wearables Devices Tracker. You can learn more about the product with this resource or see the data in action and explore more sample insights here.

The early reception of the new Galaxy Ring appears to be one of moderate interest, with consumers seemingly liking the new form factor. The ring does have the distinct advantage of being sleeker and less obtrusive than a Smartwatch, which is especially significant for sleep tracking: many Smartwatch wearers dislike wearing a watch to bed but would think nothing of keeping a ring on. There is also a subset of consumers, especially those with smaller wrists, who dislike wearing bulky Smartwatches like those offered by Apple and Samsung. Some brands, such as Garmin, do offer slimmer, female-specific designs with the Lily range of watches, but the ring format might be another option. And a significant section of the population still prefers premium analog watches, your Rolexes and Omegas.

From conversations in the industry, it is clear that many of the large players in the wearable devices space are watching Samsung Ring sales with interest and are exploring the possibility of producing their own Smart Rings. Should the Galaxy Ring prove a success, we will likely see many other players jump into the market, as we saw after the release of the first Apple Smartwatches.

The Substitution Problem

Unfortunately for the wearables market as a whole, Smart Rings look set to compete directly with Smartwatches, as many of the features they offer are directly comparable. Take the Galaxy Ring, for example: it offers sleep tracking, heart rate monitoring, activity tracking, and wellness monitoring, all of which are offered by the Galaxy line of watches. Samsung has positioned the two as complementary, saying, “Wearing the Galaxy Ring with a Galaxy watch … will maximize its shared health features while also extending the Ring’s battery life.” From a consumer’s point of view, though, the Galaxy Ring launched at $400 and a mid-spec Galaxy watch can set you back the same amount, which leads us to question just how many consumers have $800 burning a hole in their pockets and a desire for two devices that do essentially the same thing. The Galaxy Ring is priced comparably to its biggest competitor, Oura’s Ring 4, which has a base price of $349 but requires a $5.99 monthly subscription; Samsung, as of now, hasn’t announced any subscription requirement.

So, it appears inevitable that in the medium to long run Smart Rings will eat into the Smartwatch share of the wearables market. The extent to which they do so is yet to be determined, and there will undoubtedly be people out there who wouldn’t have bought a Smartwatch but will buy a Smart Ring.

Conclusions

Smart Rings are a device type with the potential to flourish in the next few years; the extent to which they do will be determined by how many big players launch their own rings and whether the largely positive reception continues. But as Smart Rings flourish, we will likely see wearable makers, to some extent, taking market share from themselves: as their Smart Ring sales rise, their Smartwatch sales will likely fall. That all being said, bring on the Smart Ring revolution.

Frederick Stanbrell - Data & Analytics Analyst - IDC

Frederick Stanbrell joined IDC in 2022 as an associate research analyst based in London, leading the European Wearables tracker. As head of the tracker, he collates guidance, tracks market trends, and provides insights and forecasts for the region, its companies, and individual countries. Before joining IDC, he earned a first-class undergraduate degree in Economics from the University of Greenwich, where he was also a prominent member of the university’s cricket team.

Enterprise applications are the foundation of modern business operations. In 2023, the market expanded by 12%, reflecting its continued importance. 

We can attribute this market growth to the following drivers: 

  • AI and Generative AI: The integration of AI and generative AI (GenAI) is transforming enterprise applications. From predictive analytics in CRM systems to personalized recommendations in ecommerce platforms, these technologies are making applications more intelligent and insightful. 
  • Cloud Dominance: Cloud technology is the present and the future of enterprise applications. Its ability to support technologies such as AI, machine learning, and the Internet of Things ensures that businesses can continue to evolve and adapt to internal and external needs and requirements. 
  • Ongoing Investments in Digital Transformation: Continuous digital transformation efforts are driving the adoption of enterprise applications, with organizations modernizing outdated systems and implementing innovative solutions across all business functions.

The Imperative of Enterprise Application Modernization 

With more legacy systems reaching the end of their lifecycle — if only from a support perspective — and older platforms faltering under demand for increased agility, flexibility, and resilience, the modernization of enterprise applications has swiftly ascended organizations’ priority lists. In EMEA, application modernization has become a central focus, with an impressive 96% of surveyed organizations planning to undertake this essential transformation.

Challenges in Modernization

Each organization possesses a distinct approach to application modernization; no singular path forward exists. The complexities involved in modernizing applications across an enterprise demand a variety of tailored strategies — and this will likely remain the case.

Specific routes to modernization differ significantly by market, sector, and organization, with varied strategies emerging to address the diverse needs of different departments and their respective applications.

The Role of Cloud

While these strategies may differ, cloud technology stands out as the unifying force, having rapidly established itself as the preferred framework for both new and existing enterprise applications. Organizations are exploring multiple routes to modernization, with nearly half of those surveyed in EMEA expressing a desire to lift and shift their existing applications to the cloud.

Moreover, 43% aim to migrate to new cloud-based versions of their current applications, while 42% are eager to embrace entirely new cloud solutions.

Cloud: The Catalyst for Enterprise Transformation

Cloud technology is increasingly the foundation of organizations’ efforts to modernize their business applications. It empowers enterprises to swiftly adapt to shifting business needs, deploy updates seamlessly, and leverage cutting-edge technologies such as AI/GenAI, advanced analytics, and next-gen security.

Growth of Cloud-Based Applications

Cloud computing has massively fueled the growth of enterprise applications throughout EMEA. According to IDC’s May 2024 release of its software and public cloud services forecast, the enterprise apps market will continue to expand hugely — from $27.2 billion in 2019 to an estimated $63.7 billion in 2028 — reflecting a significant shift from on-premises to cloud-based applications.
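The growth implied by those forecast figures can be expressed as a compound annual growth rate, computed here directly from the $27.2 billion (2019) and $63.7 billion (2028) endpoints cited above:

```python
# Implied compound annual growth rate (CAGR) of the EMEA enterprise
# applications market, from the IDC forecast figures cited above.

start, end = 27.2, 63.7       # $ billions: 2019 actual, 2028 forecast
years = 2028 - 2019           # nine-year span

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")          # 9.9% per year
```

A steady ~10% annual expansion over nearly a decade underscores how durable the shift from on-premises to cloud-based applications is expected to be.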

The proportion of enterprise applications in public cloud surged from 36% in 2019 to an astonishing 68% in 2023, reflecting a marked acceleration in the growth of public cloud usage throughout the region. By migrating to cloud, companies can enhance performance, bolster security, and ensure superior disaster recovery. Cloud’s role as a crucial component of modern IT strategies has solidified.

Factors Driving Cloud Adoption

Additionally, the rise of remote work — especially during the pandemic — alongside regulatory compliance needs and the integration of emerging technologies, has further catalyzed cloud adoption. With the ongoing establishment of local cloud datacenters and numerous partnerships formed by enterprise application providers, we predict that cloud-based enterprise applications will continue to expand, reaching nearly 77% of all enterprise applications in 2028. 

Regional Insights: EMEA Market Dynamics

From a subregional perspective, Western Europe commands a dominant position, holding an 88% share of the EMEA market. This dominance can be attributed to the strong presence of companies like Visma and DATEV, which primarily focus on financial applications, payroll management, and HCM within Western European markets. Global giants such as SAP, Oracle, and Sage also maintain significant footholds in the subregion.

In contrast, the Central & Eastern Europe (CEE) and Middle East & Africa (MEA) subregions each account for 6% of the EMEA enterprise applications market.

However, CEE has recorded a decline in spending due to several challenging macroeconomic factors.

The protracted Russia-Ukraine War, which began in February 2022, has introduced notable economic instability in the subregion. High inflation rates have diminished the spending power of businesses, while stringent monetary policies have adversely impacted software investments, resulting in a significant decline in 2022 and sluggish recovery in 2023.

On the flip side, MEA has experienced substantial growth over the past four or five years, largely driven by the emergence of cloud and cloud-based services in 2018 and 2019. This growth has been bolstered by considerable investments from cloud providers.

Organizations and large family-owned enterprises in MEA, previously reliant on monolithic legacy applications, have begun adopting SaaS solutions for non-critical workloads such as HCM, procurement, and asset management. They have thus far been successful in their overall application modernization efforts, with a growing number of businesses re-architecting, re-platforming, and re-engineering their in-house legacy systems.

Navigating the Complexity of Cloud Migration

As organizations consider their enterprise application investments and modernization opportunities and the advantages of cloud migration, they are exploring the most effective implementation paths for their new cloud-based applications. In the past, heavily customized implementations were the preferred route, as organizations sought to tailor their applications to better suit their operations.

However, a noticeable shift is occurring as organizations increasingly embrace a more standardized implementation approach.

Approximately 31% of respondents plan to rely solely on standard functions and configurations for their new cloud-based applications, while 42% are contemplating only essential modifications — typically, those focused on sector-specific functionality — to maintain otherwise predominantly standardized implementations.

The Benefits of Standardization

The preference for standardized approaches is motivated by clear and tangible benefits. For instance, as new security patches are released, organizations with more standardized implementations can rapidly update their systems without necessitating further testing. This capability ensures the most robust protections are in place as quickly as possible, without requiring additional investments in IT, security, or implementation capabilities. 

Accessing New Features Quickly

Access to new features is another crucial factor. As the pace of change accelerates, organizations are eager to leverage new functionalities — such as GenAI and sustainability tools — immediately upon their release.

Typically, these enhancements are first introduced in cloud-based SaaS versions, with vendors maintaining slower update cadences for on-premises editions. Consequently, a more standardized implementation offers the quickest and most straightforward access to these emerging features and functionalities.

This inclination for less customized implementations ensures that organizations stay current with vendor innovations — ultimately, helping them maximize the benefits of their enterprise application modernization investments. 

Conclusion

Enterprise application modernization is essential for businesses competing in the digital age. Cloud technology is key to this process, enabling agility and innovation. While modernization approaches vary, a trend is clear toward standardized solutions that maximize the benefits of cloud and AI, including GenAI.

To be successful, increase efficiency and competitiveness, and future-proof operations, organizations must take a comprehensive approach to modernization, considering not only specific business needs and appropriate technologies, but also data governance and sustainability.

This blog serves as a summary of the valuable insights shared during our recent webinar on enterprise application modernization. If you found this discussion on the transformative power of cloud technology and AI interesting, we invite you to access the full webinar on demand.

Ashok Patel - Research Manager, European Enterprise Applications - IDC

Ashok Patel is a research manager in IDC’s European enterprise applications team. Prior to joining IDC, he led the Market Trends programme at Source Global Research, providing insights into the latest trends and developments across the professional services market, and has previous experience exploring clients’ perceptions of consulting firms. Prior to working in professional services, Ashok was an editor and consultant in the commodities market, as well as working in the automotive industry.

In three years, I anticipate that around 40% of global engineering-oriented manufacturing companies will leverage digital twins within the industrial metaverse to enhance collaboration and accelerate time to value.

What leads me to this prediction? Let’s start with IDC’s definition of the industrial metaverse as a highly immersive environment that seamlessly integrates the physical and digital worlds, fostering shared presence, interaction, and continuity across engineering, operations, supply chains, and business functions.

In the engineering domain, the industrial metaverse functions as a cloud-native, multi-domain platform for 3D visualization and collaboration, bringing products to life through integrated, physically accurate simulations. It acts as a “digital twin of digital twins,” utilizing real-time data from multiple domains, such as mechanical, electrical, and software interactions.

Building this environment requires collaboration among key players, including hyperscalers and providers of simulation platforms, 3D visualization, and digital business tools. New partnerships are constantly emerging, involving major companies in digital infrastructure, cloud computing, engineering platforms, visualization technologies, and artificial intelligence — all working together to push the industrial metaverse beyond the traditional digital twin model.

What Do the Numbers Tell Us?

Product innovation remains a key business priority for engineering-focused manufacturing organizations, as highlighted in IDC’s 2024 Global Manufacturing Industry Core Survey (Figure 1).

Figure 1: Question: What are your company’s top business priorities over the next 2 years?

According to IDC’s 2023 Global Product and Service Innovation Survey, 25% of manufacturing respondents considered industrial metaverse technology to be “very important” for product and service innovation. This number was even higher among engineering respondents, with 34% rating it as very important.

Furthermore, 38% of respondents from companies with over 1,000 employees in IDC’s 2024 Global Manufacturing Industry Core Survey stated that industrial metaverse technology plays a “moderate to very high” role in supporting their company’s achievement of key operational KPIs.

As a result, the adoption of the industrial metaverse in engineering-focused manufacturing organizations is anticipated to grow steadily over the next three years.

In Conclusion

My advice for early adopters is to keep a close eye on hyperscalers, leading technology vendors, and the startup ecosystem — to stay up-to-date with the rapidly evolving landscape of industrial metaverse development. Additionally, remember that integrating real-world data with IT data to create advanced simulation and collaboration tools requires time and careful planning.

Building and nurturing digital communities and ecosystems is essential, as they will be key to future success in the industrial metaverse. Lastly, recognize that the value of the industrial metaverse extends beyond product design and engineering, reaching areas like operations, maintenance, quality, procurement, and the supply chain, among others.

As the January 2025 deadline for the EU Digital Operational Resilience Act (DORA) approaches, financial institutions and ICT providers across the European Economic Area (EEA) must urgently assess their readiness, address regulatory gaps, and implement the necessary tools and processes to ensure compliance and safeguard digital resilience.

On January 17, 2025, the EU Digital Operational Resilience Act will take effect across all European Economic Area countries. It will impact financial institutions and their ICT service providers even beyond these borders in certain circumstances.

With only three months remaining, more than 20,000 financial entities must comply with DORA’s regulatory requirements. However, in IDC’s European Security Technologies and Strategies Survey 2024 (May 2024), 49% of respondents stated, “We are aware of DORA but have not yet undertaken exploratory work,” and 14% admitted, “We are not aware of DORA.”

Since then, progress has hopefully been made, driven by active market debates and numerous educational initiatives aimed at increasing awareness. Still, with just a few weeks before the deadline, financial entities and ICT providers must assess their current standing and identify the efforts required to bridge the gaps. Now is the time to prioritize, plan, and comply:

  • PRIORITIZE the gaps that need addressing
  • PLAN for tools and process improvements (extending beyond January 2025)
  • COMPLY with the deadline

Let’s take a step back to recap the scope and objectives of this EU regulation.

Scope: Harmonization and Augmentation

DORA introduces two key innovations:

  • Harmonization: DORA harmonizes regulatory requirements across different financial industries, covering banking, insurance, capital markets players, and adjacent players such as credit rating agencies. This harmonization eliminates fragmentation across jurisdictions by implementing a regulation (not a directive), ensuring common requirements across all member countries.
  • Augmentation: DORA marks a paradigm shift, bringing ICT third-party providers under the direct scrutiny of the European Financial Supervisory Authorities. As previously discussed in our blog, EU regulators have acknowledged the growing dependency of financial organizations on ICT and cloud service providers. Given that digitalization and operational resilience are two sides of the same coin, implementing a robust digital operational resilience framework significantly enhances security for banking operations. By placing critical third-party ICT service providers under direct supervision, regulators have reshaped the dynamics between financial entities and their ICT partners.

For financial entities, DORA provides the framework for tighter collaboration with ICT partners to ensure end-to-end operational robustness. For ICT partners, DORA is not just a new regulatory burden, but an opportunity to deepen relationships with clients and explore new business avenues, as financial entities are required to conduct market research and define alternate solutions for each critical function.

Objectives: Mitigating Systemic Risk

The primary objective of DORA is to address the systemic risk posed by critical ICT service providers in the financial industry. By involving European supervisory authorities (e.g., EBA, ESMA, EIOPA) directly, regulators aim to mitigate this risk and enhance the overall digital resilience of the financial sector.

DORA’s requirements fall under five pillars:

  • Risk management
  • ICT third-party risk management
  • Digital operational resilience testing
  • Mandatory incident reporting
  • Voluntary information and intelligence sharing

Additionally, financial entities must define clear exit strategies to mitigate systemic risk in the event of operational issues with an existing ICT partner. Each entity must identify and choose alternative solutions and service providers to ensure the smooth transfer of critical services, if necessary.

For ICT vendors, DORA is a double-edged sword: While it opens up new opportunities and makes the market more fluid, it also imposes additional compliance obligations.

It is important to note that many DORA requirements are not new to large institutions, particularly significant banks subject to the ECB’s Single Supervisory Mechanism. The principle of proportionality still applies under DORA. Nonetheless, its impact is extensive, as evidenced by the IDC survey, wherein 38% of respondents cited digital operational resilience testing as their biggest challenge, while 33% identified ICT third-party risk management as a major hurdle.

Final Steps: Self-Assessment and Planning

With the deadline approaching, each institution must conduct a self-assessment to identify gaps. Where significant gaps remain, organizations must prioritize efforts to meet compliance requirements. Meanwhile, financial entities should plan for the adoption of new tools and processes, such as integrated procurement solutions, to enhance third-party governance and address DORA holistically as part of their ongoing journey toward digital operational resilience.

 

Are you ready for DORA? Discover the 10 critical steps financial entities must take before the regulation comes into effect in January 2025: IDC PlanScape: Last-Call DORA Compliance Checklist to Achieve Digital Operational Resilience

Maria Adele Di Comite - Research Director, IDC Financial Insights Corporate and Retail Banking - IDC

Maria Adele is a research director on IDC Financial Insights’ European research team and is responsible for the IDC Financial Insights Corporate Banking Digital Transformation Strategies program. She has strong competencies in financial services strategy, cybersecurity, and regulatory evolution. She has lived and worked in three countries (Germany, Belgium, and Italy) and speaks five languages. She is an expert in B2B business strategy, with significant experience in financial services, system integration, and consulting.

The time has come for the contact center and customer handling environment to jettison the term ‘deflection.’ There is an inherent prejudice in the word when used in the context of customer service: it reflects a purely operational view.

The word itself comes with some psychological baggage. I hate to resort to the dictionary, but I must in this case, as it is illuminating. Deflect carries the meaning to ‘redirect from oneself blame or guilt.’ Ouch. Were the adopters of this word subconsciously interpreting a customer’s issue with a product as pointing blame at the organization for a deficient product?  Hmm.

Regardless of the underlying evolution, it’s time the industry stopped using this language in customer handling. The idea of ‘deflection’ betrays the thinking that it is very much a cost-saving behavior: move this customer’s inquiry from the costly telephony agent channel to a less costly channel such as self-service or a digital channel. This thinking has no place in a customer handling environment.

I’ve heard many arguments in support of routing customers to alternative channels away from voice, among them ‘customers like to help themselves,’ ‘we are getting them to a channel where there isn’t a wait,’ and ‘customers don’t like talking to an agent.’ It must be acknowledged that these assertions are partially true. The increase in the volume of inquiries, the difficulty of hiring and training agents, and high agent attrition mean that contact centers are always woefully understaffed and in a perpetual state of training. Providing alternative channels is a sound strategy.

However, the idea that customers don’t want to talk to agents is not exactly true. IDC’s recent CX Path Survey, which surveyed individuals familiar with their contact center, gave some interesting insight into why customers choose agent-assisted channels such as Web chat, SMS, and telephone. In the multi-response question, 50.5% of organizations indicated that ‘comfort with talking to a human’ was the number one reason for selecting an agent-assisted channel, closely followed by ‘convenience’ at 47.3% and ‘complexity of interaction’ at 43.1%. While AI and automation can address aspects of ‘convenience’ and ‘complexity of interaction’ through channel availability and generative AI for specific content, replacing ‘comfort with talking to a human’ is more difficult.

Another thought on the motivations behind ‘deflection,’ specifically cost savings: ‘deflection’ to less costly channels may not be the panacea to the cost problem. Again, according to the IDC CX Path Survey, 44% of respondents indicated that, on average, their customers moved through three channels before achieving a final resolution to an issue. Of the respondent base, 22.1% indicated two channels, 20% four channels, and 8.4% five or more channels. Astoundingly, only 3.2% of respondents estimated that a typical customer used only one channel before reaching a resolution.
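Those survey shares are broadly consistent with the “three channels on average” figure, which a quick weighted average confirms. Note the shares quoted don’t sum to 100% (the remainder is unaccounted for in the text), so this sketch normalizes by the reported total and treats “five or more” as exactly five:

```python
# Consistency check on the channel-count figures above. The quoted shares
# don't sum to 100%, so we normalize by their total, and we treat
# "five or more channels" as exactly five.

shares = {1: 3.2, 2: 22.1, 3: 44.0, 4: 20.0, 5: 8.4}   # % of respondents

total = sum(shares.values())                            # 97.7
avg_channels = sum(n * pct for n, pct in shares.items()) / total
print(round(avg_channels, 2))                           # 3.08
```

An average of roughly three channels per resolved issue is exactly the hidden cost the ‘deflection’ mindset tends to ignore.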

Consider the customer’s perspective at the point they reach the resolution, either by themselves or with an agent after escalation. The customer has used a number of resources and still ultimately may have ended with an agent-assisted channel and potentially voice. There is a cost in this scenario in terms of customer tolerance/happiness etc., as well as the cost of putting these channels in place. 

What is my solution then? Don’t have multiple channels? Don’t move customers to other channels if agents will be the ultimate answer? No. Stop thinking about ‘deflection’ and focus on ‘the path of resolution.’ I know, I know, everyone is trying to resolve problems. My thesis is to consider all channels part of the solution and leverage each as part of the journey. If a customer escalates, that isn’t a bad thing in and of itself. It is a bad thing only if the flow through the channels wasn’t leveraged. Each channel should collect context-rich information, and that context should move with the customer across channels. Assume your customers are going to move through multiple channels. Each channel should perform its own triage so that when the customer reaches the agent, the agent is fully prepared to quickly resolve the issue.
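The context-carrying idea above can be sketched in a few lines. This is a purely illustrative model, with hypothetical names rather than any vendor’s API: each channel appends its triage findings to a single record that travels with the customer on every escalation.

```python
# Illustrative sketch (all names hypothetical): a context record that each
# channel enriches, so the next channel never starts from zero.

from dataclasses import dataclass, field

@dataclass
class InteractionContext:
    customer_id: str
    issue: str
    trail: list = field(default_factory=list)   # one entry per channel visited

    def add_step(self, channel: str, findings: str, resolved: bool) -> None:
        """Record what this channel learned before handing the customer on."""
        self.trail.append({"channel": channel, "findings": findings,
                           "resolved": resolved})

    def handoff_summary(self) -> str:
        """What the next channel (or the agent) sees on arrival."""
        steps = "; ".join(f"{s['channel']}: {s['findings']}" for s in self.trail)
        return f"{self.issue} | prior steps -> {steps}"

ctx = InteractionContext("cust-42", "router won't sync")
ctx.add_step("self-service", "reboot guide attempted, no fix", resolved=False)
ctx.add_step("chat bot", "firmware two versions behind", resolved=False)
print(ctx.handoff_summary())   # the agent starts with the full triage history
```

The point of the design is that escalation becomes a handoff of accumulated context, not a restart of the conversation.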

But what should the new word be that better represents a customer-centric viewpoint of the contact center?    

It needs to reflect:

  • You deserve our time. You paid money to us and bought our product; you have a right to service. This is a business transaction.
  • We want to help you. Really, you showed us your confidence in buying our product. We want to honor that.
  • We want to help you as quickly as possible, in the channel you prefer, at the time you prefer. At this moment, it is about you and not about us.
  • If you start with self-service, that is ok, but if it doesn’t resolve or satisfy your issue we will capture all that you have communicated and carry it along to the next channel with context to resolve the issue as quickly as we can.
  • If we do a good job in honoring our commitment to service a product that you deserve timely service on, then we hope to have earned your continued business. Only then does it become a ‘relationship.’ (Don’t get me going on loyalty.  That is one-sided. We’ll save that for another day.)

Back to the question of what to call it? What is a better name? We used to have What-You-See-Is-What-You-Get in publishing applications – WYSIWYG – it was pronounceable. 

  • Right-Channel-Right-Time – RCRT?
  • Efficient Handling?
  • Simply ‘Resolution?’

We’ll work on the name. Please send suggestions. The point is, deflection is out, and ‘positive customer handling with the customer in mind’ is in.

On October 3, I had the privilege of participating in a thought-provoking panel at DTX London 2024. The discussion revolved around one of the most pressing questions in the telecommunications industry: 5G versus Wi-Fi: Which technology will drive the future of connectivity?

As a 5G/mobility analyst at IDC in Europe, I was invited to join Paul Ridge, Director Consultant at 4C Strategies, and Dan Jones, Technologist at Hamina Wireless, to explore the opportunities, challenges, and future landscape of these two critical technologies.

The European Telecom Market: Facing Stagnation and Seeking New Growth

Before diving into the core debate, it’s essential to acknowledge the broader telecom market dynamics. The European telecom sector faces an array of challenges, including stagnating revenues, intensified competition from both traditional telcos and OTT players, and strict regulatory pressures.

Working closely with our telco colleagues around the world, IDC covers these issues across a range of research programs. As shown in IDC’s European 5G program (European 5G and Internet of Things Monetization and Adoption Strategies), price wars have squeezed margins, leaving telcos struggling to raise prices while shouldering the costs of 5G and fiber rollouts.

Telecom operators are pivoting toward service diversification, investing heavily in digital services, and shifting their strategies to seek new revenue streams beyond connectivity. This is where the conversation around 5G and Wi-Fi becomes especially significant.

5G: A Game-Changer for Telecoms

5G, particularly standalone (SA) networks, offers a lifeline for operators seeking to overcome revenue stagnation and expand into new business models. During the panel, I emphasized five key aspects of 5G SA that make it a cornerstone of future connectivity.

  1. Clean-Slate Architecture: 5G SA doesn’t rely on legacy technologies. This enables optimized network design, enhanced innovation, and greater flexibility.
  2. Cloud-Native Core: With a cloud-native foundation, operators can scale services dynamically, implement tailored network slices, and respond in real time to evolving user needs.
  3. Mobile Private Networks (MPNs): These enable businesses to deploy their own secure, private 5G networks that offer enhanced security, control, and reliability. MPNs also enable enterprises to run mission-critical applications independently from public networks.
  4. Network Slicing: This enables the creation of virtual, customized networks that cater to specific application requirements, such as ultra-reliable connectivity for autonomous vehicles or low-latency service for Smart Cities.
  5. Support for Key Traffic Types: The flexibility of 5G SA accommodates enhanced mobile broadband (eMBB), massive machine-type communications (mMTC), and ultra-reliable low-latency communication (URLLC), optimizing the network for a wide variety of use cases.
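The slicing and traffic-type concepts in points 4 and 5 can be illustrated with a small sketch. This is purely conceptual, with hypothetical field names rather than 3GPP parameters or any operator’s API: each slice profile pairs one of the three 5G SA traffic types with the service guarantees it exists to provide.

```python
# Conceptual sketch (field names hypothetical, not 3GPP-defined): a slice
# profile binds a 5G SA traffic type to the SLA targets an operator commits to.

from dataclasses import dataclass

@dataclass(frozen=True)
class SliceProfile:
    name: str
    traffic_type: str        # "eMBB", "mMTC", or "URLLC"
    max_latency_ms: float    # latency bound the slice is engineered for
    min_reliability: float   # e.g. 0.99999 for mission-critical traffic

SLICES = [
    SliceProfile("video-streaming", "eMBB",   50.0, 0.999),
    SliceProfile("smart-meters",    "mMTC",  500.0, 0.99),
    SliceProfile("autonomous-agv",  "URLLC",   5.0, 0.99999),
]

# An operator would admit a use case onto the slice whose guarantees match it.
urllc = [s for s in SLICES if s.traffic_type == "URLLC"]
print([s.name for s in urllc])   # ['autonomous-agv']
```

The value of slicing is exactly this separation: the same physical network serves all three profiles, but each virtual slice is held to its own guarantees.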

5G’s potential is immense — but its deployment in Europe has been slower than anticipated. To date, just 18 operators have launched 5G SA networks in Western Europe, and only a handful have commercialized network slicing capabilities.

Wi-Fi: The Complementary Force

5G offers compelling advantages, but Wi-Fi continues to be a dominant force, especially in residential and enterprise environments. The ubiquity of Wi-Fi, its ease of deployment, and lower cost make it an attractive option for fixed-location connectivity. However, Wi-Fi has limitations in mobility, security, and reliability — which is where 5G shines.

During the panel, we discussed how Wi-Fi remains ideal for specific customer requirements, such as indoor environments or smaller businesses with less demanding connectivity requirements. However, when mobility, low latency, and security are paramount, 5G emerges as the superior choice.

When 5G Outshines Wi-Fi

The results of IDC’s European 5G/IoT Survey 2024 highlighted that organizations are increasingly demanding mobile connectivity that extends beyond fixed locations. Nearly 70% of respondents said yes when asked, “Does your organization need mobile connectivity that extends beyond a fixed campus or location for anything other than personal devices?”

Businesses are looking for mobile solutions that enable them to monitor supply chains, manage remote operations, and ensure connectivity in dynamic environments.

Security remains a top concern, with 33% of survey respondents identifying enhanced security for data transmission and communication as their primary challenge. This has led to an increasing preference for keeping data in-house: Almost 49% of businesses cited trust and security concerns as a key reason for this choice.

This is where 5G MPNs come into play, offering businesses the security and control they need to manage sensitive data while generating new revenue streams through advanced digital services. According to IDC’s forecasts, the European MPN managed services market is expected to expand to a value of $818 million in 2028, with the MPN professional services market (including integration and consulting) projected to reach $615 million the same year.

In industries where deploying MPNs may not be feasible — such as public transportation or emergency services — 5G SA network slicing offers a flexible, secure alternative. With 5G network slicing, operators can create customized virtual networks, guaranteeing service-level agreements (SLAs) and ensuring reliable service for applications like connected ambulances or public transport vehicles. More than one-third (36%) of respondents in IDC’s European Telco Survey 2024 identified network slicing as a key driver of implementing 5G SA.

The reasons why businesses might choose 5G over Wi-Fi in certain campus or short-range scenarios include:

  • Security and End-to-End Control: 5G operates on licensed spectrum, offering higher levels of security compared to Wi-Fi, which uses unlicensed spectrum and is more vulnerable to interference and attacks. 5G networks enable operators to control and secure every part of the network from end to end, making it ideal for industries in which data protection is crucial.
  • Mobility: When mobility is important — such as in scenarios involving moving machines or vehicles — 5G excels due to its ability to maintain seamless connections during handovers between cells. Wi-Fi struggles with handover scenarios, leading to potential service drops when devices move across different access points. This makes 5G the better option for uninterrupted service in mobile environments.
  • Reliability in Aggressive Radio Environments: In radio-aggressive environments such as factories, where machines and boxes create interference, 5G’s micro-diversity and advanced signal-handling capabilities make it more reliable than Wi-Fi. 5G’s ability to handle dark zones (areas with poor signal coverage) through reconfiguration also ensures consistent performance, whereas Wi-Fi may struggle in these areas.
  • Ability to Offer SLAs: 5G allows network operators to guarantee SLAs, providing commitments on performance, uptime, and latency. Wi-Fi cannot consistently offer these assurances. This is especially important in industrial applications requiring high reliability and low latency. 5G can provide predictable and measurable outcomes.
  • Control Over Different Parts of the Network: In 5G networks, operators have full control over network slices, traffic, and reconfigurations. This is essential in environments like manufacturing, where specific areas may need different levels of service or control. Customization at this level is difficult to achieve with Wi-Fi.

Blending 5G and Wi-Fi: The Future of Connectivity

Ultimately, the future of networking lies in the integration of 5G and Wi-Fi technologies. Each serves a distinct role in addressing the varying demands of consumers and businesses. Smartphones, for example, effortlessly switch between Wi-Fi and 5G depending on network quality, and this hybrid approach will likely become the norm across multiple industries.

Looking ahead, the combination of 5G’s robustness and Wi-Fi’s accessibility will enable a more flexible, efficient, and connected future. Telecom operators will continue leveraging both technologies to build the next generation of networks that deliver high-speed, secure, and reliable connectivity for all.


For more info on addressing growth in the telco space, please register for the following webcast: Addressing the telco growth imperative in EMEA

Masarra Mohamad - Senior Research Analyst, European 5G Enterprise Strategies - IDC

Masarra Mohamad is a senior research analyst specializing in the connectivity and communications services markets, focusing on the changing networking requirements, trends, and competitive dynamics that support enterprises in their digital transformation. She explores how enterprise network strategies are evolving to enable cloud, AI, and security.