A couple of months back, on my way to a medical appointment, the service loaner vehicle I was driving suffered significant tire damage from a road hazard. The car detected the collision and immediately offered to connect me with an Emergency Response Specialist.

The agent confirmed that I was safe and dispatched a tow service for the vehicle to be towed back to the dealership. Unfortunately, this was about the only high point of the experience! My requests for a replacement vehicle were met with rude and indifferent responses from the emergency agent as well as customer service personnel at my dealership. I was forced to stay with my vehicle until the tow service arrived despite mentioning that it was crucial for me to make my doctor’s appointment. No alternative transportation options were offered, and I had to wait for someone from my family to finally drop me off at the doctor’s.

This is a classic, and unfortunately all too common, example of an experiential disconnect. Where the brand failed on customer experience (CX) was in not being able to connect the different pieces of my loaner vehicle journey to my expected outcome. Past insights, such as my tenure as a customer, service/sales referrals, past service records, the amount spent at the dealership, and my long-standing relationship with my service advisor, felt neglected.

Intelligent orchestration allows enterprises to bring forward relevant insights from customer, organization, and other stakeholder relationships, such as the insurance company, tow service, or rideshare companies, and apply these insights to the current interaction. Making these complex connections across numerous contexts within a single customer interaction, and correlating them to understand and meet a customer outcome, is where generative AI (GenAI) shines.

GenAI enables deeper and more accurate contextual awareness in customer engagements and more accurate recognition of customer intent. As a result, experiences are closer to customer desired outcomes.

With its ability to apply generative foundation models that can be trained on extraordinarily large amounts of data, GenAI is positioned well to strengthen foundational capabilities that power intelligent experience orchestration. These include:

Capturing richer customer context: Context defines a customer’s immediate need(s) and refers to the customer’s updated preferences, prior and current actions, behavior, sentiment, intent, location, goal/purpose, and circumstances.

Customer conversations are built on semantics, structures of concepts and observations that need to be inferred – an area where GenAI vastly outpaces predictive and interpretive AI.

GenAI was born to find patterns and correlations in unstructured data (e.g., sentiment, intent, emotion), trained on vast collections of textual data organized by discovered proximities of usage.

Improved context continuity: Context continuity means being able to bring forward relevant insights from past customer, organization, and ecosystem data, actions/inactions, and outcomes, and apply these insights as relevant to the current engagement scenario.

The 2023 Future of Customer Experience survey found that just about a fifth of enterprises globally have the capability to maintain context continuity for all customers across all their brands. Other factors, which will continue to impact enterprises’ ability to manage context continuity, will be the massive proliferation of channels as well as the rise of connected customer data and insights. With GenAI, customer journeys can be infused with insights across multiple modes/channels.

Anchor customer journeys on outcomes (vs. outputs): In the earlier service incident example, the brand’s loaner vehicle journey only focused on making sure that the vehicle could be returned to the dealership – i.e., the output, and an inward-looking, organizational output at that. While required, this action raises the question of whether customer outcomes were even considered as part of the design.

GenAI inherently uses a declarative approach where a goal is the starting point. Combined with foundation models trained on a vast knowledge base of richer contextual customer data, GenAI can even help enterprises begin journey design with specific customer outcomes. In addition, GenAI’s active learning capability can adjust customer interactions and journeys to meet these outcomes by accounting, in real time, for changes in customer needs, emotions, and intent in each interaction.

Improved system of connected insights: Optimizing orchestration depends on industrializing a system of connected insights – i.e., consuming a continuous stream of accurate and authentic customer intelligence.

The ability to consume vast amounts of unstructured data across channels/modalities offers enterprises a low-cost way to make collection of insights a byproduct of their customer engagements and not a separate process. Industrializing insights also includes wiring a deep understanding of customers into the company’s day-to-day actions – essentially, customer insights driving the business model.

Real-time journey automation and optimization: A key part of delivering intelligent orchestration is the fundamental automation capability required to connect data, tasks, and outcomes together. At its core, automation comprises data and process connectivity and correlating insights to determine and execute actions.

With customer journeys becoming more non-linear, GenAI can make journeys more dynamic and adaptable. The model can evolve new responses based on changing customer or business events while keeping the customer outcome constant, such as handling the exception noted in the earlier service loaner vehicle example. GenAI’s LLMs can suggest alternative journey steps or even navigational pathways (multiple journey steps), essentially redesigning journeys dynamically.

While GenAI excels at connecting and orchestrating insights at scale, it will not solve for the most crucial gap plaguing customer experience transformations – delivering value parity. Parity in the value exchange with customers means that the customer and the organization, equally, get something meaningful out of the exchange. Customer value parity is crucial since an imbalance can lead to loss of customer trust, and often, ends in customer attrition.

The age of AI Everywhere promises to offer enterprises significant experience-based market differentiation. However, to grow profitably in a highly competitive digital economy, enterprises must capitalize on intelligent orchestration, anchor on customer desired outcomes, and aim to achieve value parity between customers and brands.

Visit our Future of Customer Experience website to learn more about achieving customer empathy at scale and get more insights on our thought leadership for how intelligent customer experiences can drive profitable growth.

Sudhir Rajagopal - Research Director, Future of Customers and Consumers - IDC

Sudhir is Research Director for the CMO Advisory Service, focused on creating and executing programs and research to help companies make data-informed decisions about marketing. Sudhir's research and advisory focuses on how organizations must consider transforming their marketing function with AI at the center. In his role, Sudhir monitors the continual innovation of technologies, business strategy, and customer experiences to empower marketing leaders to make decisions on marketing strategy and operationalization.

It should be no surprise that there were plenty of device-related announcements at Mobile World Congress in Barcelona last week, with this year’s AI buzzword featured prominently across them. For those who weren’t able to attend or just need a summary, here is a quick catchup, particularly on the phone side:

  • Vendors like Honor and Xiaomi showed off their flagship phones, complete with a 7B parameter on-device LLM and eye-tracking in the case of the former and AI-enhanced photography in the case of the latter.
  • Silicon providers Qualcomm and MediaTek were in the same hall, similarly showcasing on-device AI with the former announcing its AI Hub with 75 models for partners to use, while the latter demonstrated an impressive SDXL Turbo image creation on-device.
  • Industry heavyweights Samsung and Google were there too. Even though they had already made their announcements in the weeks prior for a head start on the industry, the event still provided good hands-on time for attendees to try features like Circle to Search and Live Translate.

On-device AI is obviously still a nascent development whose use cases need to be solidified, but we nonetheless expect 15% of the total smartphone market to be AI-enabled this year, even if most of those phones will probably sell more on the premise that they are simply flagship phones. Either way, vendors like Samsung in particular will likely keep a strong position in the market given their established presence and brand name.

PCs usually aren’t the center of attention at MWC, especially given the opportunity to showcase them at the Consumer Electronics Show in Las Vegas the month prior. But this is a critical year for the PC industry as it also talks up the idea of AI PCs. Intel used this moment to introduce its commercially-focused vPro versions of its Core Ultra chips, complete with design wins from Lenovo and Dell. Even phone vendors were in play: Honor rolled out its MagicBook Pro 16 while China’s Transsion showed off a Core Ultra-based notebook under its Tecno brand name, complete with a gigantic 99.99Wh battery. IDC expects 20% of PCs this year to be AI-enabled.

A notable – if longer-term – narrative that emerged was the idea of abstracting away from apps so that users interact with a goal-driven AI layer of Large Action Models without having to manually go in and out of apps to get tasks done. Deutsche Telekom worked with Brain Technologies on its Concept AI Phone, while Honor and ZTE also talked it up under their respective Magic Portal “intent-based UI” and NebulaOS names. This is still largely theoretical, with lots of work needed in the years ahead, but it is an intriguing concept that could spur unconventional hardware like the Humane AI Pin and the Rabbit R1, not to mention the direction Meizu seems to be pivoting toward. Rabbit and Meizu were not present at the event, but Humane got plenty of attention through demos at the Qualcomm booth.

AI wasn’t the entire story though. For good measure, here are a few other eye-catching items for us:

  • Lenovo showed off its ThinkBook Transparent Display as well as Motorola’s bendable Adaptive Display, both of which were just concepts. Samsung finally showed off its Galaxy Ring hardware; it was encased in glass but still drew crowds.
  • Transsion had not just a large Tecno booth and launch, but also a small offsite media event for Infinix where it showed off technologies like wireless charging. ZTE unveiled a range of devices including an aggressively priced US$599 Nubia Flip as well as a Nubia Music phone featuring a large speaker that caters to common use cases in developing markets. Qualcomm also mentioned a plan for $99 gigabit 5G phones but withheld more details until later in the year.
  • A number of offsite vendors used the week to introduce their latest efforts, including HMD’s Barbie co-branding, OnePlus’ Watch 2 with dual OSes and chipsets for long battery life, and OPPO’s improved Air Glass 3. Nothing provided a glimpse of its upcoming Phone (2a), which uses a custom processor developed through MediaTek’s DORA program.

Bryan Ma - Vice President - IDC

Bryan Ma is Vice President of Client Devices research, covering mobile phones, tablets, PCs, AR/VR headsets, wearables, thin clients, and monitors across Asia as well as worldwide. Based in Singapore, Bryan provides insights and advisory services for both vendors and users, and coordinates his team of analysts in building IDC's core market data, analysis, and forecasts in these sectors. Bryan has been quoted in a number of publications, including The Wall Street Journal, The Economist, The Financial Times, BusinessWeek, The South China Morning Post, and The New York Times. He has been a featured speaker at numerous industry conferences and appears frequently as a guest commentator on television networks such as CNBC, Bloomberg, and the BBC.

Governments across Europe, the Middle East, Africa (EMEA) and beyond are busy experimenting with and scaling AI and GenAI (generative artificial intelligence) use cases. The French and U.K. central governments’ GenAI-powered virtual assistant projects — in one case targeted at civil servants and the other at citizen chatbots — show the high level of interest and the early stages of maturity. Also in France, a large language model (LLM) is being introduced to improve the processing of legislative proceedings.

According to IDC EMEA’s 2023 Cross-Industry Survey, the government sector currently has the second-lowest level of adoption of GenAI in comparison to other industries (ahead of only agriculture). But the government sector has the highest percentage of organizations that plan to start investing in it over the next 24 months. Some government entities are taking a more cautious approach, putting restrictions on the use of commercial GenAI platforms, while considering developing their own LLMs.

This phenomenon is not new in the public sector. For several reasons, governments usually have a slower rate of adoption of new technologies.

One is that the public sector is obligated to guarantee access to its services for everyone. Government bodies thus require longer to test innovative technologies in order to deliver inclusive outcomes. Legal requirements can also constrain technology procurement, as can limited capacity and competencies.

The current AI investments are all critical steps toward realizing the benefits of data and AI in government — but they are not sufficient. Beyond operational use cases like virtual assistants, summarizing council meetings, expediting code development and testing for software applications, flagging risks of fraud in procurement and tax collection, and drafting job requisitions, governments need to think of the long-term impacts of AI and GenAI.

They need to think of what will happen when AI is used pervasively across industries and is widely accessible by individuals on their smartphones — when the potential benefits and risks of AI will impact government operations well beyond the current stage of maturity and affect the government’s role in society.

The Potential Impact of AI and GenAI on Future Government Operations and Policy

AI has been used in government — particularly by tax, welfare, public safety, intelligence, and defense agencies — for more than a decade. But the advent of GenAI indicates that existing AI applications only scratch the surface of what’s possible.

Government Operations

From a government operations perspective, AI- and GenAI-powered chatbots are just the beginning. European and United Arab Emirates government officials that we recently spoke with are already thinking about how the next generation of virtual assistants could entirely replace government online forms and portals.

For example, a natural language processing algorithm trained to recognize languages, dialects, and tones of voice could enable citizens to apply for welfare programs, farming grants, business licenses, and more just by sending voice messages.

An AI-powered system combining an automatic speech recognition system and an LLM would comb through voice messages to identify the entity (individual or business) making the request and the key attributes, then feed the data to an eligibility verification engine. No forms would need to be filled in manually.
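To make the three-stage flow described above concrete, here is a minimal sketch of such a pipeline. Everything in it is an illustrative assumption, not a real government system: the transcription and LLM extraction stages are stubbed with canned outputs, and the registry, farm IDs, and grant names are invented for the example.

```python
# Hypothetical sketch of the voice-to-eligibility pipeline: ASR -> LLM
# extraction -> rule-based eligibility check. All names and data are
# illustrative assumptions; real ASR/LLM calls are stubbed out.

def transcribe(voice_message: bytes) -> str:
    # A real system would call an automatic speech recognition model here.
    return "I am Jane Doe, farm ID 4711, applying for the drought relief grant."

def extract_request(transcript: str) -> dict:
    # A real system would prompt an LLM to pull the entity and key
    # attributes out of the transcript; we simulate its structured output.
    return {
        "entity": "Jane Doe",
        "entity_type": "individual",
        "farm_id": "4711",
        "program": "drought relief grant",
    }

def check_eligibility(request: dict, registry: dict) -> bool:
    # Eligibility verification engine: the farm must exist in the registry
    # and be enrolled in the requested program.
    record = registry.get(request["farm_id"])
    return record is not None and request["program"] in record["eligible_programs"]

registry = {
    "4711": {"owner": "Jane Doe", "eligible_programs": ["drought relief grant"]},
}

request = extract_request(transcribe(b"<audio bytes>"))
print(check_eligibility(request, registry))  # True for this simulated message
```

The point of the sketch is the hand-off: only the structured output of the LLM stage reaches the deterministic eligibility engine, which keeps the final decision auditable, one partial answer to the transparency questions raised below.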

This scenario is not too far off. A regional government we spoke with is already collecting voice samples to test such a system for farming grant applications.

But multiple questions are raised. Legal and technical questions like: How and where should voice data be collected and stored to comply with GDPR? How can a citizen’s or business owner’s identity be verified through a voice message in compliance with GDPR and eIDAS? How can the government remain transparent and accountable for its decisions if there is not even a digital front end?

It also raises business and operational questions like: Will such a system really replace online forms — or instead become an additional channel that segments of the population use, thus pushing the volume of requests to a level that causes delays in government responses? Will the pervasive use of GenAI in the private sector multiply that volume effect?

Will lawyers’ pervasive use of GenAI incentivize them to file more proceedings, even ones they do not expect to win, because it is so easy that they may as well try? How will government business, legal, operational, technical, and functional capabilities evolve to cope with these challenges?

Policy

From a policy perspective, the spectrum of open questions is expanding by the day. One of the most critical questions, and one that many are thankfully already asking, is about the impact of AI-powered automation on the job market.

If workers are displaced by AI-powered automation, there is no silver bullet. Training programs are not fast enough and may not work for everybody.

Universal basic income can be part of the recipe. But how much is affordable and what is the right level of income? Will the government need to consider employing more people to cushion a drop in employment in other industries?

If so, are roles requiring both expertise and empathic interactions, such as education, healthcare, and social care, the right public sector domains to do so? If new jobs appear on the market, how does that impact worker social protection policies?

In a year when half of the global population will be asked to cast a vote, the impact of AI on democracy is also called into question. AI is already generating a surge in misinformation and increasing risks of polarized political positions.

What if the attempt of mainstream media to protect copyrights from web crawlers used to feed LLMs unintentionally opens the door for bad actors to make even more misinformation available to train GenAI? Does the government need to establish counter-misinformation authorities or issue laws and guidelines that hold the private sector accountable to do so?

If a government authority is established, how can it ensure public oversight and independence from the already existing cyberunits of defense and intelligence departments, which have a different mission? In France, a recent debate over media independence and balanced journalism might be settled by AI analyzing speeches and attendees to ensure pluralism. But who would train such a democratic judge of pluralism?

What about the government’s ability to regulate private markets? What if AI and GenAI accelerate medical science through analysis of vast amounts of real-world health data that have been historically hard to collect and prepare for algorithm training? What if, for example, such an acceleration in medical sciences finds a cure that diabetics can use to treat their disease once and for all, instead of having to take medication for the rest of their lives? What would be the impact on the revenue model of pharma companies? Will governments have to change intellectual property rights entirely to make sure that pharma companies invest in such treatments and make them affordable to all diabetics around the world?

The same goes for cultural companies and intellectual property. What would be the role of governments in ensuring that culture workers can continue to participate in the entertainment industry and in the creativity and identity of a country through their art?

Finally, what are the ethical implications of using AI in warfare? There are already systems that can alert snipers of targets. What is their impact on the rules of engagement on the battlefield and on the accountability of the individual soldier and the chain of command?

These are big questions that require technology, legal, policy, ethical, and process experts to come together. They cannot be left to the chief information officer or the chief data officer. And they require civil service and policymaking leaders to engage openly with the public, with academic and private sector experts, to avoid the risks of being influenced (or perceived being influenced) only by lobbyists. They require international collaboration. They require measuring the value of AI not just in terms of productivity, but also in terms of fairness, robustness, responsibility, and social value.

Remi Letemple - Senior Research Analyst, IDC Government Insights - IDC

Remi Letemple leads IDC’s Worldwide Sustainable Transportation and Smart Vehicles Strategies service, where he provides strategic guidance and thought leadership on the future of mobility and transportation. Operating at a global level, he is recognized as a subject matter expert in smart mobility and transportation technologies—including connected, autonomous, shared, and electric mobility—enabled by software-defined vehicle (SDV) architectures, over-the-air (OTA) updates, cloud and edge platforms, and AI, including generative AI.

The Mobile World Congress serves as the catwalk for showcasing the latest and most advanced mobile devices and technologies. Brands offering smartphones, tablets, wearables, PCs, smart home, mixed reality, and more brought forward their best-in-class products to highlight their latest innovations and offer a hint of the future of technology.

However, the technological landscape has changed significantly in recent years. Consumers are becoming more environmentally conscious, prompting brands to reevaluate their strategies and align with more sustainable practices.

While sustainability wasn’t prominently featured at the show, it was a recurring topic in most conversations. This year, we witnessed brands moving beyond the traditional approach of simply incorporating recycled materials and reducing carbon footprints. Instead, they are now planning, designing, and manufacturing devices with sustainability at the core.

Over the past few years, manufacturers have invested in key initiatives to make their products more sustainable. Some of the key approaches are:

  • Recycled Materials: The majority of manufacturers are increasing the percentage of recycled components in their products, ranging from packaging to the chassis of the devices.
  • Device Reparability: This involves making spare parts available and improving the design of their products to facilitate easier repairs.
  • Energy Efficiency: Device manufacturers are actively working on reducing power consumption by employing machine learning algorithms to efficiently allocate the right amount of processing power.
  • Extended Device Lifespan: Manufacturers are extending OS support, offering trade-in programs, and promoting the second life of their products.

Among the smartphone makers, brands like Fairphone have introduced modular smartphones to the market, emphasizing ethical sourcing, fair labor practices, and the environmental impact of their products. PC makers have also been raising the bar over the years. Examples include:

  • Acer’s Vero National Geographic Edition laptop is made of post-consumer recycled plastic.
  • HP’s ocean-bound plastics program offers laptops and notebooks made with plastic collected from the oceans.
  • Lenovo’s commitment to recycled plastics, incorporating post-industrial recycled content plastic and recycled metals into its products.
  • Dell’s Concept Luna is a proof-of-concept developed with Intel to design laptops in a way that makes components more accessible, replaceable, and reusable.
  • Apple’s use of recycled aluminum and other materials: the enclosures of most Mac laptops use 100% recycled aluminum.

Most brands are clearly positioning themselves as green choices in the PC market. The strategy seems to be shifting from proof-of-concepts around modularity, reparability, and the inclusion of recycled materials, to designing products in a way that allows for easy component replacement. This approach is driven by the understanding that designing products with repairability at the core, using features like easy-to-remove screws instead of extensive gluing, makes repairs more feasible. Modern PCs are well known for complicated designs that make them harder to repair when they stop working, forcing the user to buy a new one.

One of the most interesting announcements at MWC was the partnership between iFixit and Lenovo to expand PC repairability. iFixit is renowned for its repair resources and product teardowns. The company advocates for the right-to-repair movement, which has now been written into European Union legislation. Last year iFixit partnered with HMD to make spare components available for HMD smartphones, and this year it expanded into new collaborations in the PC space.

Lenovo and iFixit collaborated to co-design the ThinkPad T14 Gen 5 and T16 Gen 3, making them significantly easier to repair. An easier-to-remove rear cover, more accessible screws, an easy-to-repair keyboard, reinforced clips, a direct battery connector, open DIMM slots, and swappable port modules were incorporated, all practical and easy to replace. Almost everything can be repaired or replaced, but what is important in this announcement is that iFixit helped Lenovo design the laptops to ensure that all these parts can easily be repaired and replaced.

The complexity of accessing the interior of PCs and replacing components has been a significant challenge for users. Lenovo and iFixit address this concern by providing information about each component through a QR code. The only components that cannot be easily replaced are the motherboard and most of the ports.

The upcoming availability of the new ThinkPad machines in April this year indicates a shift from the conceptual phase to the real implementation of sustainable practices at the core of the business.

Discover IDC’s Current Insights on Devices:

Francisco Jeronimo - Vice President, Data & Analytics - Devices - IDC

Francisco Jeronimo is VP for Data and Analytics at IDC EMEA. Based in London, he leads the research that covers mobile devices, personal computing devices, emerging technologies and the circular economy trends across EMEA. His team delivers data on personal computers, tablets, smartphones, wearables, PC monitors, PC gaming, enterprise Thin Client devices, smart home, augmented reality and virtual reality, and sales of used devices. He provides in-depth analysis of the strategies and performance of the key industry players.

On October 19th, 2023, AMD announced new processors for the workstation and high-end desktop (HEDT) markets. The processors are based on 5nm Zen 4 architecture and offer up to 96 cores and 192 threads of performance.

The Ryzen Threadripper PRO 7000WX series of processors, which are designed for professionals and businesses that demand top-tier performance, reliability, expandability, and security, feature AMD PRO technologies and eight channels of DDR5 memory.

Meanwhile, the Ryzen Threadripper 7000 series signals AMD’s return to the HEDT market, offering overclocking capabilities and the maximum clock rates possible on a Threadripper-based CPU. Power, performance, and efficiency are all made possible by 5nm technology and Zen 4 architecture. The Threadripper 7000 series provides ample I/O channels for desktop users, with up to 48 PCIe Gen 5.0 lanes for graphics, storage, and more.

The new processors became available from OEM and system integrator (SI) partners, including Dell Technologies, HP, and Lenovo, as well as do-it-yourself (DIY) retailers, on November 21st, 2023.

On November 13th, 2023, AMD announced the Radeon PRO W7700, a new workstation graphics card that offers high performance, reliability, and top-notch price/performance ratios for professional applications. The new card bridges the gap between the high-end Radeon PRO W7800 (32GB GDDR6) and the entry-level Radeon PRO W7600 (8GB GDDR6). The 16GB VRAM graphics card supports DisplayPort 2.1, AI acceleration, and hardware-based codecs for video editing and production.

This review will focus on the AMD Ryzen Threadripper 7980X processor, with additional coverage of the AMD Radeon PRO W7700 professional graphics card.

Test System Details

AMD Ryzen Threadripper 7980X Processor

The AMD Ryzen Threadripper 7980X processor (non-pro) signals AMD’s return to the HEDT market, offering overclocking capabilities and the maximum clock rates possible on a Threadripper series CPU.

Power, performance, and efficiency are all made possible by 5nm technology and Zen 4 architecture, which are available for the DIY market and SI partners. The Threadripper 7000 series provides ample I/O channels for desktop users, with up to 48 PCIe Gen 5.0 lanes for graphics, storage, and more.

AMD Radeon PRO W7700

With 16GB of error correction code (ECC) memory, the AMD Radeon PRO W7700 easily handles data-intensive operations. In terms of visual fidelity, the card features the new Radiance Display Engine, which supports 12-bit high dynamic range (HDR) color and recreates over 68 billion unique colors with high precision.

The Radeon PRO W7700 GPU’s major features are its 48 unified RDNA 3 compute units, 48 second-generation ray accelerators, and 96 AI accelerators. The card has 16GB of GDDR6 ECC memory and four DisplayPort 2.1 (UHBR 13.5) connectors. The connectors, which provide up to 52.2 Gbit/s total bandwidth, are designed for 10K displays with 60Hz refresh rates, 2x8K displays, or 4x4K displays with Display Stream Compression technology.

AMD’s new dual media engine offers hardware-accelerated support for AV1 encoding, with the Radeon PRO W7700 capable of delivering 7680×4320 video at 60fps (8K60). The media engine supports two AVC and HEVC streams that can be encoded or decoded simultaneously. For live broadcasters, AMD has included many capabilities that increase both performance and quality.

Memory and Motherboard

We installed the Ryzen Threadripper 7980X processor on a Gigabyte TRX50 AERO D motherboard, alongside the G.SKILL Zeta R5 Neo DDR5-6400, CL32-39-39-102, 1.40V, 128GB (4x32GB) kit with AMD EXPO memory overclocking and ECC support enabled.

AMD Ryzen Threadripper CPUs only support DDR5 RDIMMs, LRDIMMs, and 3DS RDIMMs. Threadripper 7000 processors can handle up to 8 channels/2TB on PRO motherboards (based on 8x256GB DIMMs) and up to 4 channels/1TB on HEDT motherboards (based on 4x256GB DIMMs), with support for both single-rank and dual-rank modules at 5200MHz and a single DIMM per channel. ECC is enabled, although its functioning varies depending on the motherboard. The maximum official transfer rate varies by DIMM configuration, as with other AMD Ryzen CPUs.

Other Components

The Windows 11 main storage device was a 1TB GIGABYTE AORUS NVMe Gen4 solid-state drive. AMD provided a 360mm all-in-one water cooler; however, it did not completely cover the CPU surface. Instead, we used the Arctic Freezer 4U-M, an 8x6mm direct contact heatpipe tower cooler with 2x120mm fans in push/pull mode. This cooler is intended for the most powerful server and workstation CPUs with up to 96 cores and a thermal design power of up to 350W.

The be quiet! STRAIGHT POWER 11 Platinum 850W power supply powered the system. A 34″ Dell Gaming S3422DWG monitor — a Quad-HD 3440×1440 display with a 144Hz refresh rate, FreeSync, 10-bit colors, and HDR support — was also utilized.

Benchmarks

Blender Benchmark

Blender Benchmark version 4.0.0 was used to assess the AMD Ryzen Threadripper 7980X processor’s rendering performance. With a score of 1708.66, the processor’s performance ranked among the top 28% of benchmarks running the same workloads. Given the inclusion of GPU results, the CPU performed brilliantly.

In terms of GPU results, the AMD Radeon PRO W7700 ranked in the top 27% of benchmarks, with a slightly higher score of 1883.80. This reflects how strong the card is at GPU rendering, which is fantastic news for studios that rely on GPUs for production.

IndigoBench

IndigoBench v4.4.15 is another standalone benchmark based on Indigo 4’s rendering engine and the industry-standard OpenCL.

With a total score of 47.54 million samples per second, the Threadripper 7980X ranks fourth among the top CPU performances when using normal settings and no overclocking. The processor also outperforms the Threadripper 3990X and Pro 5995WX by 30% and 33%, respectively, demonstrating a significant generational jump.
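As a back-of-envelope check, the stated uplift percentages can be inverted to infer the approximate scores of the older chips (inferred from this paragraph’s figures, not separately measured):

```python
# Infer the older chips' approximate IndigoBench scores from the 7980X result
# and the stated generational uplift percentages (illustrative arithmetic only).
score_7980x = 47.54  # million samples/second, total score

def implied_baseline(new_score: float, uplift_pct: float) -> float:
    """Baseline score implied by 'new is uplift_pct % faster than old'."""
    return new_score / (1 + uplift_pct / 100)

for name, pct in [("Threadripper 3990X", 30), ("Threadripper Pro 5995WX", 33)]:
    print(f"{name}: ~{implied_baseline(score_7980x, pct):.1f} Msamples/s implied")
```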

PCMark 10

PCMark 10 is a comprehensive benchmarking tool that covers the wide variety of tasks performed in the modern workplace. Web browsing, videoconferencing, spreadsheet and word processing, photo and video editing, and rendering and visualization are some of the tasks tested by the tool.

The 8,772 score the test platform achieved was better than 98% of all results produced by PCMark 10.

CINEBENCH

The 2024 edition of Cinebench now includes a GPU benchmark that takes advantage of Redshift, Cinema 4D’s default rendering engine. The Radeon PRO W7700 scored 9,504, nearly matching the Radeon Pro W6800, which scored 9,643 (according to the test database). This result demonstrates the sophistication of RDNA 3 compute, given that the Radeon Pro W7700 has half the Infinity Cache and dedicated graphics RAM of the W6800.

Based on the 92,817 Cinebench R23 result, the AMD Ryzen Threadripper 7980X CPU is nearly three times faster than the Ryzen 9 7950X. This result demonstrates that the Threadripper is in a class of its own and is a much-needed high-performance solution.

3DMark CPU Profile

This test stresses the CPU at various levels of threading while reducing the GPU burden, ensuring that GPU performance is not a limiting factor. It takes advantage of sophisticated CPU instruction sets supported by different processors, including Advanced Vector Extensions 2 (AVX2). It also leverages the straightforward, highly efficient simulations provided by the SSSE3 code path.

With standard settings and no overclocking, the AMD Ryzen Threadripper 7980X CPU score of 25,374 qualifies for 3DMARK’s MAX Threads Hall of Fame. It ranks among the top 100 benchmark scores ever recorded, holding 25th place in a chart otherwise dominated by the world’s most skilled overclockers.

V-Ray 6 Benchmark

The V-Ray Benchmark, which uses the V-Ray 6 render engine, was used to gauge the system’s rendering speed.

With a vsamples score of 120,247, the AMD Ryzen Threadripper 7980X CPU is nearly twice as fast as the Threadripper Pro 5995WX and 3990X, representing a considerable generational leap.

SPECworkstation

The SPECworkstation 3.1 Benchmark fully assesses workstation performance across a variety of professional applications.

The AMD Ryzen Threadripper 7980X scores higher across all application groups that lean on the processor (such as financial services). The exception is GPU-bound workloads, due to the use of the Radeon PRO W7700, a midrange professional graphics card. Higher results across all application groups could be achieved with the Radeon Pro W7800 or W7900.

Gaming

Since many professional gamers and streamers have used HEDTs in the past to support multitasking — playing games, encoding and recording gameplay, and streaming to several web platforms — the Threadripper’s gaming performance was evaluated on this professional test platform. Professionals who enjoy playing games would undoubtedly prefer not to invest in another gaming PC after paying a premium for this test platform.

Shadow of the Tomb Raider ran at an average 61 frames per second (fps) at 1440p, with a minimum of 42fps. The highest graphical settings, as well as AMD’s FidelityFX CAS package, were enabled. Surprisingly, using XeSS for upscaling boosted performance by 10% at the same settings, achieving a minimum of 50fps and an average of 66fps. This might be a demonstration of the AI accelerators in the Radeon Pro W7700’s RDNA 3 architecture.

Far Cry 6 ran at an average 104fps at 1440p, registering a minimum of 92fps. All DirectX Raytracing (DXR) and FidelityFX Super Resolution (FSR) features were enabled during testing.

Cyberpunk 2077 ran at an average 36fps at 1440p, registering a minimum of 28fps. Ultra-ray tracing presets and FSR 2.1 features were automatically enabled.

The fact that the gaming results were 100% GPU bound indicates that the CPU was never a bottleneck and that employing top-tier gaming cards can improve gaming performance.

IDC Opinion and Conclusion

When AMD announced the Threadripper 5000 series in the Pro-only category, primarily for OEMs, the enthusiast community was left feeling let down. However, we are pleased that AMD did not abandon those customers for too long. AMD brought this category back to life after realizing — as its competitor had already done — that this is a prestigious and necessary niche market that cannot be satisfied by high-end consumer CPUs.

We are also pleased to see that the HEDT refresh on the Threadripper 7000 platform supports the newest and greatest in networking and connectivity with excellent I/O support, including PCIe 5.0 and DDR5 ECC registered memory modules (RDIMM/RDIMM-3DS), in addition to USB4 Type-C, 10 gigabit ethernet (10GbE), and Wi-Fi 7.

In the past, it was impossible to reach extremely high speeds while remaining stable and keeping voltage and temperature under control. This CPU, however, is remarkably quick, snappy, and opportunistic: it can surge up to 5.1GHz when just a few cores are in demand, and sustains 4.1 to 4.7GHz when all cores are stressed, which is incredible. Furthermore, attaining memory rates of up to 6400MHz is another productivity breakthrough, as it was previously difficult to overclock ECC RAM above the norm.

Aside from its intense performance, efficiency is the most striking aspect of the processor. Under full load, the Threadripper 7980X’s power consumption did not go over 340W. High-end consumer CPUs with fewer cores use the same amount of energy.

Although the Radeon Pro W7700’s power draw stayed under 140W, we were not as satisfied with its clock speed, and suspect a higher frequency was possible but purposefully regulated. With our 850W platinum power supply, we had no trouble operating the system overall, and were even able to install it in a midi tower case.

We would love to see more partner solutions for cooling to fully cover the processor’s integrated heat spreader as well as motherboard support for extreme high-end use cases that require up to seven or eight graphics cards. The Threadripper 7000 series is more than capable of handling booming AI, machine learning, and training solutions — as well as media production and automotive rendering workloads — when needed on desktop platforms.

AMD should consider a system integrator (SI) certification scheme, similar to AMD Advantage in gaming. By doing so, it can provide customers with reliable and better experiences on an all-AMD platform featuring the Threadripper and the Radeon PRO. This strategy will strengthen trust in the AMD brand and help SIs compete against OEMs with ISV-approved devices.

In conclusion, the AMD Ryzen Threadripper 7980X reigns supreme among HEDT CPUs. It delivers great performance straight out of the box, with most cores running at the highest clock speeds in a very energy efficient manner.

Mohamed Hakam Hefny - Senior Program Manager - IDC

Mohamed Hefny leads market research in EMEA on professional workstation PCs and solutions. He also reports on professional computing semiconductors, processors, and accelerators (CPUs and GPUs), as well as breakthroughs and trends related to the market. In addition, Mohamed is actively involved in AI PC taxonomy and research. He participates in business development projects, contributes to consulting activities, and provides IDC customers with analysis, opinions, and advice.

From its inception, the telecommunications industry has leveraged automation to enhance services and user experiences. As AI takes center stage, IDC surveys have shown that the primary use case for telco AI will be the improvement of customer experience (CX).

Telco AI: What’s Already Been Done?

The advent of telco AI can be seen as early as the beginnings of mechanical telephone switching in the 1890s. The introduction of the mechanical switch revolutionized the way callers connected, leading to faster connections and effectively managing the exponentially increasing complexity of connections as landline phone penetration skyrocketed.

“That’s not AI — that’s just automation!” you may cry. But the impact on the workforce of manual switch operators was profound. And this shares some similarities to the transformative effect that generative AI (GenAI) applications are having on creative professionals today.

In recent history, the visible face of AI in telecoms is the ubiquitous digital customer service agent — the chatbot. Examples like Vodafone’s TOBi, launched in 2017, showcase the initial steps toward automated customer interactions.

These applications, however, often struggle when customers deviate from predetermined scripts. Beneath the surface, telecom networks rely heavily on AI and automation to optimize services, route network traffic, monitor anomalies, and analyze customer interactions to recommend tailored product bundles.

What Telco AI Use Cases Will Be Big in 2024?

The successful launch of OpenAI’s ChatGPT in 2022 significantly elevated industry expectations for AI applications. Throughout 2023, experimentation accelerated, particularly in telecom CX, software coding support, and knowledge management.

In 2024, these use cases are set to expand into production environments, with continued exploration of how predictive and generative AI can support existing telecoms use cases.

Two key CX use cases are customer-facing chatbots with enhanced natural language understanding, and AI customer sentiment analysis and personalization. By leveraging large language models (LLMs) and retrieval augmented generation (RAG), chatbots will be able to answer customer questions like, “Why is my bill higher this month?” Such capability was extremely rare previously. Telcos such as BT, DT, Orange, and Vodafone are exploring these capabilities.
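To make the RAG idea concrete, here is a minimal toy sketch of the retrieve-then-generate flow. The billing records are invented, the retriever is a naive keyword scorer rather than a vector search, and `generate_answer` is a placeholder for a real LLM API call:

```python
# Minimal sketch of retrieval augmented generation (RAG) for a billing question.
# Production systems would use embeddings for retrieval and an LLM for generation.
BILLING_RECORDS = [
    "january invoice: base plan 30 EUR plus roaming charges 12 EUR",
    "december invoice: base plan 30 EUR, no extra charges",
    "plan terms: roaming outside the EU billed at 2 EUR per day",
]

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    words = set(question.lower().split())
    scored = sorted(documents, key=lambda d: -len(words & set(d.lower().split())))
    return scored[:top_k]

def generate_answer(question: str, context: list[str]) -> str:
    """Placeholder for an LLM call: the prompt would combine question + context."""
    return f"Answering {question!r} using {len(context)} retrieved records."

context = retrieve("why is my bill higher this month", BILLING_RECORDS)
print(generate_answer("why is my bill higher this month", context))
```

The key design point is that the model answers from retrieved account data rather than from its training corpus, which is what lets a chatbot handle account-specific questions.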

Beyond CX, AI will bolster coder productivity with solutions like Microsoft’s GitHub Copilot and Amazon CodeWhisperer. Investment will go toward internal chatbots and knowledge management tools across departments, including sales, HR, legal, and network operations.

How AI Will Shape Telco CX by 2030

Looking to 2030, AI’s role in telecoms will become even more customer-centric. For example, energy efficiency solutions, currently focused on macro-networks, could be extended to customer devices, prolonging battery life.

Direct changes in customer interactions will manifest in advanced chatbots offering complete digital sales experiences. These chatbots will craft personalized packages based on customer preferences and budgets, eliminating the need for human intervention.

Moreover, this evolution in chatbots will align with the rise of metaverse environments that will incorporate visual representations of AI agents and use features like AI-driven body language to boost customer engagement in a 3D environment.

In summary, 2024 sees the telecoms industry again at the forefront of significant transformations, propelled by AI’s ability to automate tasks and deliver an elevated customer experience. At IDC, we will continue to cover the development of AI technologies and the telecoms industry in depth, with some of our most recent reports focusing on the telecoms GenAI value chain and the AI-driven evolution of telco CX platforms.

Chris Silberberg - Research Manager, Communication Service Provider Operations and Monetization - IDC

Chris Silberberg is Research Manager for IDC's global Communication Service Provider Operations and Monetization research. Chris' core research coverage includes the evolution of telco monetization, customer experience, orchestration, and assurance capabilities. Telcos are at a crossroads: double down as utility providers or become digital service powerhouses. Both strategies demand communication service providers fundamentally transform their IT capabilities to enable customer-first experiences, autonomous operations, and the capacity to innovate monetization models at scale.

Many B2B marketers are wondering if they can focus on both customers and data at the same time. Can you be customer-focused and data-focused at once?

The quest to resonate with customers through B2B marketing content while leveraging data-driven insights has become paramount. Content marketing services now prioritize building relationships with the audience and providing value over traditional sales tactics. By focusing on relationships and value, companies can better engage their target audience, build trust, and create more meaningful connections, ultimately leading to increased brand loyalty and customer retention.

Using analytics helps marketers better understand consumer behavior and adjust their B2B customer journey strategies accordingly. With all these new developments, can a company balance focusing on customers and using data for marketing effectively?

Let’s dive into the depths of these two methodologies and explore the potential for synergy:

Understanding Customer-Centric Marketing

Customer-centric marketing focuses on prioritizing the customer’s needs, preferences, and experiences in all marketing efforts. It’s important to understand the audience, empathize with their challenges, and create valuable solutions for their B2B customer journey. In B2B tech, where solutions are complex, building trust and relationships is crucial.

The Essence of Data-Centric Marketing

Data-centric marketing uses data analytics to understand consumer behavior, trends, and preferences for making informed decisions. By meticulously analyzing metrics, marketers can uncover patterns, identify opportunities, and optimize their strategies for maximum impact. From tracking website interactions to monitoring social media engagement, data serves as the compass guiding marketing efforts towards greater effectiveness and efficiency.

The Interplay Between the Two

While customer-centric and data-centric marketing may appear dichotomous, they are not mutually exclusive. In fact, they can complement each other synergistically to drive superior results. Here’s how:

Personalized Experiences: Data analytics enable marketers to segment their audience based on demographics, behaviors, and interests. By using this data, companies can create tailored content and experiences for different groups, enhancing their customer-focused strategy.

Iterative Optimization: Marketers can improve their strategies by analyzing data and feedback from customers. They can make adjustments to better meet customer preferences and market trends. This iterative process fosters a culture of continuous improvement, reinforcing the customer-centric ethos.

Predictive Analytics: Leveraging advanced analytics techniques such as predictive modeling, marketers can anticipate future trends and consumer needs with greater accuracy. By addressing these needs early, businesses can stay ahead and be seen as trusted advisors by their customers.

Measurement of Impact: Data-driven methods help marketers measure the success of their customer-focused efforts with specific metrics, such as conversion rates and customer lifetime value. These metrics give important information about how well strategies are working and help marketers make necessary adjustments.
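The segmentation step behind these points can be sketched in a few lines. The segment names, rules, and thresholds below are invented for illustration, not an IDC or vendor model:

```python
# Toy audience segmentation: assign each account to a segment using simple
# behavioral rules (rules and thresholds are illustrative, not a real model).
def segment(customer: dict) -> str:
    if customer["purchases_last_year"] >= 10:
        return "loyal"
    if customer["webinar_signups"] > 0:
        return "engaged-researcher"
    if customer["days_since_last_visit"] > 180:
        return "at-risk"
    return "casual"

audience = [
    {"name": "Acme Corp", "purchases_last_year": 14, "webinar_signups": 2, "days_since_last_visit": 5},
    {"name": "Initech", "purchases_last_year": 1, "webinar_signups": 0, "days_since_last_visit": 300},
]
for customer in audience:
    print(customer["name"], "->", segment(customer))
```

In practice the same idea is driven by clustering or propensity models over CRM and web analytics data, but the output is the same: a segment label that downstream campaigns can tailor content against.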

Customer Data platforms will deliver high-quality data for predictive AI and GenAI, activating 80% of real-time personalized customer interactions at scale for G2000 firms with 4x engagement gains by 2026.

IDC FutureScape: Worldwide Future of Customer Experience 2024 Predictions. IDC #US50111423, Oct 2023

Challenges and Considerations

While the marriage of customer-centric and data-centric marketing holds immense promise, it is not without its challenges. Marketers must diligently navigate issues such as data privacy concerns, data silos, and the risk of algorithm bias. Balancing numbers and insights is important to keep the human touch in data-driven efficiency.

The convergence of customer-centric and data-centric approaches represents a powerful paradigm shift. Marketers can use data analytics to improve customer experiences and build strong relationships with their target audience, leading to business growth. Ultimately, it’s not a question of whether it’s possible to be a customer-centric and data-centric marketer simultaneously, but rather how effectively you can harness the synergies between these two paradigms to deliver exceptional value in an ever-evolving landscape.

How Will AI Elevate the Customer Experience in the Near Future?

In the near future, the integration of artificial intelligence (AI) promises to revolutionize the customer experience, offering unparalleled levels of personalization, responsiveness, and authenticity. Two key predictions from IDC shed light on the transformative potential of AI in shaping digital interactions and enhancing customer journeys:

  1. Real-Time Digital Experiences: AI algorithms will make digital experiences change in real-time based on user behavior, preferences, and context. Content and interactions will adjust dynamically to create personalized experiences. Whether it’s tailoring website interfaces, optimizing email campaigns, or refining product recommendations, AI-driven personalization will create seamless and engaging experiences that resonate deeply with customers.

Between 2024 and 2026, digital experiences will be updated in real-time based on measured analysis of content usage aligned to the customer journey.

IDC FutureScape: Worldwide Future of Customer Experience 2024 Predictions. IDC #US50111423, Oct 2023

  2. Individualized Personalization: With AI, customer interactions transcend personalization to be individually tailored to the timing and context of each customer’s journey. By analyzing vast amounts of data, including past interactions, purchase history, and browsing patterns, AI algorithms can anticipate customer needs, deliver relevant content, and engage customers at the right moment with the right message.

Customer interactions will be individually personalized (e.g. subject line, send time, content, images, preferred channel) and have the timing and context of each customer’s journey, proving that the brand understands their needs in the context of now. These personalized interactions improve customer engagement across all types of content assets.

IDC FutureScape: Worldwide Future of Customer Experience 2024 Predictions. IDC #US50111423, Oct 2023

Data is King

At the heart of AI-driven customer experiences lies the importance of data. People are more aware of their personal data and rights. Marketers must use public and private data carefully. They need to create content that connects with customers. This should be done while respecting their privacy and preferences.

The Importance of Quality Content

In the era of AI, content remains a cornerstone of informed purchasing decisions, particularly in B2B tech marketing. But too much similar content just creates noise.

In 2026, the number of content creators who make money from content will top 800 million, up from 500 million in 2023.

Ten IDC Generative AI Predictions Influencing Persuasive Content Management and the Customer Experience. Jan 2024 – Document type: Tech Buyer Presentation – Document number: US51801424

Quality over quantity is imperative. Focus on creating content that empowers buyers to research, compare, and evaluate solutions online, driving confidence in their purchasing journey. In a landscape where trust and authenticity reign supreme, AI can help distribute content and engage customers, but brand trust must be maintained: mistakes can quickly cost customer confidence and loyalty.

AI and customer experience coming together means brands can now have more personalized, responsive, and authentic interactions with customers. By harnessing the power of AI-driven insights and data analytics, marketers can unlock new opportunities to deepen customer relationships, drive brand loyalty, and shape memorable experiences that resonate long after the initial interaction. As AI continues to evolve and integrate into marketing strategies, the imperative for marketers lies in leveraging data responsibly, prioritizing quality content, and fostering trust in an increasingly digital and dynamic landscape.

The space economy has undergone a transformative evolution in the past two decades. The entry of private companies into the industry has created new avenues for business in Earth’s orbit and beyond.

This journey began with the milestone 2004 commercial spaceflight of Scaled Composites’ SpaceShipOne, funded by the Ansari XPrize, which showcased the viability of privately funded space travel. This success laid the groundwork for pioneers like SpaceX, Blue Origin, and others to venture into commercial endeavors spanning space exploration, satellite launches, crewed missions, and more.

Widely recognized examples — such as the GPS technology that shapes our navigation systems and the satellites that enable television broadcasting to our homes — show space’s impact on our daily lives.

We note the acceleration of the space economy and are taking this opportunity to delve into ICT opportunities arising from space tech and research. There’s still a vast reservoir of untapped business potential within the space economy.

McKinsey has projected the market to reach a value of $1T by 2030, doubling its 2022 size.

This unprecedented growth is concentrated on four subdomains:

Earth Observation Technologies: Space-derived technologies have become integral to Earth observation. They facilitate precise weather forecasting, disaster management, and environmental monitoring, as well as route optimization, asset tracking, infrastructure monitoring, and supply chain management. Satellites equipped with advanced ICT systems capture invaluable data, empowering diverse sectors.

In precision agriculture, satellite data is used to optimize crop yields by monitoring factors such as soil moisture levels and crop health. This data enables farmers to make informed decisions about irrigation, fertilization, and pest control, ultimately increasing productivity and reducing resource usage.
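As a concrete illustration of crop-health monitoring from satellite data, vegetation vigor is commonly summarized with the normalized difference vegetation index (NDVI), computed from near-infrared and red band reflectance. The per-field band values below are made up for the example:

```python
# NDVI (normalized difference vegetation index) from satellite band readings:
# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1. Healthy vegetation
# reflects strongly in near-infrared, so higher values mean healthier crops.
def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

# Illustrative per-field reflectance values (not real satellite data).
fields = {"north": (0.60, 0.10), "south": (0.30, 0.25)}
for name, (nir, red) in fields.items():
    value = ndvi(nir, red)
    status = "healthy" if value > 0.5 else "check irrigation/pests"
    print(f"{name}: NDVI={value:.2f} ({status})")
```

The 0.5 decision threshold is illustrative; real precision-agriculture pipelines calibrate thresholds per crop and season.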

In disaster management, satellites provide real-time situational awareness during crises such as hurricanes, wildfires, and floods. By monitoring changes in weather patterns and surface conditions, authorities can effectively plan and coordinate emergency response efforts, minimizing damage and saving lives.

Companies like Maxar Technologies provide satellite imagery and analytics platforms that support industries in monitoring aspects of Earth. Airbus Defense and Space collaborates with Maxar Technologies to enhance global imaging capabilities through satellite projects. The World Bank utilizes Maxar’s expertise in satellite imagery for disaster risk management and infrastructure planning. Mining giants like Rio Tinto rely on Maxar’s solutions to optimize exploration and monitor environmental impacts.

Communication Satellites and Global Connectivity: Constellations of small satellites in low Earth orbit are transforming telecommunications. These satellites promise faster internet speeds and lower latency, disrupting traditional satellite systems and terrestrial ISPs alike.

The mesh network architecture of Starlink facilitates seamless communication between satellites and ground stations, ensuring high-speed internet access even in remote areas like the Amazon rainforest that lack technical infrastructure.

This innovative approach enhances connectivity for individuals and businesses and opens new opportunities for telecommunication providers, content providers, and ecommerce platforms to expand their outreach and services globally. Starlink’s impact spreads across industries.

For Carnival Cruise Line, Starlink facilitates crew connectivity with loved ones while enhancing guest experiences and operational functions on its world-class cruises. Brightline, a transportation company, credits Starlink for revolutionizing train connectivity, providing reliable connectivity for guests and invigorating excitement among train enthusiasts. In the education sector, Chilean school districts have experienced a significant upgrade in connectivity, with Starlink empowering teachers and students with robust and efficient high-speed internet.

Telemedicine from Space: The convergence of space technology and healthcare has sparked significant innovations in telemedicine, leveraging robotic telepresence systems for remote specialist consultations and surgeries.

Drawing inspiration from space mission requirements for remote task execution, these systems enable healthcare providers to deliver care to patients in remote or underserved areas, transcending geographical barriers. The integration of space-derived technologies into healthcare holds the potential to revolutionize patient care, address healthcare disparities, and optimize clinical outcomes.

Companies like Intuitive Surgical have been instrumental in advancing robotic surgical systems, as exemplified by the da Vinci Surgical System. This technology has significantly improved minimally invasive surgeries by enhancing precision and control.

Intuitive’s Single-Site technology, designed for specific procedures, aims to minimize scarring and enhance patient satisfaction. Intuitive’s robotic platforms utilize high-precision imaging and visualization technologies, including high-definition 3D vision and magnification capabilities. These contribute to improved surgical precision and better outcomes for patients.

Space Robotics and Automation: Specialized robots are being designed and developed for space exploration, satellite servicing, and tasks in harsh space environments. These robots handle assembly, maintenance, repair, and exploration missions, operated remotely from Earth or autonomously. Their crucial role in advancing space exploration makes them indispensable for future missions and scientific discoveries.

Honeybee Robotics leads the fusion of space robotics with terrestrial applications, revolutionizing industries spanning mining, energy, infrastructure inspection, and agriculture. Leveraging space-derived technologies, the company develops autonomous systems that enhance efficiency and safety across diverse sectors.

In mining, robotic drilling systems and sampling tools facilitate exploration and resource extraction in remote or hazardous environments, boosting productivity while minimizing operational risks. In agriculture, robotic systems streamline tasks such as soil sampling, crop monitoring, and harvesting, optimizing practices and bolstering yields.

Pacific Gas and Electric Company (PG&E) harnesses Honeybee Robotics’ robotic platforms to inspect and maintain critical infrastructure, including natural gas pipelines and electrical transmission lines. These solutions empower PG&E to conduct remote inspections, detect anomalies, and execute maintenance tasks with greater efficiency and safety.

Honeybee Robotics works with agricultural equipment manufacturers like John Deere to explore the integration of robotic technologies into farming equipment, providing farmers with innovative solutions for precision farming and crop management.

Life in Space: The Role of ICT

Looking beyond current research applications to future business opportunities, the next frontier is shaping life in space itself. During mission planning, technology tools assist in trajectory optimization, resource allocation, and risk management, ensuring efficient utilization of resources and the achievement of mission objectives in the unforgiving space environment.

From an operational perspective, tech enables real-time monitoring and control of spacecraft systems, as well as communication between ground control centers and astronauts aboard spacecraft.

Looking even further into the future, there is immense potential for ICT technologies to support extraterrestrial activities, such as mining on Mars or the Moon, where advanced robotics, AI, and data analytics will be essential for resource extraction and colonization.

As we wrap up this dive into ICT opportunities within the space economy, it’s evident we’ve only skimmed the surface. From telecommunications to healthcare, space tech is reshaping industries, offering countless business prospects.

The space economy not only fuels tech advancement and scientific collaboration but also equips businesses with cutting-edge solutions, tested in real-world conditions. By embracing space-derived tech like satellite imaging and remote sensing, industries boost efficiency, optimize resources, and make crucial decisions more effectively.

The convergence of space tech with various sectors highlights the need for a robust tech ecosystem and interconnectivity. This fusion drives demand for key ICT technologies, including data analytics, telecommunications, cloud computing, AI, and robotics.

Data analytics, powered by satellites, aids precision agriculture and disaster management. Telecom innovations, such as small satellite constellations, expand global connectivity. Cloud computing processes vast data sets from satellite imagery, fostering innovation. AI analyzes satellite data for resource optimization and urban planning. AI-driven robotics perform tasks autonomously, from infrastructure inspections to surgical procedures.

Industry collaboration, R&D investment, and further implementation of space tech applications will unlock new markets, drive innovation, and propel growth for the entire technology sector.

As we dive deeper into our space economy research, we want to hear success stories and lessons learned from early adopters. If you want to join the conversation, please contact me at anguedes@idc.com.

Why? Because data shows that in many cases, a sizable portion of enterprises fall short of managing key types of cybersecurity risks. For example:

  • About half of enterprises surveyed for IDC’s Cybersecurity Capabilities Assessment Framework don’t systematically scan and monitor a majority of their remote endpoints.
  • Barely half of organizations have mobile device management (MDM) strategies in place.
  • Well over half of businesses either don’t generate Software Bills of Materials (SBOMs) to track supply chain security risks at all, or they rely on inconsistent, manual approaches to producing SBOMs.
  • Most organizations report that it takes them at least a week to discover active security threats.
  • Only a minority of organizations have automated compliance tools and processes in place that allow them to scan for and discover risks on a continuous basis.
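On the SBOM point above, here is a toy illustration of the kind of record an SBOM captures, loosely inspired by CycloneDX-style fields (field names simplified; real SBOMs are generated by dedicated tooling, not written by hand):

```python
# Minimal illustration of what a Software Bill of Materials (SBOM) records:
# each component in the supply chain with its version and origin, so that
# newly disclosed vulnerabilities can be matched against the inventory.
import json

sbom = {
    "bomFormat": "CycloneDX-inspired (illustrative)",
    "components": [
        {"name": "openssl", "version": "3.0.13", "supplier": "OpenSSL Project"},
        {"name": "log4j-core", "version": "2.17.1", "supplier": "Apache"},
    ],
}

def find_component(sbom: dict, name: str):
    """Look up a component, e.g. when a new CVE names an affected package."""
    return next((c for c in sbom["components"] if c["name"] == name), None)

hit = find_component(sbom, "log4j-core")
print(json.dumps(hit))
```

The value of keeping this inventory machine-readable and automatically generated is exactly the continuous-scanning capability the last bullet describes: a CVE feed can be diffed against the SBOM without manual audits.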

I could go on, but you get the point: Being in good company on the cybersecurity front doesn’t mean you’re where you want to be. If you want to minimize your exposure to threats, you need to be among the minority of organizations that comprehensively and systematically manage security risks of all types, across all domains – not the majority who fall short in critical areas.

Why it’s hard to do better at cybersecurity

To be fair, it’s hard to blame the typical organization too much for lackluster performance on the cybersecurity front. Implementing a comprehensive cybersecurity program is much easier said than done, especially because there is so much to secure and requirements change so quickly.

Because of this complexity, simply deciding how to organize a cybersecurity program can be challenging, given the many different types of risks and threats to manage and the complex ways in which they overlap.

For example, since virtually everything today touches the network in some way, does network security require a distinct set of tools and processes, or do you need to bake network security into other aspects of your security operations? For another example, do mobile devices require their own security strategy, or should you simply treat them as endpoints – because, after all, that is what they are?

Struggles to answer questions like these help explain why businesses routinely fall short when it comes to cybersecurity – and why virtually every year over the past decade has set new records for the frequency and cost of attacks. When it’s unclear how to begin approaching cybersecurity and formulating a strategy that covers all key risk areas coherently and efficiently, you’re set up for failure.

A framework for cybersecurity improvement

At IDC, we think organizations can tackle this challenge by devising security strategies that cover seven distinct domains:

  • Network security
  • Endpoint security
  • Identity and digital trust
  • Data security
  • Application security
  • Response, recovery, and resilience
  • Governance, risk, and compliance (GRC)

To be sure, this taxonomy isn’t perfect. There is some overlap between these categories, and in some cases, it may not be clear where emerging technologies – like generative AI tools and services, which in some ways resemble applications but in other ways are all about data – fit in. But we believe it’s a useful foundation for identifying what enterprises need to secure, and how they should organize their security strategies.
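As a rough illustration of how this taxonomy can be put to work, the seven domains can anchor a simple coverage self-check. The domain names below come from the list above; the scoring approach and example values are hypothetical, not an IDC methodology.

```python
# Illustrative sketch: tracking defensive coverage across the seven
# domains named in the article. The assessment values are hypothetical
# examples, not survey data or an IDC scoring methodology.

DOMAINS = [
    "Network security",
    "Endpoint security",
    "Identity and digital trust",
    "Data security",
    "Application security",
    "Response, recovery, and resilience",
    "Governance, risk, and compliance (GRC)",
]

def coverage_gaps(assessed: dict[str, bool]) -> list[str]:
    """Return the domains with no effective defenses in place."""
    return [d for d in DOMAINS if not assessed.get(d, False)]

# Example self-assessment (hypothetical values):
status = {d: True for d in DOMAINS}
status["Endpoint security"] = False  # e.g., remote endpoints not monitored
print(coverage_gaps(status))  # prints ['Endpoint security']
```

Even a crude checklist like this makes the point of the framework concrete: a strategy is only comprehensive if every one of the seven domains has an answer.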

From there, beating the curve when it comes to cybersecurity means implementing effective defenses in each of the seven domains identified above. Exactly how you do that, of course, depends in large part on factors like which types of IT assets you have to secure, which cybersecurity tools are available to you and how numerous and experienced your cybersecurity staff are. I can’t tell you exactly which cybersecurity practices are best for you.

But I can tell you – based on data like the information we compiled to substantiate IDC’s Cybersecurity Capabilities Assessment Framework – what organizations that are optimized for security do differently from the average organization, and which cybersecurity practices can set your enterprise apart from the crowd in a good way.

Using that insight, you can make sure your business sits higher up in the tree, away from the low-hanging fruit that threat actors tend to target first.

To be sure, there’s no way to guarantee you’ll be safe from attack. Even if you’re in the one percent of most secure enterprises in the world, threat actors who really want to break into your IT estate can likely find a way to do so, given enough time and resources. But the reality is that most threat actors just want to breach some company, not your company in particular – so, by beating the average when it comes to protecting against cybersecurity risks, you dramatically reduce your risk of attack.

Learn more about the state of enterprise security – and how your business stacks up

Want more insights on exactly where the typical enterprise falls short on the cybersecurity front? And more actionable guidance on mitigating cybersecurity threats across the seven key cybersecurity domains laid out above?

Tune in for our upcoming webinar, “Cybersecurity Norms and Trends: How Does Your Business Stack Up?” on March 13th at 12 p.m. ET, where IDC analysts will walk through data detailing the state of enterprise security and offer guidance on overcoming the roadblocks standing between average and best-in-class cybersecurity performance.

Christopher Tozzi - Adjunct Research Advisor - IDC

Christopher Tozzi, an adjunct research advisor for IDC, is senior lecturer in IT and Society at Rensselaer Polytechnic Institute. He is also the author of thousands of blog posts and articles for a variety of technology media sites, as well as a number of scholarly publications. Prior to pivoting to his current focus on researching and writing about technology, Christopher worked full-time as a tenured history professor and as an analyst for a San Francisco Bay area technology startup. He is also a longtime Linux geek, and he has held roles in Linux system administration. This unusual combination of "hard" technical skills with a focus on social and political matters helps Christopher think in unique ways about how technology impacts business and society.

On Sunday, February 25, we hosted our brunch event to kick off IDC’s Mobile World Congress (MWC) activities in Barcelona. Key executives and decision makers from leading companies in the telecoms and technology sectors attended.

We delivered presentations addressing key transformations underway in the telecoms sector and hosted a panel discussion in which senior industry executives shared their perspectives on the future.

Key Overarching Challenges Across the Industry

The telecoms market is massive, with annual worldwide telco services spending of around $1.6 trillion, according to IDC’s Telecoms Services Tracker. The industry, which is showing growth after an anaemic period, makes up 27% of the overall ICT market and employs 4.5 million people globally.

The market is a critical component of the global economy, as well as a key element of public safety. This was underlined last week in the United States, when millions of people in several large states were unable to dial through to the 911 emergency system because of a telecoms issue.

Telco SPs annually invest over $330 billion to build their communication networks. These investments are made to meet several corporate strategies, including driving new network performance efficiencies and creating platforms for future revenue growth.

Given the size of these capex investments, it is important for telcos to monetize their investments and cut costs in order to compete as vigorously as possible. This has led to a wave of M&A activity across the world, especially in Europe, with massive multibillion-dollar deals involving Orange, Masmovil, Colt, Lumen, Vodafone, and others.

At the same time, we’re seeing the entry of new types of players, including satellite companies such as Starlink, making an already complex ecosystem even more so.

Value Propositions Beyond the Pipe

Understanding the multifaceted opportunities for monetization is key to thriving in the telecom industry. We identify three levels of connectivity monetization: 

  1. Network Infrastructure Enhancement: Leveraging technologies like network slicing and multi-access edge computing (MEC), and optimizing bandwidth and latency for diverse use cases
  2. Service Innovation: Offering tailored solutions such as fixed wireless access (FWA), private networks, and unified communications and collaboration (UC&C)
  3. Solution Development: Exploring avenues in automation, robotics, and the Internet of Things (IoT) for transformative business solutions

However, telecom features, services, and solutions must solve business issues to deliver material revenue gains. IDC’s 2023 Future of Connectedness Survey, conducted in June 2023, found that 42% of organizations prioritize enhanced access to critical business applications both on premises and in the cloud as their top metric for evaluating connectivity initiatives.

Following closely, 39% prioritize faster data throughput, while 36% emphasize increased levels of automation. This underscores the importance of aligning telecom offerings with the core objectives of businesses to drive meaningful value and performance.

We identify four essential strategies for elevating connectivity:

  1. Use network APIs to fuel revenue opportunities across all three levels.
  2. Build external partnerships to integrate diverse technology sets into comprehensive solutions.
  3. Utilize differentiated, dynamic pricing models to increase adoption of connectivity-enabled solutions.
  4. Focus on business outcomes, not technologies, to earn customer trust and enable meaningful ROI analysis.

Telcos Walking the Walk: Transform Internally to Lead Externally

In 2024, the transformation of telecom operators will encompass internal initiatives, such as cost optimization and the pursuit of new revenue streams through the integration of cloud data and intelligence. Externally, transformation responds to shifting customer expectations and the erosion of traditional core business models.

To navigate these changes effectively, operators are adapting to evolving partner ecosystems, leveraging synergy and agility to remain competitive in a dynamic marketplace.

The journey toward the telco cloud continues unabated. Almost three-quarters (73%) of respondents to IDC’s EMEA Telco Transformation Survey confirmed the deployment of BSS workloads in cloud environments. Similarly, 65% of respondents have already migrated OSS workloads to the cloud. Among the 150 sampled telcos, 37% have taken the significant step of transferring core workloads to cloud platforms.

The hypothesis of “telco wait-and-see” is now obsolete. We believe the success factors for telecom companies are:

  • Connectivity Diversity: Overhauling traditional business models to enable a broader range and higher volume of new services
  • Profitability: Boosting customer loyalty, generating new revenue streams, and enhancing operational efficiency
  • Automation: Adopting advanced technologies and refining processes for innovation and competitiveness

Telcos are gearing up for a transformative era of digital services and mobile applications through the deployment of open network APIs. Demonstrating a strong commitment to this evolution, telcos are actively engaged in the development of telco API standards, with 29 companies already signed up to the GSMA’s Open Gateway initiative.

As these initiatives mature, attention naturally shifts toward monetization strategies, including establishing API marketplaces and fostering engagement with a wider array of third-party developer communities.

More than half (53%) of our survey respondents indicated their primary focus for API investment lies in developing network APIs capable of being commercialized both internally and by third parties, thereby facilitating transformative changes within their business operations. An effective go-to-market strategy for exposing network APIs will hinge on factors such as segment type, specific use cases, and geographical reach. 
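To make the network-API opportunity tangible, the sketch below builds a request body for a temporary quality-of-service boost, loosely modeled on the CAMARA-style Quality on Demand (QoD) APIs promoted under GSMA Open Gateway. The field names, profile value, and structure are illustrative assumptions, not a normative specification; any real integration should follow the operator’s published API reference.

```python
import json

# Hypothetical sketch of a network-API request body, loosely modeled on
# CAMARA-style Quality on Demand (QoD) APIs exposed under GSMA Open Gateway.
# Field names and values here are illustrative assumptions, not a spec.

def build_qod_session_request(device_ip: str, profile: str, seconds: int) -> str:
    """Build a JSON body asking the network for a temporary QoS boost."""
    body = {
        "device": {"ipv4Address": device_ip},  # which subscriber/device
        "qosProfile": profile,                 # e.g., a low-latency profile name
        "duration": seconds,                   # how long the boost should last
    }
    return json.dumps(body)

# Example: request a 10-minute boost for one device (illustrative values).
payload = build_qod_session_request("203.0.113.7", "QOS_E", 600)
```

The commercial point is that a third-party developer can consume a capability like this without knowing anything about the underlying radio or core network, which is what makes network APIs monetizable both internally and externally.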

In conclusion, the telco industry stands at a pivotal juncture. It is undergoing a profound transformation that will shape its trajectory for the next 15–20 years. The convergence of culture, technology, internal operations, and customer experience underscores the hyper-complexity of the current landscape.

As we navigate these changes, it’s crucial to recognize that the stakes are high: There will be winners and losers, and the status quo is being redefined. Embracing a mindset of agility and experimentation is paramount.

Don’t hesitate to try and fail fast. Leverage every opportunity to learn collaboratively with your customers. Seek out strategic partnerships to enhance your chances of success in this dynamic environment.

Remember: In such complex scenarios, focus is key. Each player must define their priorities and steadfastly pursue them, recognizing that there’s no one-size-fits-all approach to thriving in the evolving telco ecosystem.

Masarra Mohamed - Senior Research Analyst, European 5G Enterprise Strategies - IDC

Masarra Mohamed is a senior research analyst specializing in the connectivity and communications services markets, focusing on the changing networking requirements, trends, and competitive dynamics that support enterprises in their digital transformation. She explores how enterprise network strategies evolve to enable cloud, AI, and security.