AI-driven infrastructure demand is accelerating investment in datacenter networking, reshaping the Ethernet switch market.

The datacenter portion of the Ethernet switch market continued its strong growth in the fourth quarter of 2025 (4Q25), rising 63.0% year over year (YoY) to reach $9.9 billion, driven by the build-out of datacenter network infrastructure to support AI workloads.

The total Ethernet switch market, inclusive of both datacenter and non-datacenter segments, grew 35.1% YoY to reach $16.2 billion in 4Q25. For the full year, revenue totaled $55.1 billion, up 31.5% YoY.

Ethernet switch market highlights

  • Datacenter segment: The datacenter portion of the Ethernet switch market saw exceptional growth in 2025, with full-year growth of 53.5% YoY to reach $32.5 billion. High-speed datacenter switches (800G) accounted for 25.8% of 4Q25 revenues and 16.4% of full-year revenues, while 200G/400G speeds represented 43.9% of yearly revenue—reflecting rapid adoption of higher-speed networking to support AI workloads.
  • Non-datacenter segment: Ethernet switches used in enterprise campus and branch networks grew 6.4% YoY in 4Q25 and 9.1% for the full year, reflecting steady investment in enterprise infrastructure.
  • Regional performance: Ethernet switch revenues grew in all regions of the world in both 4Q25 and the full year. The Americas led with 45.4% YoY growth in 4Q25 and 40.2% for the year. EMEA posted 28.3% growth in 4Q25 and 23.1% for the year, while Asia Pacific saw 23.3% growth in 4Q25 and 23.5% for the year.

Router market highlights

The total router market, inclusive of both service provider and enterprise segments, rose 11.5% YoY in 4Q25 and increased 11.2% for the full year 2025 to reach $15.0 billion.

  • Service provider segment: The service provider segment (including communications and cloud SPs) made up 74.4% of total router market revenues in 4Q25 and increased 12.8% YoY.
  • Enterprise segment: The enterprise router market makes up the balance of revenues and grew 7.5% YoY in 4Q25, reflecting ongoing investment in enterprise wide area networking (WAN) connectivity.
  • Regional performance: In 4Q25, the Americas router market rose 15.9% YoY, EMEA increased 16.2%, and APJ grew 3.8%.

Vendor highlights

Vendor performance reflects the shift toward AI-driven datacenter demand.

  • Cisco: Cisco’s total Ethernet switch revenues increased 13.5% YoY in 4Q25 to $4.5 billion, capturing 27.6% market share. Non-datacenter segment revenues (63.9% of Cisco’s total) grew 7.5% YoY, while datacenter segment revenues rose 26.0% YoY. Cisco’s total router revenue increased 22.5% YoY, giving the company a 30.6% market share.
  • Arista Networks: With 92.6% of its Ethernet switch revenues in the datacenter segment, Arista’s revenues grew 31.4% YoY in 4Q25 to $2.0 billion. Arista holds a 12.6% share of the total Ethernet switch market and 19% in the datacenter segment.
  • Huawei: Huawei’s total Ethernet switch revenue increased 14.0% YoY in 4Q25 to $1.7 billion, giving the company a market share of 10.6%. Huawei’s router revenue increased 6.5% in 4Q25, giving the company a 30.2% market share.
  • NVIDIA: NVIDIA’s Ethernet switch revenues, entirely from the datacenter segment, surged 192.6% YoY to $1.5 billion in 4Q25, giving it a 15.2% share of the datacenter segment.
  • HPE: HPE’s total Ethernet switch revenue (66.8% from non-datacenter) increased 11.7% YoY in 4Q25, reaching a 6.7% market share. Following its July 2025 acquisition of Juniper Networks, HPE’s revenues now also include Juniper.
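As a quick arithmetic check, the vendor shares reported above follow directly from the revenue figures. The sketch below, using the 4Q25 numbers from the text, recomputes NVIDIA’s datacenter share (for other vendors, small differences from the reported shares reflect rounding of the published revenue figures):

```python
# Figures from the 4Q25 data above, in billions of US dollars.
dc_switch_revenue_4q25 = 9.9   # datacenter Ethernet switch market
nvidia_dc_revenue_4q25 = 1.5   # NVIDIA datacenter switch revenue

# Market share = vendor revenue / segment revenue
share = nvidia_dc_revenue_4q25 / dc_switch_revenue_4q25
print(f"NVIDIA datacenter share: {share:.1%}")  # ~15.2%, matching the reported figure
```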

Market dynamics

  • Demand for speed and low latency: Organizations are investing in higher-speed switches to support AI-driven and other demanding workloads, fueling growth in datacenter and high-speed segments.
  • AI workloads: The proliferation of AI applications is pushing enterprises to upgrade their networks for both bandwidth and latency improvements.
  • Global uncertainty (constraint): While macroeconomic and geopolitical uncertainty persists, it has not significantly dampened investment in critical network infrastructure.

Why it matters

  • Who should care? CIOs, network architects, IT buyers, and technology vendors should note this acceleration, as it signals a renewed investment cycle in network infrastructure.
  • Business impact: Upgrading to more reliable and faster networks enables more responsive applications, enhances employee and customer experiences, and supports faster decision-making platforms.
  • Ecosystem signal: The market’s robust growth highlights the strategic importance of network modernization, especially as organizations deploy AI and data-intensive applications.

What’s next for the Ethernet switch market

IDC expects continued momentum in the Ethernet switch market as enterprises prioritize network modernization to support AI, cloud, and real-time applications. Growth could accelerate further as AI investments increase, particularly in datacenter AI factories and as more inferencing use cases emerge. However, new supply chain concerns around memory and persistent global uncertainty are headwinds. Watch for ongoing investments in datacenter upgrades and higher-speed switch deployments in the coming quarters.

Brandon Butler - Sr. Research Manager - IDC

Brandon Butler is a Senior Research Manager with IDC's Network Infrastructure group covering Enterprise Networks. His research focuses on market and technology trends, forecasts and competitive analysis in enterprise campus and branch networks. His coverage includes technologies used in local and wide area networking such as Ethernet switching, routing/SD-WAN, wireless LAN, and enterprise network management platforms. While contributing to ongoing forecast and market share updates, he also assists in end-user surveys, interviews and advisory services and contributes to custom projects for IDC's Consulting and Go-To-Market Services practices.

Petr Jirovsky - Senior Research Director, Network Infrastructure and Services - IDC

Petr Jirovsky is a Senior Research Director within IDC's Enterprise Infrastructure global research domain. He provides quantitative insights on network infrastructure for the datacenter, cloud, and campus/branch environments as part of the Network Infrastructure and Services subdomain. Petr serves as the global lead for IDC's Network Infrastructure Trackers, which track Ethernet switches, routers, wireless equipment, and application delivery appliances and services. He also contributes to numerous custom data projects and supports the publication of market share and forecast documents for the subdomain.

Diego Anesini - VP D&A, LatAm & Director Enterprise and Telecom - IDC

Diego Anesini serves as Research Vice President, Data & Analytics for IDC Latin America, overseeing all information and communications technology research. Prior to this position, Diego held various roles in the company, most recently Enterprise Infrastructure and Telecom Director for Latin America. He has extensive experience in the telecom and IT markets, with more than 25 years in the industry.

OpenAI’s failed attempt to scale “pay in chat” revealed deeper structural constraints: it can neither be a commerce platform nor operate effectively as a middleman. To be a true commerce platform, it would need to build a Shopify-like ecosystem. Even if Codex could build something in a few months, it would take years to accumulate the millions of product listings, merchant relationships, transaction histories, pricing dynamics, fulfillment systems, and customer journey data required.

OpenAI must also contend with the fact that commerce platforms can easily implement their own AI interfaces, as many already are (Alibaba Accio, Amazon Agent Mode, Walmart AI, etc.). AI is merely a new front door. If OpenAI wants to become a true agentic commerce platform, it must own or control one or more of three strategic assets:

  1. The Commerce Graph (inventory, sellers, transactions)
  2. The Customer Graph (identity, purchase behavior, lifecycle data)
  3. The Product Data Graph (high-frequency usage and intent signals)

Let’s look at some potential acquisition candidates that could revive OpenAI’s agentic commerce ambitions based on strategic fit and financial feasibility.

Strategic fit criteria

  • Strength of proprietary data moat
  • Control over demand and transaction layer
  • Monetization leverage
  • Ability to reduce platform dependency

Financial feasibility criteria

  • Approximate market capitalization
  • Acquisition realism given OpenAI’s capital structure
  • Financing complexity
  • Regulatory and integration risk

Candidate     Strategic Impact   Financial Feasibility   Overall Assessment
Klaviyo       High               High                    Own the customer graph
Instacart     High               Moderate                Fast-cycle replenishment
Etsy          Moderate           Moderate                Secondary marketplace
BigCommerce   Moderate           Very High               Own the product data pipes
Cart.com      Low-Moderate       High                    Fulfillment and commerce infrastructure

Other candidates such as eBay, Shopify, and Mercari are considered too large for OpenAI to acquire.

The case for Klaviyo

Own the Customer Graph

Klaviyo is a commerce customer data and lifecycle marketing platform serving 193,000+ merchants globally. It doesn’t own a marketplace; it owns the customer identity layer across marketplaces. It processes billions of behavioral events, including email engagement, SMS interactions, purchase conversions, cart abandonment, and repeat buying patterns.

In its most recent fiscal year, Klaviyo delivered approximately 32% revenue growth, expanding margins, and strong free cash flow. It supports thousands of midmarket and enterprise brands and maintains net revenue retention above 109%, reflecting deep embedding in merchant workflows.

Primary value to OpenAI:

  • Deep behavioral identity graph: Access to engagement and purchase data across hundreds of thousands of merchants.
  • High-margin SaaS monetization: Recurring revenue aligned with merchant performance and retention.
  • AI-powered personalization engine: Enables OpenAI to embed agents directly into lifecycle marketing and conversion optimization workflows.

The case for Instacart

Own Fast Cycle Replenishment

Instacart is North America’s leading grocery delivery and retail media platform, partnering with 100,000+ retail locations and serving 26+ million active customers. In 2025, it processed approximately 338 million orders, generating over $37 billion in gross transaction value (GTV).

Grocery is high-frequency commerce—weekly or biweekly—creating repeated behavioral loops rarely found in discretionary retail. Instacart also operates a growing retail media business tightly integrated with shopper behavior.

Primary value to OpenAI:

  • High-frequency purchase data: Recurring basket-level signals ideal for agent habit formation.
  • Localized real-time inventory: Enables agents to optimize decisions across substitution, delivery speed, and pricing.
  • Integrated retail media monetization: Advertising revenue directly tied to purchase behavior.

The case for Etsy

Own the marketplace

Etsy operates a global marketplace focused on handmade, vintage, and specialty goods, with 100+ million active listings, 8+ million sellers, and approximately 96 million active buyers. In 2024, Etsy generated more than $12.6 billion in gross merchandise sales.

Its experience is driven by discovery rather than price, emphasizing personalization, gifting, and niche communities.

Primary value to OpenAI:

  • High-signal preference data: Deep insights into taste, gifting, and niche category behavior.
  • Second-tier marketplace: Large enough to matter, less complex than dominant incumbents.
  • Discovery-optimized commerce: AI agents could materially improve search and recommendation quality.

The case for BigCommerce (Feedonomics)

Own the product data pipes

BigCommerce powers over 130,000 merchants across 150+ countries, generating roughly $350 million in ARR. Feedonomics provides structured product data feeds to major marketplaces (Amazon, Google, TikTok) and ad channels.

It is particularly strong in B2B commerce, where early agentic commerce value is emerging.

Primary value to OpenAI:

  • Structured product feed layer: Clean SKU, pricing, and catalog normalization critical for AI accuracy.
  • B2B commerce exposure: Early advantage in high-margin procurement automation.
  • Ecosystem leverage: Influence over product data standards feeding major marketplaces.

The case for Cart.com

Fulfillment and commerce infrastructure

Cart.com provides end-to-end commerce services, including storefronts, analytics, warehousing, and fulfillment. It has raised over $380 million and expanded through acquisitions to build a distributed logistics network.

Its platform captures operational data such as inventory velocity, shipping performance, and fulfillment timelines—data typically unavailable to discovery-layer platforms.

Primary value to OpenAI:

  • Operational telemetry: Real-world delivery and inventory data to inform agent optimization.
  • Discovery to delivery: Enables AI to reason about fulfillment constraints.
  • Multi-channel insight: Visibility across marketplaces and merchant storefronts.

OpenAI would likely have to buy several of these companies; the top three are Klaviyo (customer graph), BigCommerce (product data pipes), and Instacart (fast-cycle replenishment). Cart.com adds operational telemetry but is most valuable after OpenAI controls demand and identity; without the customer graph and habit loop, fulfillment intelligence is secondary. Etsy is a large secondary global marketplace, but Instacart delivers both a marketplace and higher-frequency replenishment buying activity.

Such acquisitions would radically change the monetization potential of OpenAI’s commercial business, adding new revenue streams from AI SaaS (Klaviyo), transactions and retail media (Instacart), and merchant services (BigCommerce) on top of its existing enterprise contracts and consumer subscriptions. They would also make a powerful strategic statement, legitimize OpenAI’s vision of being the default interface to commerce (or at least a credible contender), and radically rebrand the company as the AI OS for commerce.

Who owns the personal shopper?

Assuming the industry resolves the operational challenges of agentic commerce, the key question becomes: who owns the consumer interface? Consumers will likely default to a single personal shopper agent that interacts across brands and marketplaces.

OpenAI is a strong interface but has structural vulnerabilities. At best, it is a third-party app unless it secures exclusive distribution with a major device manufacturer. Even then, marketplace operators can control and meter third-party agent access—pushing OpenAI back into a margin squeeze.

More critically, platform-native competitors are emerging. A Gemini-powered Siri deployed across billions of Apple devices could be extremely difficult to displace. If OpenAI cannot effectively monetize commerce or advertising, it will depend on enterprise and consumer subscriptions—yet consumers will gravitate toward the lowest-friction interface.

Gerry Murray - Research Director, Marketing and Sales Technology - IDC

Gerry Murray is a Research Director with IDC's Marketing and Sales Technology service where he covers marketing technology and related solutions. He produces competitive assessments, market forecasts, innovator reports, maturity models, case studies, and thought leadership research. Prior to his role at IDC, Gerry spent six years in marketing at Softrax Corp., an enterprise financial solutions provider. There, he managed marketing programs that produced 4 million emails a year, multiple websites, interactive tools and product tours, an online game, collateral, and PR. Concurrently, he was Managing Editor at RevenueRecognition.com, a thought leadership site featuring partnerships with IDC and the Financial Accounting Standards Board (FASB), which was quoted and referenced in leading industry publications such as CFO magazine, BusinessFinance, and others. Gerry spent the first half of his career at IDC advising executives from some of the world's largest software and services providers on market strategy, competitive positioning, and channel management. He was the Director of Knowledge Management Technology and conducted research on a worldwide scale, including market sizing and forecasting, ROI models, case studies, multi-client studies, focus groups, and custom consulting projects.

Enterprise investment in AI-enabled IT systems has become inevitable. IDC predicts that 2026 in particular will be the first year AI agents are applied to real business: a year of structural change from the conventional use of AI as a "work assistant" to the practical use of AI agents embedded in workflows as a "buddy" that carries out tasks alongside people.

In the "Worldwide AI and Generative AI Spending Guide 2026V1," published in March 2026, IDC forecasts that Japan's AI market spending will surge from ¥2,372.5 billion in 2025 to 2.9 times that level, ¥6,889.7 billion, in 2029, with a compound annual growth rate (CAGR) of 36.0% for 2024-2029. This means the AI market will come to occupy an important share of Japan's IT market, accounting for 20% of the total IT market by 2029. These figures, too, show that enterprise AI investment has become inevitable.
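As a quick sanity check on these growth figures, the standard CAGR formula can be applied to the 2025 and 2029 spending values. Note that the 36.0% CAGR quoted in the guide is computed from a 2024 base, which is not stated in the text, so the sketch below computes only the 2025-2029 rate:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end/start)**(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Japan AI market spending from the IDC Spending Guide cited above,
# in billions of yen (the 2024 base value is not given in the text).
spend_2025 = 2372.5
spend_2029 = 6889.7

print(f"growth multiple 2025->2029: {spend_2029 / spend_2025:.1f}x")  # ~2.9x
print(f"CAGR 2025->2029: {cagr(spend_2025, spend_2029, 4):.1%}")
```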

The newly released "Worldwide AI and Generative AI Spending Guide 2026V1" highlights the following key points for deploying and using AI systems.

1. Growth of the AI software market driven by the rapid rise of AI agents

In Japan, although more than half of enterprises were using generative AI in production by the end of 2025, the use cases often remain limited to a supporting role in general office work, such as translation and summarization, and many proof-of-concept (PoC) projects fail without delivering sufficient value. IDC's enterprise user surveys likewise find that 60% of companies have experienced PoCs that did not deliver the expected benefits. Against this backdrop, use cases need to shift from AI in a supporting role to the automation and autonomization of business workflows, where benefits are easier to realize. AI agents, whose delivery as a means to this end has now begun, are attracting strong market expectations: software vendors started offering AI agent platforms and embedding agents into applications in 2025, and application to real business is expected to grow rapidly from 2026. Factoring in this rapid growth of AI agents, IDC forecasts a CAGR of 48.9% for the AI software market, well above the 36.0% CAGR of the overall AI market.

2. Expansion of AI use cases into CX

For AI and AI agents to be applied to real business, choosing which part of which business process to target is the key to success. The newly published "Worldwide AI and Generative AI Spending Guide 2026V1" provides not only the hardware/software/services segmentation of the AI technology market but also use-case-level data. It forecasts high growth for use cases such as IT operations automation and AI-assisted software development, with particularly high growth expected in customer experience (CX) use cases such as sales (CAGR 46.2%) and customer service (CAGR 42.0%). This means AI agents will be positioned not only to automate internal operations but also as "buddies" that, working alongside humans, handle external relationships with customers and partners. It suggests that enterprises will expand their use cases beyond cost reduction through automation toward strengthening customer responsiveness and adaptability to market change.

Summary

2026 can be positioned as the kickoff year in which Japan's AI market undergoes a transformation in how AI is used, with AI shifting from assistant to buddy. This will redefine enterprise operations around human-AI collaboration, trigger further acceleration of the AI market, and make AI a major component of Japan's IT market.

IDC data offerings

IDC provides continuous, multi-layered information and analysis on the AI market.

Leveraging these datasets makes it possible to visualize the acceleration of the AI market both in Japan and globally.

Related research and consultations

For more detailed insights and market trends, please feel free to contact our analysts.

Takashi Manabe - Senior Research Director, AI and Automation, IDC Japan - IDC Japan

Takashi Manabe is the Senior Research Director of the AI and Automation groups at IDC Japan. Mr. Manabe's primary responsibilities include analysis of market dynamics and trends, vendor strategies, and market sizing/modeling for Japan's enterprise AI market, including software, services, and infrastructure. He also covers the security, data management/big data analytics, customer experience, and digital transformation markets as they relate to AI. Before joining IDC, Mr. Manabe worked at Toshiba Corporation, Toshiba America Information Systems, Inc., and Toshiba TEC Corporation, accumulating more than 20 years of experience in the communications and software markets. He began his career as a system engineer for PBX/enterprise data communications equipment at Toshiba Corporation and also served as a product planning and marketing manager for communications equipment and software. At Toshiba America, he held business planning and business management roles spanning cable TV Internet services for enterprises, security software, and the consumer communications market. Just prior to joining IDC, Mr. Manabe worked at Toshiba TEC Corporation as a product planning manager for document solutions such as MFP remote management systems and scan/OCR solutions. Mr. Manabe holds a Master's degree in Computer Science and Engineering and a Bachelor's degree in Computer Managed Machinery Systems from Muroran Institute of Technology, Japan.

While AI-powered tools proliferate across the advertising (and every other) industry, most deployments remain isolated, unable to deliver the unified, cross-channel customer experiences that brands demand. IDC sees a critical gap in the market—fragmented AI products are failing to orchestrate branding and performance across the entire customer journey.

The solution lies in agentic mesh architectures for CX, where interconnected AI agents collaborate seamlessly, breaking down silos and enabling real-time, authentic engagement at scale. This shift is not just evolutionary; it is foundational for advertisers seeking to lead in the AI age.

Advertising’s agentic revolution

Let’s face it: Traditional adtech often feels fragmented, making campaign orchestration and audience targeting a challenge. This is only exacerbated by current AI deployments which often focus on single tasks and fail to capture the cross-functional nature of a client journey.

The agentic mesh for CX changes the game, deploying semiautonomous and autonomous AI agents that collaborate across departments and systems. For advertising professionals, this means:

  • Real-time, cross-channel campaign management
  • Privacy-compliant personalization
  • Unified brand messaging at scale

Best practices for AI-driven advertising

Brands need to address the agentic mesh for CX framework as they look to effectively enable AI across their adtech stack. In this regard, several best practices emerge:

  • Unified data infrastructure: Build a standardized data foundation that integrates DMPs, CDPs, CRMs, and analytics platforms. This empowers agents to access real-time audience profiles, fueling dynamic creative optimization and attribution.
  • Interoperability and standards: Design advertising agents for seamless workflow handoffs and data exchange using protocols like AdCP, OpenRTB, and IAB frameworks. This ensures compliance, scalability, and consistent campaign execution.
  • Rapid adoption and risk management: Deploy agents for critical use cases: cross-channel attribution, creative optimization, and commerce integration. Regular risk assessments help mitigate errors and ad fraud and ensure quality AI outputs.
  • Upskilling and governance: Equip your teams for agentic AI by upskilling in programmatic buying, campaign optimization, and creative management. Clear governance and supervisory controls are essential for brand safety and regulatory compliance.
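To illustrate the kind of standardized data exchange these protocols enable, here is a minimal OpenRTB 2.x-style bid request sketched in Python. Field names follow the public IAB OpenRTB specification, but the values and the request itself are hypothetical, and real requests carry many more fields (user, regs, ext, and so on):

```python
import json

# Hypothetical, minimal OpenRTB 2.x-style bid request: a single banner
# impression on an example site. Values are illustrative only.
bid_request = {
    "id": "req-12345",                   # unique auction ID (hypothetical)
    "imp": [{
        "id": "1",
        "banner": {"w": 300, "h": 250},  # standard medium-rectangle slot
        "bidfloor": 0.50,                # floor price, CPM
    }],
    "site": {"domain": "example.com"},
    "device": {"ua": "Mozilla/5.0"},
    "at": 1,                             # auction type: 1 = first price
}

# Serialized JSON payload as it would travel between agents/exchanges.
payload = json.dumps(bid_request)
print(payload)
```

Agents on either side of such a handoff can parse and act on the same structured payload, which is what makes workflow interoperability practical.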

The agentic mesh advantage

Agentic AI isn’t just a buzzword—it’s transforming advertising operations by automating complex workflows, breaking down data silos, and enabling proactive, outcome-driven marketing. Industry leaders like Amazon Ads, Oracle, Adobe, and Salesforce are already embedding agentic mesh principles, even if under different monikers, to drive cross-functional collaboration and supervisory control.

The true test for brands in the AI era is not simply adopting new tools, but evolving from fragmented, siloed adtech to a fully autonomous, agentic ecosystem. This transformation requires real-time data integration, robust infrastructure, and clear protocols for agent interoperability—each a pillar of the agentic mesh advantage.

Brands that unify their advertising investments with cross-functional CX goals and KPIs, leverage no/low-code tools and agent cloning, and prioritize closed-loop measurement will accelerate their journey toward autonomous operations. Selecting partners committed to outcome-driven, agentic innovation is equally critical.

Ultimately, competitive advantage hinges on your ability to break down silos, upskill teams, and implement strong governance. Early adopters of agentic mesh principles will not only deliver consistent, brand-safe customer experiences at scale—they will define leadership in the AI age.

For more information on the Agentic Mesh for CX and how brands can apply it in the advertising context, check out: Applying the Agentic Mesh for CX to Advertising: Orchestrating Branding and Performance Across the Entire Customer Experience in an AI Age.

Roger Beharry Lall - Research Director, Marketing Applications for Growth Companies - IDC

With over 25 years' experience leading technology driven marketing programs, Mr. Beharry Lall is now a Research Director with IDC covering Advertising Technologies and SMB Marketing Applications. He brings a unique multidisciplinary perspective, evangelizing the innovative and pragmatic use of both martech and adtech solutions for companies of all sizes. Early in his career Rog worked with an IBM subsidiary expanding into the Asian market and subsequently spent over a decade at RIM (BlackBerry) building marketing leadership across new industry segments, geographies, and product categories. This background fuels his perspective as he researches enterprise customer engagement tools and tactics across the unified omnichannel.

Japan’s IT market is evolving beyond its traditional reliance on large enterprises (including public sector modernization) as its primary growth driver.

In 2026, IDC forecasts the market will reach ¥28,418.9 billion, growing 3.3% year on year, with a 6.4% CAGR through 2029. Large enterprises remain dominant, increasing their share from 53.9% (2025) to 56.0% (2029).

Japan IT market growth accelerates as mid-sized firms drive 9.5% spending surge, reshaping vendor strategy and digital transformation demand.

However, the key shift is in the rise of mid-sized companies.

  • Mid-sized firms (100–999 employees) will expand IT spending share from 19.8% in 2025 to 21.2% by 2029
  • In 2026, IT spending (excluding PC) will grow 9.5% YoY, outpacing large enterprises (8.7%)

Japan’s IT market is entering a dual-engine growth phase, combining enterprise modernization with accelerating mid-market digital transformation.

Why Are Mid-Sized Companies Accelerating IT Investment?

1. Is labor shortage forcing mid-sized firms to digitalize?

Yes, and it’s becoming urgent.

Labor shortages in Japan aren’t just a macro trend anymore. They’re showing up in day-to-day operations, especially for mid-sized companies.

Large enterprises have advantages: stronger employer brands, deeper recruiting pipelines, and more mature digital platforms. Many have already invested heavily in automation, workflow integration, and data infrastructure.

Mid-sized companies often lack both talent depth and digital maturity. They cannot compete on compensation scale or recruitment visibility. As labor shortages intensify in 2026, digitalization becomes essential for business continuity rather than a discretionary initiative.

In addition, digitalization mandates from large business partners and public sector procurement processes are cascading downstream. Mid-sized firms that fail to digitize risk exclusion from supply chains and ecosystem participation.

Bottom line: From 2026 onward, digitalization is no longer optional, it’s how mid-sized companies will stay operational and competitive.

2. Why will mid-sized firms rely more heavily on IT vendors and system integrators?

Because they don’t have the in-house capacity.

Large enterprises are increasingly building in-house digital capabilities or partnering directly with hyperscalers and advanced technology firms. Their internal IT maturity has advanced significantly.

Mid-sized companies usually can’t. Most have small IT teams, limited internal expertise, and constraints in executing complex modernization programs. So as digital transformation moves from planning to execution in 2026, they’ll depend more on vendors.

Mid-sized companies typically require:

  • End-to-end implementation support
  • Packaged, use-case-driven solutions
  • Operational scalability
  • External expertise in AI and cloud adoption

What this means for vendors: Serving this segment requires structural adaptation. Projects are smaller. Budgets are tighter. Engagement models must be leaner and outcome oriented.

3. Do mid-tier vendors have a structural advantage?

In many cases, yes.

While Tier 1 and Tier 2 vendors remain essential for large-scale enterprise transformation, mid-sized companies often require a different delivery model. Engagements are more operational, localized, and execution focused.

Mid-tier system integrators and regional IT vendors may hold an inherent structural advantage in this environment.

Their scale, cost base, and organizational focus are often better aligned with the needs of mid-sized enterprises. They’re often closer to the customer, more hands-on, and better structured for standardized delivery of outcome-driven solutions without the overhead associated with mega-enterprise programs.

Meanwhile, Tier 1 vendors are optimized for complex, multi-year transformation programs. Mid-tier vendors are often optimized for speed, proximity, and practical execution, attributes that align naturally with mid-sized companies entering digitalization at scale.

So, the fit matters. As mid-sized companies increase IT investment from 2026 onward, vendors whose size, service intensity, and geographic reach match this segment are likely to capture disproportionate growth.

4. How does cloud adoption lower transformation barriers?

Large enterprises frequently face modernization bottlenecks due to deeply embedded legacy systems and customized architectures. Transformation often requires extensive integration and long transition timelines.

Mid-sized companies face fewer structural constraints.

While legacy platforms may exist, system environments are typically less complex. As infrastructure-as-a-service (IaaS) and cloud-native platforms expand in Japan, cloud adoption reduces both cost and complexity barriers.

Cloud changes the equation by enabling:

  • Faster deployment
  • Lower upfront costs
  • Scalable digital infrastructure
  • Easier integration of AI-related capabilities

And that last point is important. In 2026, spending on AI (models, data, agents) is expected to expand rapidly. Cloud environments allow mid-sized companies to adopt these capabilities without large-scale architectural overhauls.

In simple terms: cloud reduces friction, and that speeds everything up.

What Does This Dual-Engine Growth Mean?

Japan’s IT market is not fragmenting; it’s expanding from two different directions at once. Large enterprises continue to invest and modernize. Mid-sized companies are stepping up as a real growth driver.

Going forward, growth will be shaped by:

  • Sustained large-enterprise modernization
  • Accelerating mid-sized digital transformation
  • Expanded AI adoption across both segments
  • Increased reliance on scalable cloud platforms

What Should IT Vendors Do Next?

Focus more seriously on the mid-market.

Growth won’t come only from large enterprise deals anymore.

It will increasingly come from:

  • Reaching more mid-sized customers
  • Delivering repeatable, outcome-driven solutions
  • Aligning pricing and delivery to smaller-scale projects

The takeaway:

Vendors that adjust early to this dual-engine reality will be in the strongest position to capture the next phase of growth in Japan’s IT market.

Contact IDC for deeper insights, or connect with our analysts to discuss what this means for your business.

Hitoshi Ichimura - Senior Research Manager, Software, Services, and IT Spending, IDC Japan - IDC Japan

Hitoshi Ichimura is responsible for the market analysis of overall Japan IT spending, based in Tokyo. In this role, he conducts IT spending research by vertical, company size, and region. His main research areas cover IT spending forecasts and trends for Japan's financial industry, regional markets, and the SMB segment. Ichimura is also involved in various custom research projects in these areas.

In today’s technology market, certainty has become a luxury. AI adoption is accelerating, but unevenly. Partner ecosystems are fragmenting, consolidating, and recombining at speed. Go‑to‑market models are collapsing into customer‑led buying journeys, and leadership teams are being asked to make high‑stakes decisions with incomplete, fast‑aging information.

Broad market reports, benchmarks, and best practices retain significant intrinsic value as foundations for strategy. Yet decisions that are deeply contextual, ecosystem‑specific, and time‑sensitive often require additional layers of insight beyond these core inputs.

The reality is that strategy is no longer about understanding “the market” in the abstract. It is about understanding your market position, your partners, and your customers, right now. Increasingly, the most important questions leaders are asking sound like this:

  • How is AI changing buying behavior and economics in our customer base?
  • Which partners are actually driving growth, influence, and outcomes – and which no longer align with our direction?
  • How does our ecosystem strategy compare to competitors in EMEA, not just globally?
  • Where are customers genuinely willing to invest, and where are they experimenting, delaying, or pushing back?

These are not questions that generic insight can answer with confidence, because the answers depend on your installed base, your partner mix, your regional footprint, your commercial model, and your competitive posture. In short, strategy has become situational.

Faced with this uncertainty, many organizations default to gathering more data: more dashboards, more surveys, more internal analysis. But volume is rarely the issue. The real challenge is relevance. Internal data lacks external context. Global averages mask regional and sector nuance. Lagging indicators arrive after decisions have already been made. What leaders need instead is interpretation, synthesis, and external validation that is designed around the decisions they actually need to take.

This is why we see a growing shift toward custom insight. High‑performing organizations increasingly start with the decision, not the dataset. Whether the challenge is AI monetization, partner strategy, ecosystem prioritization, or route‑to‑market design, the work begins by asking what choice must be made in the next three to six months, and what evidence is required to make it with confidence. From there, insight is built backwards.

Critically, the most effective custom projects blend signals rather than relying on a single method. Partner surveys reveal capability gaps, investment priorities, and friction points across the ecosystem. Customer surveys surface willingness to pay, buying behavior, trust dynamics, and expectations around AI, services, and outcomes. Qualitative interviews add depth and context, while ecosystem and competitive analysis connects those findings to broader market forces. The value does not sit in any one input, but in how those inputs are connected and translated into strategic implications.

We consistently see customer and partner insight deliver the greatest impact when applied to a small number of high‑value areas:

  • AI and agentic AI strategy, including pricing, packaging, economics, and partner roles
  • Ecosystem and partner optimization, from role clarity to performance segmentation and investment focus
  • Go‑to‑market and route‑to‑market evolution, particularly in EMEA’s fragmented markets
  • Executive alignment, creating a shared, evidence‑based fact base for leadership teams
  • External storytelling, using proprietary insight to support thought leadership and market influence

This is where insight turns into action. In many engagements, the report itself is not the most important output. The real value is decision confidence: knowing that a strategic move is anchored in how customers and partners are actually behaving, not how we assume they are behaving.

There is also a powerful dual role at play. Custom insight supports internal strategy and decision‑making, but it can simultaneously fuel external influence. Proprietary findings can shape executive narratives, strengthen partner and customer communications, and differentiate a company’s point of view in an increasingly noisy market. When insight is designed with this dual purpose in mind, it becomes a strategic asset rather than a one‑off deliverable.

This matters now more than ever. Across EMEA, partner ecosystems are being reshaped by a set of interlocking forces: AI economics, consolidation, shifting alliance hierarchies, collapsing route‑to‑market models, sovereignty pressures, and the rise of in‑product and marketplace‑led buying. Many of these shifts are subtle in isolation but powerful in combination.

Understanding which are leading indicators, which are mid‑cycle effects, and which are lagging consequences requires more than surface‑level analysis. It requires insight grounded in real partner and customer evidence, interpreted through an ecosystem lens.

The bottom line is simple. In a market defined by AI acceleration, ecosystem complexity, and regional divergence, generic insight is no longer enough. Organizations that are pulling ahead are those that ask better questions, invest in insight tailored to their context, and use research as a decision tool rather than a reference document.

If these questions resonate, you’re not alone. Most technology leaders we work with are already grappling with how AI, ecosystem change, and buyer behavior are reshaping their growth models – and are looking for concrete, evidence‑based answers they can act on. Through bespoke research and advisory projects, we help clients translate partner and customer insight into tangible business benefits: sharper internal intelligence for decision‑making, clearer ecosystem strategy, and insight‑led assets that can be used confidently with partners and customers alike.

This perspective is also captured in our 15 Key Trends Shaping EMEA Partnering Ecosystems report, often used as a starting point for bespoke client work. Contact us to learn more about our ecosystem research, custom solutions and advisory portfolio.

IDC’s ecosystem lens

IDC’s ecosystem research focuses on value creation, margin capture, and strategic influence. We analyze how partners orchestrate outcomes, how they align with customer buying journeys, and how they evolve their business models to stay relevant. You can find more information here.

If you have any further questions, drop them in the form here.

Stuart Wilson - Senior Research Director, EMEA Partnering Ecosystems - IDC

Stuart Wilson is senior research director for IDC’s Europe, Middle East & Africa (EMEA) Partnering Ecosystems program. With over two decades of global experience, Stuart focuses on the rise of complex, connected ecosystems and how platform models are reshaping routes to market and partner engagement frameworks.

The McKinsey/Lilli incident should be read as a market signal. Once a system can see proprietary knowledge, shape work products, and connect to tools, it stops being a productivity layer. It becomes part of the operating core.

That matters because most companies are still thinking about AI risk in yesterday’s terms: data leakage, bad outputs, brand reputation damage. Those are serious issues, but the bigger risk is delegating authority to AI systems.

Implications now and next

Right now, most enterprises are still in a relatively contained phase. AI drafts, summarizes, searches, recommends. When something goes wrong, the damage is often painful but contained. A team wastes time. A document is exposed. A workflow stalls. Trust takes a hit.

The practical implication is clear: the severity of failure scales with an agent’s capabilities and permissions.

In a future shaped by a well-established agent economy, the perspective must evolve.

When AI agents end up touching X% of knowledge work, Y% of customer interactions, and Z% of routine approvals, the risk profile changes completely. The issue is not just whether an attacker can see something. It is whether they can quietly influence what the business does. This is the shift boards and executive teams need to understand.

In that future, the most important question will no longer be, “Was data exposed?” It will be, “What decisions were shaped by a compromised system?” If an agent can reprioritize work, alter recommendations, steer analysts toward the wrong evidence, or trigger downstream actions, then integrity matters as much as confidentiality.

At IDC, we expect more than 1 billion actively deployed AI agents by 2029, executing roughly 217 billion actions a day, and we forecast that agentic AI will exceed 26% of worldwide IT spending, or $1.3 trillion, that same year.

With that in mind, any vendor or buyer building a Lilli-like platform should now assume they are operating decision infrastructure, not productivity software.

Recommendations for CIOs and CISOs

For CIOs, the priority is to run AI as a governed platform. Standardize approved frameworks, connectors, and protocols; separate confidential and public data; make agents inherit the user’s permissions rather than granting broad ambient access; and maintain a live inventory with owner, version, data sources, and external tool access for every production agent.

Just as important, design for bounded autonomy: default to read-only assistance, push high-impact or irreversible actions behind scoped tools and policy engines, and report three numbers monthly:

  • % of agents with write access,
  • % of critical workflows requiring human approval,
  • X minutes to disable an agent, model, or connector.
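The three monthly numbers above can be computed mechanically from an agent inventory. The sketch below is illustrative only: the `Agent` and `Workflow` records, their names, and all the figures are hypothetical, not drawn from IDC research or any real platform.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    has_write_access: bool  # can the agent mutate systems, not just read?

@dataclass
class Workflow:
    name: str
    critical: bool                  # is this a high-impact workflow?
    requires_human_approval: bool   # is a human in the loop before actions land?

def governance_metrics(agents, workflows, minutes_to_disable):
    """Compute the three monthly numbers suggested above.
    Assumes non-empty agent and critical-workflow lists (it is a sketch)."""
    pct_write = 100 * sum(a.has_write_access for a in agents) / len(agents)
    critical = [w for w in workflows if w.critical]
    pct_approval = 100 * sum(w.requires_human_approval for w in critical) / len(critical)
    return {
        "pct_agents_with_write_access": pct_write,
        "pct_critical_workflows_with_human_approval": pct_approval,
        "minutes_to_disable": minutes_to_disable,
    }

# Hypothetical inventory
agents = [Agent("research-assistant", False), Agent("ticket-triage", True),
          Agent("report-drafter", False), Agent("invoice-bot", True)]
workflows = [Workflow("vendor-payment", True, True),
             Workflow("doc-summarization", False, False),
             Workflow("customer-refund", True, False)]

print(governance_metrics(agents, workflows, minutes_to_disable=12))
```

The point of the sketch is that these are cheap numbers to produce once a live agent inventory exists, which is exactly why the inventory itself is the prerequisite.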

For CISOs, the core mistake to avoid is treating AI as a feature instead of a system. Threat-model it with AI-native frameworks such as MITRE’s Adversarial Threat Landscape for Artificial-Intelligence Systems (ATLAS), National Institute of Standards and Technology (NIST) Artificial Intelligence Risk Management Framework (AI RMF), and Open Worldwide Application Security Project (OWASP)’s GenAI and agentic guidance, then red-team the whole chain: inputs, retrieval, memory, tool use, rendering, and agent-to-agent handoffs.

The control baseline should be familiar but upgraded for agents: least-privilege tools, isolated memory, validated inputs, sanitized outputs, full logging of tool calls and decisions, security operations center (SOC) integration and an AI-specific incident playbook with a real kill switch.

This is one of the few cyber investments with a measurable business case, and one to keep in mind if you have a chance to review our Future Enterprise Resiliency and Spending (FERS) survey series. In the Future Enterprise Resiliency and Spending Wave 10 survey, from January 2026, you will notice that organizations are already funding AI/agent security and governance at near parity with the rest of the AI stack, accounting for 16.7% of planned AI investment on average worldwide.

What CEOs should tell their boards

CEOs should tell boards three things:

First: “We are not just deploying AI; we are delegating authority.”

Second: “Success will be measured by controlled autonomy, not by agent count.”

The board pack should show:

  • number of production agents
  • number with external tool access
  • percent with high-impact permissions
  • percent under red-team coverage
  • mean time to revoke access
  • third-party dependencies per critical workflow.

Third: “This is now a supply chain and resilience issue.”

In other words, a weak link in any model provider, cloud platform, connector, or data source can disrupt core workflows, not just expose data.

Once again, our FERS survey series can offer some context as we evaluate these three statements. In the Future Enterprise Resiliency and Spending Wave 8 survey, from October 2025, security, risk, and compliance was among the areas most immune to budget reduction worldwide at 25%, and 29% of organizations ranked “Enhancing cyber recovery and resiliency” among the top areas for significant budget increases in 2026.

To close, let’s consider the opposite perspective.

The right board question is no longer, “How many agents do we have?” It is, “How much authority have we delegated, to which systems, under what controls?”

Boards do not need more demos. They need evidence that management can set clear bounds on autonomy, observe it continuously, and shut it down fast.

Alessandro Perilli - Vice President, AI Research - IDC

Alessandro Perilli is a Vice President leading the Agentic AI Platforms and Strategies research program. Alessandro’s core research coverage includes emerging AI technologies, global AI market trends, the state of enterprise AI, and sovereign AI.

Key takeaways from IDC’s Telco Forum 2026 Barcelona

On March 1, 2026, IDC brought together senior telecom leaders, vendors, system integrators, cloud leaders, partners, and media in Barcelona to examine how the industry is evolving in an AI-driven world. The discussions reinforced a clear message: telecom transformation is no longer theoretical. It is structural, financial, operational, and increasingly sovereign. Drawing on insights shared across the event, this blog captures the major themes shaping telecom strategy through 2030.

The four megatrends shaping telecoms through 2030

Four compounding megatrends have been reshaping the sector since 2022. Looking back, the telco industry has moved through a rapid succession of technological focal points: Network APIs as foundational enablers for exposing network capabilities; Generative AI as the entry point for process automation; and Agentic AI in 2025, which introduced autonomous decision-making into customer experience, network management, and enterprise solutions. In 2026, the critical new frontier for AI is inferencing: the shift from model training to real-time, distributed AI workload execution. It is this transition that is forcing telcos to fundamentally rethink their infrastructure architecture and competitive positioning. Underpinning all of this are the four defining themes of 2026:

  • Structural transformation is intensifying. Business model reinvention is not new for telcos, but the pace has accelerated. Four distinct strategic paths are now in play simultaneously: the TechCo transition (embracing network-as-a-platform models), Delayering (separating ServCo, NetCo, and InfraCo entities to optimize asset utilization), Consolidation, and a redefined form of Convergence that bundles fixed, mobile, and satellite services, designed to lock in ARPU and reduce churn.
  • Network investment is tapering. The cyclical CAPEX peak from the 5G non-standalone rollout has passed in high- and middle-income markets. IDC forecasts a 1.5% decline in global telecom CAPEX in 2026, bringing the total to $320 billion, with CAPEX intensity projected to fall from 22% in 2024 toward 18% by the end of the decade. The drivers are multiple: the one-off FTTP spending peak is fading, satellite partnerships are smoothing access and transport investment, and a structural CAPEX-to-OPEX shift is underway as telcos increasingly rely on ISVs and cloud providers for virtualization and AI. The freed-up cash is flowing into shareholder returns, strategic investments, and targeted digital infrastructure plays.
  • AI adoption is crystallizing around inferencing and sovereign AI. In 2025, the buzzword was agentic AI. In 2026, it is inferencing, and the telcos have a genuine structural advantage to capitalize on it. With distributed infrastructure, low latency, and deep regulatory trust in their home markets, telcos are positioned to become national AI factories, delivering sovereign AI solutions to governments, healthcare systems, and regional enterprises. IDC’s survey data shows that AI compute spending is approaching a pivot point in 2027, when inferencing will overtake training as the dominant driver of AI infrastructure investment. Telcos that expand their data center footprint and deepen relationships with co-location providers now are positioning ahead of that curve.
  • LEO satellite partnerships are becoming strategic. Starlink has established an early lead as the preferred satellite partner for telcos globally. Use cases vary significantly by geography – D2D and satellite broadband in the US and Canada’s large, underserved coverage areas, disaster recovery in Europe’s dense markets, and transport and backhaul across Asia Pacific and Australia. What is clear across all regions is that the satellite-terrestrial boundary is dissolving into a hybrid connectivity model, and the telcos that forge the right partnerships now will have a differentiated coverage story that competitors simply cannot replicate terrestrially.

Balance, pivot, revolt: the transformation imperative

Telcos’ internal transformation focus for 2026 can be framed with three sharp words: balance, pivot and revolt.

Balance is the defining tension of 2026. According to IDC’s C-Suite Tech Survey (September 2025, n=45 telecom respondents), 52% of telco C-suite leaders have AI implementation as a top-three priority, but 50% simultaneously have technology modernization as a top-three priority. These can be complementary investments, as telcos cannot get full value from AI if they have not addressed legacy system complexity, data governance gaps, and architectural debt; but they also compete, as telcos must decide between investing in new capabilities that promise significant gains and unglamorous IT modernization initiatives that have often been neglected for years. This comes at a time when funds for transformation are finely balanced: telecom CAPEX is declining, though IDC forecasts modest 5.2% growth in spend on operations and monetization systems in 2026, reaching $54 billion, as telcos invest in IT systems to monetize the billions in CAPEX invested in rolling out new wireless and fixed networks. That 5.2% growth is far from a blank cheque: every dollar deployed in IT must demonstrably either cut cost or support new revenue.

The autonomous networks aspiration illustrates this balancing act with particular clarity. TM Forum data from 2025 shows that only 4% of operators self-reported achieving Level 4 autonomous network status, yet 85% aspire to reach that level by 2030. That is an extraordinary gap. According to IDC’s EMEA Telco Transformation Survey (July 2025, n=150), the barriers are familiar: interoperability failures and the persistent lack of a single source of truth in network data. Notably, these are precisely the same barriers that have constrained AI adoption more broadly.

Pivot means making deliberate choices about where to invest and what to sequence. Data quality, accessibility, and security are all in focus in 2026, reflected in telcos investing to overhaul their network inventory systems and updating their data governance policies and infrastructure from customer data down to the network. For autonomous networks, IDC’s research points to a more granular, domain-specific approach gaining traction: telcos are identifying specific use cases (service assurance and fault management are the top automation priorities for EMEA telcos in 2026) and targeting specific domains (IP access, RAN, and core) for Level 4 capability. This is far more tractable than a blanket push to full autonomy. On the people side, 97% of telcos recognize gaps in their talent base for developing and using AI at scale. Sixty-five percent are investing in AI-enabled learning tools, and 58% are expanding internal upskilling programs, but with only 42% currently offering skills training, there is still a meaningful gap between recognition and action.

Revolt is the urgent call to fix customer commercialization before AI finally demolishes the buying behaviour telcos have relied on for decades. For example, a UK mobile subscriber paying £15 per month for 10GB, regularly consuming just 6GB, with known Disney+ and international roaming usage, was on renewal offered a device upgrade, an increase to 30GB for £18, or unlimited data for £24. There was no demand signal for a device, no upsell of complementary services, and no personalization of any kind. The customer then found a 40GB plan from the same mobile provider on a comparison site for £7.50: a 50% ARPU reduction and a 400% value giveaway. Comparison sites have been empowering consumers to find the best deal, with some manual effort, for well over a decade. Today’s consumers and enterprises, however, are already beginning to use AI to undertake similar comparisons with far less manual effort.

The point is not just that this particular offer was poorly designed. The point is that the entire commercial model relies on customer inertia, and AI is systematically dismantling that inertia. As AI agents increasingly make purchasing decisions on behalf of consumers and enterprises, operators that cannot demonstrate differentiated, personalized value in real time will find their customer bases eroding with a speed and scale unlike anything seen before.

5G: from product to platform, and the 6G horizon

The back half of the 5G lifecycle represents an inflection point, but only if operators change their frame of reference. Core mobile remains solid: IDC projects a 2.0% CAGR in global mobile connections through 2029. The world will exceed 9 billion mobile connections within the next two years, surpassing the current global population of 8.3 billion. In saturated markets, however, the growth lever has shifted decisively from subscriber acquisition to retention and value extraction — which brings the customer experience and commercialization issues directly back into focus.

The bigger opportunity lies in the shift from 5G as a product to 5G as a platform. For the first five years of 5G, operators sold speed, latency, and connection density. The next phase is less about branding a connection as 5G and more about 5G as the underlying infrastructure that enables XR, drones, V2X, private 5G, and RedCap solutions to be viable, scalable, and mobile. The challenge is that these use cases do not scale in the millions the way mobility or FWA does; they scale in the tens of thousands. That requires a fundamentally different approach to network architecture, back-end systems, and, critically, business models.

Integration complexity remains the most significant brake on enterprise 5G adoption: 46% of enterprises cite it as the primary adoption barrier. The solution is less ego and more ecosystem: operators need to be willing to play a back-end role in partner-led solutions rather than insisting on front-facing primacy. 74% of enterprises express interest in network slicing; 49% plan to increase fixed wireless access investment; and 58% say they are interested in satellite connectivity, though many still have significant misconceptions about what satellite-to-device actually delivers today. Expectation management is part of the product.

On 6G: if the industry maintains the ten-year generational cycle, commercial launches would begin around 2029. Technical specifications are still in the study phase at 3GPP. The defining features of 6G (AI-native architecture enabling autonomous self-optimization, integrated sensing that turns every cell tower into a radar station, quantum-resistant security, and new terahertz spectrum) collectively point toward a network that moves AI out of the data center and into the physical world. The concept of “physical AI,” or what one operator CTO termed “kinetic tokens,” suggests that 6G will not merely support AI-driven applications but will provide the real-time connectivity substrate that makes physical AI (autonomous robots, connected vehicles, intelligent infrastructure) a viable commercial reality.

The enterprise connectivity opportunity: vast, varied, and underserved

Enterprise connectivity budgets are growing. IDC’s Future Enterprise Connectivity Infrastructure and Services Survey (August 2025, n=758) shows that 37.5% of enterprises increased their connectivity budget by more than 10% over the last two years. For 2026, that proportion rises to 44%. The primary drivers are cloud migration, SaaS usage, AI, video, and IoT, as device density pushes up bandwidth requirements. Four in ten enterprises saw bandwidth demands increase by more than 50% over the past year. Among organizations with over 10,000 employees, 17% saw their bandwidth demands double. Retail and financial services lead in cumulative bandwidth growth, but the opportunity is sector-wide: only 40-46% of enterprises are at an advanced or market-leading stage of connectivity maturity. The majority are still on the journey and actively looking for guidance.

The question of who captures this opportunity, however, is not straightforward for network service providers. When IDC asked enterprises which provider types they see as best and worst placed to address their future WAN requirements, cloud providers ranked first at 29%, followed by IT partners at 28%, with network service providers third at 23%. More pointedly, in the “worst placed” ranking, network service providers came second. The reasons cited: not treating customers well (35%), limited IT and network capability (28%), and being difficult to work with (26%).

This is a reputational and structural challenge, not just a product one. Cloud providers are perceived as having broad network capability, even though they fundamentally depend on telco partners for last-mile delivery. IT partners are perceived as having deep industry expertise, expertise that telcos themselves possess but have not been able to communicate or commercialize effectively. The gap is therefore not simply about capability. It is about perception. Perception shapes purchasing decisions, which in turn shape market reality.

Encouragingly, telcos’ “best placed” positioning has improved in recent years as operators have prioritized customer experience and simplified portfolios to deliver more flexible, scalable, and accessible services aligned with enterprise demand. Network as a Service, or NaaS, is central to this shift. NaaS is a cloud-based delivery model in which connectivity, bandwidth, security, and routing are provisioned and consumed on demand via APIs or self-service portals. It abstracts the underlying physical infrastructure and allows enterprises to scale, configure, and optimize network resources without directly owning or managing hardware. Enterprise sentiment toward NaaS remains mixed: 32% said it could make it easier or cheaper for a service provider to manage their networks and security, and 26% said it could simplify self-managed network operations. But 19% said they would not want to be locked into one service provider’s platform regardless of the benefits, and 10% remain unfamiliar with NaaS entirely. The education gap is significant, and closing it will require more than technical refinement. It demands commercial clarity, stronger communication, and deeper customer relationships. Ultimately, this is not just a transformation in network architecture. It is a transformation in trust, positioning, and perceived value.

The bottom line

The IDC Telco Forum 2026 in Barcelona surfaced a market that is, in many respects, more coherent in its direction than at any point in recent years, but also more demanding of execution discipline than most operators have yet demonstrated.

The opportunity in AI inferencing and sovereign infrastructure is real and structurally aligned with telcos’ natural positioning. The satellite-terrestrial convergence is creating a coverage differentiation story that was not available five years ago. The enterprise connectivity market is expanding, budget-rich, and hungry for strategic guidance. And 5G, finally maturing beyond its early-product phase, is approaching its platform moment.

But against each of these opportunities sits a structural challenge that must be addressed in parallel: legacy system complexity is limiting AI value extraction; autonomous network ambitions are outpacing organizational readiness; commercial and CX systems are still leaving significant value on the table; and enterprise perception of telcos’ breadth and quality of service lags behind the reality.

The telcos that will win this decade are those that treat these not as separate workstreams but as a single integrated transformation, one where the investment in networks, IT modernization, talent, customer experience, and ecosystem partnerships compounds into a durable competitive position. The window is open. The question, as always, is execution.

For more information on IDC’s telecom research, including the newly launched Satellite and NTN research program, contact your IDC account manager or drop your details in here.

Download a copy of the State of the Telco Market ebook here.

Masarra Mohamad - Senior Research Analyst, European 5G Enterprise Strategies - IDC

Masarra Mohamad is a senior research analyst specializing in the connectivity and communications services markets, focusing on the changing networking requirements, trends, and competitive dynamics that support enterprises in their digital transformation. She explores how enterprise network strategies evolve to enable cloud, AI, and security.

The latest tariff developments are creating cost pressure and uncertainty across the global technology supply chain. For Japanese companies, whose component sourcing, production, assembly, and logistics often span multiple countries, tariff changes can force a rethink of pricing strategy and supply chain design.

At IDC, Simon Ellis (Group Vice President, Manufacturing and Supply Chain) and Phil Solis (Research Director, Connectivity and Smartphone Semiconductors) discuss below how the latest tariff developments affect pricing, manufacturing strategy, and long-term investment decisions across the technology ecosystem.

Original article published February 25, 2026 (in English) | Japanese version supervised by 寄藤 幸治

The biggest near-term challenge: uncertainty

While the Supreme Court has ruled some past tariffs unlawful, new tariffs are being introduced. As a result, the cost exposure remains, and the bigger emerging challenge is unpredictability.

"There is very little clarity on these matters. What is true on Monday is no longer true on Tuesday. The structural responses companies need to undertake are not a matter of minutes, hours, days, or weeks; they take months, sometimes years. That is why it is so hard to see what the right decision is."
– Simon Ellis (IDC)

For manufacturing and supply chain leaders, structural decisions such as capital investment, changes in sourcing, and regional diversification are made on multi-year horizons. In an environment where policy direction can shift in short order, companies face difficult choices between proceeding, delaying, or accepting additional risk.

Pricing discipline in a volatile environment

The cost impact of tariffs comes on top of other cost pressures, such as rising memory prices affecting smartphones, PCs, and servers.

"If you believe these tariffs are here to stay for a while, prices will be higher to account for them. It is hard to lower prices and then raise them again. It is just too disruptive."
– Phil Solis (IDC)

Pricing decisions are not easy. When costs rise, prices typically follow. When costs fall, however, prices do not necessarily fall at the same pace. In an environment where additional tariffs may persist, companies tend to avoid cutting prices only to raise them again later, and so remain cautious in their pricing responses.

Cross-border complexity and tariff stacking

Modern technology products commonly cross borders multiple times before becoming finished goods: a semiconductor is imported, built into a module, integrated into a subsystem, and assembled into a final product.

Because each stage can incur additional cost impact, tariffs "stack," amplifying price pressure across the entire value chain. For companies operating complex global supply networks, tracking and tracing which jurisdictions components and products have passed through is becoming increasingly important for both cost management and compliance.

Balancing efficiency and resilience

Since the pandemic, companies have faced an ongoing tension between supply chain efficiency and resilience. Tariffs add a new source of disruption to that balance.

Raising resilience through multi-sourcing and excess capacity lowers risk but adds cost. Technology decision-makers need to determine where flexibility is essential and where efficiency can take priority.

Choosing the next move

Major manufacturing and infrastructure investments are often decided on horizons of 10 or 20 years. In an environment where policy swings in the short term, long-term planning becomes even more complex.

The central challenge shared by technology vendors, manufacturers, and technology buyers alike is maintaining disciplined decision-making amid continued uncertainty.

For IDC's more detailed views on how the latest tariff developments could affect the technology market in the coming months, watch the conversation (video) in the article.


Related research and inquiries

For deeper insights and market trends, please feel free to contact our analysts.

Let me share a conversation I had with the CIO of a reasonably large manufacturing company. The company has implemented the typical commercial software (ERP, CRM, HRIS, etc.) and has a team of 24 software engineers to design, build, deploy, and support the organization’s niche application needs. Under the direction of the CIO, the software engineering team has adopted a vibe programming platform that has dramatically reduced the time and effort needed to create these niche applications. The CIO told me that it now takes about one-sixth of the time and effort to create an application compared with the past.

This reduced time and effort to build an application has created a challenge for the CIO. The CEO of the company has reacted to this increased productivity by suggesting that the CIO can now get rid of five-sixths of the software engineering team. This is not something the CIO wants to do but he is struggling with how to rationalize leaving the software engineering team intact.

The CIO’s challenge is the same challenge that any organization faces when technology and process improvements increase the productivity of the people in the organization. What should the organization do with the time and effort capacity that the productivity creates?

In general, there are two ways to handle excess capacity that improved productivity creates. The correct way is to use that newly created capacity to improve growth. The incorrect way is to eliminate the newly found capacity by getting rid of the people whose productivity just increased. I’ll explain using a manufacturing example.

A manufacturing lesson in capacity and profit

The example organization makes concrete block. An analysis shows that the daily production of concrete block is heavily affected by the number of times each production line must be shut down to change over the block making machine. If the changeover involves both a mold and color change, the typical shutdown time is 45 minutes. If only a mold change is required, the shutdown time is 20 minutes. By analyzing and improving the changeover process, it now takes only 15 minutes for a color and mold change and 8 minutes for a mold change. The net result is, on average, a 2-hour increase in the daily production time for each manufacturing line. Improving the changeover process has created 2 hours of additional capacity for each line.
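The 2-hour figure depends on how many changeovers of each type a line performs per day, which the example leaves unstated. A minimal sketch in Python, using a hypothetical daily changeover mix (my assumption, not from the article) that happens to add up to two hours:

```python
# Changeover durations before and after the process improvement (minutes).
BEFORE = {"color_and_mold": 45, "mold_only": 20}
AFTER = {"color_and_mold": 15, "mold_only": 8}

def daily_minutes_recovered(changeovers_per_day: dict) -> int:
    """Production minutes recovered per line per day for a given changeover mix."""
    return sum(
        (BEFORE[kind] - AFTER[kind]) * count
        for kind, count in changeovers_per_day.items()
    )

# Hypothetical mix: 2 color+mold changes and 5 mold-only changes per day.
mix = {"color_and_mold": 2, "mold_only": 5}
print(daily_minutes_recovered(mix))  # 2*30 + 5*12 = 120 minutes, i.e. 2 hours
```

Any mix whose savings (30 minutes per color-and-mold change, 12 per mold-only change) sum to about 120 minutes yields the article's 2-hour figure.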

What should the company do with these hours? One choice (the incorrect choice) would be to reduce the hours of the staff. The organization could do this by consolidating production lines and laying off a percentage of the workforce. If the company does this, it will have reduced its costs by some percentage and increased its profits by a similar percentage.

Why is this the incorrect choice? Because the company is much better off using the newly found capacity to grow both revenue and profits by building more block while continuing to operate the manufacturing line for the same amount of time as before. What does it cost the company to build an additional 2 hours’ worth of block per day on each production line? Just the cost of the raw materials used in the additional block. There is no increased labor cost because the staff is already “paid for” and now using the same amount of time to produce more block. As long as there is a demand in the market for more block, the additional 2 hours of produced block per line are the most profitable blocks the company creates, and they grow revenue at a lower cost per unit.
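The marginal-cost point can be made concrete with hypothetical unit economics (all figures below are illustrative assumptions, not from the article): once labor and overhead are sunk, only materials count against the incremental block.

```python
# Hypothetical per-block figures (illustrative only, in dollars).
price = 2.00
materials = 0.60
labor_and_overhead = 0.90  # already paid whether or not the extra block is made

# Fully loaded margin on a baseline block.
baseline_margin = price - materials - labor_and_overhead

# Margin on a block produced in the recovered 2 hours: only materials are marginal.
incremental_margin = price - materials

print(f"baseline: {baseline_margin:.2f}, incremental: {incremental_margin:.2f}")
# prints "baseline: 0.50, incremental: 1.40"
```

Under these assumed numbers, each incremental block earns nearly three times the margin of a baseline block, which is why the article calls them the most profitable blocks the company creates.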

Clearing the backlog instead of cutting the team

Let me return to my conversation with the CIO of the manufacturing company. I asked him how many projects the company had on its niche application request list. He told me it was a backlog of two to three years. I next asked him how many projects had never been requested because the existing request list was so long or because the projects did not have sufficient value cases. How much longer would the application request list be if it did not take so much time or effort to build a niche application? He guessed the list would approximately double in size.
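The backlog arithmetic in that exchange is worth making explicit. A rough sketch, assuming the figures from the conversation (a 6x productivity gain, a roughly 3-year backlog, and a list that doubles):

```python
def years_to_clear(backlog_years: float, speedup: float, growth_factor: float = 1.0) -> float:
    """Years to clear a backlog sized in pre-improvement delivery years,
    after a productivity speedup and optional growth of the list itself."""
    return backlog_years * growth_factor / speedup

# Hypothetical: a 3-year backlog delivered 6x faster clears in half a year...
print(years_to_clear(3, 6))                   # 0.5
# ...and even a doubled list clears in a single year.
print(years_to_clear(3, 6, growth_factor=2))  # 1.0
```

Even with a doubled request list, the same 24 engineers could clear in about a year what would previously have taken them six.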

I recommended that he approach the CEO with the following reasoning about what to do with his 24 software engineers:

  • “Now that we can build and deploy applications with one-sixth of the time and effort of the past, we should keep every member of the team, accelerate the building and deployment of every application currently on the list, and gather every meaningful idea for other projects to add to the list.”
  • “If every project on the list creates value, then shortening the time to value delivers that value sooner and improves the profits and performance of the organization much more than cutting the newly found application delivery capacity would.”
  • “Without increasing costs (the company is already paying for its team of software engineers), the company can accelerate its growth and do it more profitably.”
  • “In fact, since it takes less time and effort to build and deliver an application, the threshold for what delivers value has also changed, and so projects that might not have been included on the backlog in the past might now belong on the backlog.”

The broader AI productivity question

And this thinking extends well beyond the use of Vibe programming.

Let’s suppose that a marketing team, using AI, can now create marketing campaigns and content in half the time it took pre-AI. The incorrect way to treat this newly found marketing team capacity is to cut the capacity by laying off 50% of the marketing team. What is an alternative that benefits the company more than cutting the marketing team? What would drive improved growth? What specific types of marketing campaigns and content? What could the marketing team do with its additional 50% capacity that it could not do before? How about campaigns and content targeting specific customer personas? Or, even better, specific customers? The team now has time to do things it never had before. With this capacity, what could it do to improve the growth of the organization? And remember that whatever the team does to improve the growth of the company comes at a much lower cost (the company has not increased the marketing team’s cost — just achieved improved results for that cost).

Lead with growth, not fear

There are certainly cases when cutting the capacity that comes from AI will make sense, but the initial reaction to improved productivity should be to identify ways to use that productivity to increase growth (at a similar cost). It might take some innovation cycles to discover what the organization can now do that it never could before. But that effort seems worthwhile; in my experience, profitable growth is always a good thing.

Niel Nickolaisen - Adjunct Research Advisor, IT Executive Program - IDC

Niel Nickolaisen is an adjunct research advisor for IDC’s IT Executive Programs (IEP). He is considered a thought leader in the use of Agile principles to improve IT delivery. And he has a passion for helping others deliver on what he considers to be the three roles of IT leadership: enabling strategy, achieving operational excellence, and creating a culture of trust and ownership.