Markets and Trends · March 24, 2026 · 6 min read

AI can’t run on stale data: Why enterprises are rethinking their architecture

As enterprises push toward faster and more automated decision making, traditional data architectures are starting to show their limits. The gap between when data is generated, when it is analyzed, and when it is acted on is becoming a critical challenge, especially as AI moves closer to real-time operations.

In this conversation, Devin Pratt, Research Director for Data Management at IDC, explores what this shift means in practice, from converged workloads to the growing importance of real-time data for agentic AI, and how organizations can take a practical approach to modernizing their data environments. Recent platform announcements in the market have reinforced this shift toward more unified, real-time data architectures.

You recently outlined converged workloads as a framework for the real-time enterprise. How should leaders think about this model alongside traditional separated architectures?

Devin Pratt: When I say converged workloads, I mean bringing transactions, analytics, and AI closer to the same live data so businesses can respond faster. I would not frame this as an old-versus-new or rip-and-replace decision. Separate transactional and analytical systems were built for good reasons, and those reasons still matter.

What has changed is the speed the business now expects between an event, an insight, and an action. This is why leaders should think in terms of selective convergence. Where timing matters, converged workloads bring live operational data together with the analytics needed to understand it in real time. That helps organizations respond faster and make better decisions.

It is especially important for agentic AI. If you want real ROI from agentic AI, it cannot run on stale data. It needs live operational data to understand what is happening now, and it needs analytics to interpret that data and guide the right action in real time.

The goal is not convergence for its own sake. It is to converge where faster insight, faster action, and AI-driven automation create real business value.

IDC’s 2026 FutureScape predicts that by 2029, 60% of enterprise data platforms will unify transactional and analytical workloads. What is driving that shift?

Devin Pratt: The shift is really about speed. Organizations want to reduce the delay between an event, the analysis of that event, and the action that follows. Converged workloads help make that possible by bringing operational data and analytical processing closer together in real time.
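The loop Pratt describes, from an event to an insight to an action over the same live data, can be sketched as a single process with no batch handoff in between. The class, window size, and flagging threshold below are hypothetical illustrations of the idea, not any vendor's implementation:

```python
from collections import deque

class ConvergedPipeline:
    """Toy event -> insight -> action loop over one live data store.

    Transactional updates and analytical reads share the same in-memory
    state, so there is no delayed copy between the two steps.
    """

    def __init__(self, window=5):
        self.balances = {}                   # live operational state
        self.recent = deque(maxlen=window)   # rolling analytical window

    def on_event(self, account, amount):
        action = None
        # Analytical step: interpret the event against recent history.
        if len(self.recent) == self.recent.maxlen:
            avg = sum(self.recent) / len(self.recent)
            if amount > 3 * avg:  # arbitrary illustrative threshold
                action = f"flag {account}: {amount} vs rolling avg {avg:.1f}"
        # Transactional step: apply the event to live operational state.
        self.balances[account] = self.balances.get(account, 0) + amount
        self.recent.append(amount)
        return action
```

Because both steps read and write the same structure, the decision is made while the event is still in flight; a traditional architecture would instead copy the event into a separate analytical store and act on it later.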

AI is obviously a big part of why this is happening now. Agentic AI depends on current operational data and analytical context, and that puts real pressure on architectures built around delayed copies and handoffs.

The technology is also much more mature than it used to be; modern platforms are better able to run mixed transactional and analytical workloads reliably.

The bigger point is that convergence is becoming a mainstream way to support real-time decision making, continuous intelligence, and agentic AI.

When would separate transactional and analytical systems still make sense?

Devin Pratt: They can still make sense where organizations want stricter workload isolation around critical systems, or where a phased approach is more practical. Not every business process needs a real time response.

If acting immediately does not materially change the outcome, a more traditional approach can still be the right one. So this is not all or nothing. The practical path is to converge where latency really matters and let the rest evolve over time.

Databricks recently announced Lakebase as generally available. What does this tell you about how the market is evolving?

Devin Pratt: It tells me the lines between categories are blurring. Lakehouse vendors are adding more transactional database capabilities, while traditional database vendors are adding more analytics, automation, and AI directly into their platforms.

The bigger point is that buyers want fewer copies, fewer handoffs, less data movement, and stronger governance across the entire environment. They are looking for platforms that are simpler to run and better suited for real-time intelligence and AI.

So I see this as another sign that the market is moving away from rigid categories and toward more unified, AI ready data platforms.

How should organizations evaluate whether to move toward convergence or maintain a traditional model?

Devin Pratt: I would start with one simple question: where does stale data hurt the business? If it is not affecting revenue, customer trust, resilience, or speed, then there is no reason to force convergence.

Then I would look at operating model readiness. Can we run mixed workloads reliably with strong governance and clear visibility into performance and cost? That matters, because most enterprises are already operating across hybrid and multicloud environments.

My advice is to keep this practical. Start with a few high-value, real-time use cases, take a phased approach, re-architect for scale where needed, put governance and observability in early, prove performance and trust, and then expand.

What does the real-time enterprise actually mean beyond faster dashboards?

Devin Pratt: To me, the real-time enterprise is not about better dashboards. It is about sensing what is happening and responding while the moment still matters.

That could mean stopping fraud in the moment, predicting equipment issues before failure, or changing a customer interaction while it is still underway. This is very different from just reporting faster.

This is also where AI agents come in. IDC expects that by 2027, 40% of the Global 2000 will adopt modern event streaming and pre-built real-time data views to support AI agents.
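A pre-built real-time data view of this kind can be pictured as a small stateful consumer that applies change events as they arrive, so an agent always queries current state rather than a delayed copy. A minimal sketch with hypothetical names and event shapes:

```python
class RealTimeView:
    """Materialized view kept current by applying streamed change
    events of the form (op, key, row) as they arrive."""

    def __init__(self):
        self.state = {}

    def apply(self, event):
        op, key, row = event
        if op == "delete":
            self.state.pop(key, None)
        else:  # "insert" and "update" both upsert the row
            self.state[key] = row

    def query(self, key):
        # An AI agent reads live state here instead of a stale copy.
        return self.state.get(key)
```

In production this role is typically played by a stream-processing framework maintaining a materialized view; the point is that the agent's read path never waits on a batch refresh.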

I would describe the real-time enterprise as a shift from looking back at what happened to acting while it is happening.

As AI adoption grows, what architectural considerations should CIOs prioritize right now?

Devin Pratt: First, make trusted data available to AI in real time, even if that data stays in different systems.

Second, build the real-time foundation: streaming data, change data capture, event-driven workflows, and open interfaces that let AI work from live business context instead of stale copies.

Third, put governance, observability, identity, and access at the center. Trust and control have to be built into the architecture from the start, especially as agentic AI becomes more operational.

Finally, keep AI close to the data. Organizations want AI capabilities embedded into the broader data platform, not pushed into another silo.

The goal is to create a trusted, real-time data environment where AI can reason, decide, and act with the right context and guardrails. This is not about putting every workload into one platform. It is about reducing the distance between a business event, a trusted insight, and an action without giving up governance, performance, or control.
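One piece of the real-time foundation mentioned above, change data capture, can be illustrated with a toy diff: the function below turns two snapshots of a keyed table into insert/update/delete events, the shape a CDC stream hands to downstream consumers. All names are hypothetical, and a real log-based CDC pipeline would read the database's transaction log rather than diff snapshots:

```python
def capture_changes(before: dict, after: dict) -> list:
    """Diff two snapshots of a table (rows keyed by primary key) into
    change events of the form (op, key, row)."""
    events = []
    for key, row in after.items():
        if key not in before:
            events.append(("insert", key, row))
        elif before[key] != row:
            events.append(("update", key, row))
    for key in before:
        if key not in after:
            events.append(("delete", key, None))
    return events
```

A downstream consumer can apply these events to keep a live view or an AI agent's context current, which is what replaces periodic bulk copies in an event-driven architecture.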

Christina Cardoza - Content Marketing Manager - IDC

Christina Cardoza is a Content Marketing Manager at IDC, where she specializes in brand content and social media strategy. With a background in journalism and editorial leadership, she has a proven ability to transform complex technology topics into clear, actionable insights.

