Successful AI Transformation: The Role of Data Teams
Reading Time: 3 minutes

This is the second post in my four-part series about AI transformation. The first covered trust, the most important ingredient in this transformation.

As enterprises push toward AI-driven operations, data teams are facing a new mandate. In the past, their role focused primarily on supporting analytics: building data pipelines, managing warehouses, and preparing data for analytics and BI use cases.

Today, the mission is much broader. Data teams must now deliver AI-ready data products that power not only analytics but also machine learning, generative AI, and increasingly, autonomous agents. This shift fundamentally changes the requirements for enterprise data infrastructure.

The New Demands on Data Architecture

AI workloads require a very different data foundation than traditional analytics. AI must:

  • Access live, operational data
  • Combine structured and unstructured information
  • Interpret data using consistent business semantics
  • Operate within governance policies and security controls

Traditional data architectures that rely heavily on copying data into centralized platforms struggle to meet these needs. Data movement introduces latency. Replicated copies multiply governance challenges. And maintaining multiple copies of data increases both cost and complexity.

What organizations need instead is a logical approach to data management.

Delivering AI-Ready Data Products, Logically

Logical data management enables data teams to deliver trusted data products from across all source systems and applications. Because data remains in place, the organization benefits from live access to authoritative sources of truth. Because data is described by a consistent, business-contextualized semantic layer, AI agents use the right data. And because access control and other governance policies are centrally managed, all AI agents’ data usage, across all data products, is kept within the guardrails. This is essential for AI agents that must react to real-world conditions in real time.
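As a purely illustrative sketch (this is not Denodo’s API; every name below is hypothetical), a logical layer can be pictured as three things working together: a registry of live sources queried in place, a semantic mapping from source fields to business terms, and a governance check enforced centrally at query time:

```python
# Toy "logical data layer" sketch. All names are hypothetical, not
# Denodo's API. It illustrates three ideas from the text:
#   1. data stays in place (sources are queried live via callables),
#   2. a semantic layer maps source fields to business terms,
#   3. governance is enforced centrally, at query time, for every consumer.

class LogicalLayer:
    def __init__(self):
        self.sources = {}    # name -> callable returning live rows
        self.semantics = {}  # name -> {source_field: business_term}
        self.policies = {}   # name -> roles allowed to read

    def register(self, name, fetch, field_map, allowed_roles):
        self.sources[name] = fetch
        self.semantics[name] = field_map
        self.policies[name] = set(allowed_roles)

    def query(self, name, role):
        # Central guardrail: every consumer (human or AI agent)
        # passes through the same policy check.
        if role not in self.policies[name]:
            raise PermissionError(f"role {role!r} may not read {name!r}")
        field_map = self.semantics[name]
        # Live access: rows are fetched from the source now, not copied.
        return [
            {field_map.get(k, k): v for k, v in row.items()}
            for row in self.sources[name]()
        ]

# A mock operational system standing in for a real database.
def crm_orders():
    return [{"cust_id": 7, "amt_usd": 120.0}]

layer = LogicalLayer()
layer.register(
    "orders",
    fetch=crm_orders,
    field_map={"cust_id": "customer_id", "amt_usd": "order_amount_usd"},
    allowed_roles={"analyst", "sales_agent"},
)
rows = layer.query("orders", role="analyst")
# An unauthorized role raises PermissionError instead of returning data.
```

A real platform does far more (query optimization, federation, caching), but the shape is the same: consumers see business terms and live data, while policy enforcement happens once, in the layer, rather than per copy.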

The Denodo Platform enables this approach by providing a unified logical data layer across all enterprise systems. Using the Denodo Platform, data teams are better empowered to deliver data products that meet all these requirements, and deliver them quickly. Data teams are thus better able to keep up with the data demands of AI.

Faster Performance for AI Workloads

AI workloads are also far more data- and compute-intensive than the workloads that preceded them, driving ever-higher infrastructure and operating costs.

To support these demands, the latest version of the platform, Denodo Platform 9.4, introduces the Lakehouse Accelerator, which embeds the open-source Velox execution engine to significantly improve query performance and efficiency in lakehouse environments.
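Much of the speedup from an embedded execution engine like Velox comes from vectorized, columnar processing: predicates are evaluated over contiguous column batches, producing selection vectors, rather than interpreting one row at a time. The following is a rough conceptual sketch of that structure in plain Python (it is not Velox code):

```python
# Conceptual sketch of columnar, batch-at-a-time execution, the style
# of processing engines like Velox use. NOT Velox code; plain Python
# to illustrate the idea only.
from array import array

# Columnar layout: each column is one contiguous buffer.
amounts = array("d", [10.0, 250.0, 75.0, 980.0, 45.0])
regions = ["EU", "US", "EU", "US", "APAC"]

def filter_batch(pred, column, batch_size=1024):
    """Evaluate a predicate over whole column batches, producing a
    selection vector (indices of passing rows) instead of
    materializing row objects one at a time."""
    selected = []
    for start in range(0, len(column), batch_size):
        batch = column[start:start + batch_size]
        selected.extend(start + i for i, v in enumerate(batch) if pred(v))
    return selected

# "Which orders exceed 100?" evaluated column-at-a-time; the
# selection vector is then used to fetch only the needed values
# from other columns.
hits = filter_batch(lambda v: v > 100.0, amounts)
high_value_regions = [regions[i] for i in hits]
```

Real engines compile such predicates to native code and run them over SIMD-friendly buffers; the structural idea, selection vectors over contiguous column batches, is what cuts per-row interpretation overhead at lakehouse scale.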

These enhancements enable data teams to:

  • Support higher data volumes from more sources, accessed by more concurrent agentic workloads
  • Improve query performance with lower infrastructure costs
  • Extract greater value from existing lakehouse investments

In many cases, organizations can accelerate AI initiatives without redesigning their existing data architecture. Data does not need to be migrated to yet another platform; Denodo sources data from wherever it already resides, sparing the organization a lengthy, expensive migration project. With the Lakehouse Accelerator, AI’s high volume and concurrency requirements can still be met with data remaining in place, while keeping costs under control.

From Data Engineering to AI Enablement

As AI adoption accelerates, the role of the data team is evolving. Instead of simply moving and preparing data, data teams are becoming providers of trusted data products for the entire enterprise. These products support AI agents as well as the business needs that continue as before: analytics and business intelligence, operational applications, and self-service data sharing.

By delivering these capabilities through a logical data layer, organizations gain a scalable foundation for trusted AI. And that foundation enables the rest of the organization — especially AI teams — to innovate faster.

In the next post in this series, I’ll explore how this trusted data foundation accelerates AI teams’ development and deployment of AI agents.