The rise of composable architectures and AI-powered systems is fundamentally shifting how we think about analytics. Traditional BI—built around centralized dashboards and manual querying—just doesn’t scale in today’s fragmented, real-time world.
We’re moving into a new era where insights need to be faster, more accessible, and more context-aware. LLMs, composable data layers, and natural language interfaces are leading the way.
Here’s how we got here—and where things are going.
The Problem: Traditional Analytics Can’t Keep Up
Data is scattered.
In composable and headless setups, customer data lives across CRMs, CDPs, CMSs, and third-party tools. Stitching it together into a full journey view is hard—and without that, personalization suffers and analysis stays siloed.
Dashboards are noisy.
Every team has its own tools and reporting structure. The result? A pile of dashboards, few shared truths, and too many conflicting signals.
People don’t know what’s possible.
There’s often a big gap between the data orgs collect and the insights they actually use. Without intuitive ways to explore data, key opportunities get missed or buried in noise.
Everything takes too long.
Waiting for analysts, battling with SQL, or just trying to find the right report wastes time. And in fast-moving environments, that delay means you’re reacting, not leading.
The Fix: Start With a Smarter, More Composable Stack
To move faster, we need better foundations. That means architectures built for flexibility, AI-readiness, and real-time scale.
Composable + AI-ready by design
You need seamless integration across platforms—pulling behavioral, transactional, and campaign data into a unified layer where models can actually do their job.
Governance isn’t optional
Clean models need clean data. Clear governance, consistent schemas, and high data quality aren't just nice-to-have—they're requirements for anything AI-driven to work.
Data Mesh + UDM + Medallion Architecture
- Data Mesh gives teams control of their domains while contributing to the larger ecosystem.
- UDM (Unified Data Model) lets you use data from different systems interchangeably.
- Medallion architecture brings order: raw → cleaned → optimized-for-insight.
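The medallion flow above can be sketched in a few lines of plain Python. The bronze/silver/gold layer names follow the common convention; the specific cleaning rules (dedupe, drop nulls) and the final metric are illustrative assumptions, not a prescribed pipeline:

```python
# Minimal sketch of medallion-style refinement: raw -> cleaned -> insight-ready.
# The cleaning and aggregation rules here are illustrative assumptions.
from collections import defaultdict

# Bronze: raw events as they arrive, duplicates and nulls included.
bronze = [
    {"user": "a", "event": "click", "value": 1},
    {"user": "a", "event": "click", "value": 1},    # duplicate row
    {"user": "b", "event": "view", "value": None},  # missing value
    {"user": "b", "event": "click", "value": 3},
]

def to_silver(rows):
    """Silver: deduplicate and drop rows with missing values."""
    seen, out = set(), []
    for r in rows:
        key = (r["user"], r["event"], r["value"])
        if r["value"] is not None and key not in seen:
            seen.add(key)
            out.append(r)
    return out

def to_gold(rows):
    """Gold: aggregate into an insight-ready metric (click value per user)."""
    totals = defaultdict(int)
    for r in rows:
        if r["event"] == "click":
            totals[r["user"]] += r["value"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)
```

In a real stack each layer would be a governed table (Delta, Iceberg, or similar) rather than an in-memory list, but the contract is the same: each layer only reads from the one before it.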
LLMs Are Changing the Game in Analytics
We’ve hit a turning point. Large Language Models are transforming how people interact with data—moving from dashboards and SQL to direct, conversational access to insights.
This shift isn’t just nice to have—it’s a must-have for teams working across composable environments.
Here are three major patterns leading the change:
1. RAG Over Chunked Data
“Ask your data like you talk”
With Retrieval-Augmented Generation (RAG), users don’t need to hunt through dashboards. They just ask:
“What drove PDF downloads last week?”
“Which campaigns underperformed in Q1 and why?”
Behind the scenes, the system:
- Chunks and embeds your data
- Retrieves only what’s relevant
- Generates a human-friendly answer using an LLM
It makes hidden insights accessible—fast, contextual, and clear. But to get it right, you need good chunking logic, precise retrieval, and an architecture that can scale without lag.
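The three steps above (chunk/embed, retrieve, generate) can be sketched end to end. To keep the example self-contained, the embedding is a toy bag-of-words vector and the generation step is a stub that formats the retrieved context; in a real RAG system both would be model calls:

```python
# Toy sketch of the RAG loop: embed documents, retrieve by similarity,
# then hand the retrieved context to a generation step (stubbed here).
from collections import Counter
import math

# Pre-chunked documents; real pipelines split sources with chunking logic.
docs = [
    "PDF downloads rose 40% last week after the pricing-guide email",
    "Q1 paid-social campaigns underperformed on click-through rate",
    "Homepage redesign shipped in March",
]

def embed(text):
    """Toy embedding: bag-of-words counts (stand-in for a model embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question, k=1):
    """Retrieve only the k most relevant chunks for the question."""
    q = embed(question)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def answer(question):
    context = retrieve(question)
    # Stub for the LLM step: a real model would synthesize prose from context.
    return f"Based on: {context[0]}"

print(answer("What drove PDF downloads last week"))
```

The shape is what matters: retrieval narrows the context before the LLM ever sees it, which is what keeps answers grounded and costs bounded.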

2. Natural Language to SQL
“Query without knowing SQL”
With the right prompt engineering and schema descriptions, anyone can ask real questions in plain English and get live answers.
Example:
“Which campaigns had the highest engagement last quarter?”
Here’s what happens:
- LangChain builds a query from the natural language input
- The SQL runs against Spark or your engine of choice
- The result is summarized in a business-friendly format
This democratizes access to data. No backlog. No bottlenecks. Just answers.
Of course, you still need schema clarity, performance guardrails, and governance in place to keep it safe and accurate.
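The three-step flow above can be sketched with the LLM step stubbed out. The `translate()` function stands in for the model call (which would receive the schema description in its prompt), and SQLite stands in for Spark; the table, rows, and hard-coded query are illustrative assumptions:

```python
# Sketch of the NL-to-SQL flow: question -> generated SQL -> execution -> summary.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE campaigns (name TEXT, quarter TEXT, engagement REAL)")
conn.executemany(
    "INSERT INTO campaigns VALUES (?, ?, ?)",
    [("spring_sale", "Q1", 0.42), ("newsletter", "Q1", 0.31), ("retarget", "Q1", 0.55)],
)

SCHEMA = "campaigns(name TEXT, quarter TEXT, engagement REAL)"

def translate(question: str) -> str:
    """Stub for the LLM step: the prompt would include SCHEMA + question."""
    # Hard-coded for illustration; a real system generates this per question.
    return ("SELECT name, engagement FROM campaigns "
            "WHERE quarter = 'Q1' ORDER BY engagement DESC LIMIT 1")

def ask(question: str) -> str:
    sql = translate(question)
    name, score = conn.execute(sql).fetchone()
    # Summarization step: format the raw result for a business audience.
    return f"'{name}' led last quarter with engagement of {score:.0%}"

print(ask("Which campaigns had the highest engagement last quarter?"))
```

Note the guardrail implied by the design: the generated SQL is plain text you can validate (allow-list tables, cap row counts, require read-only access) before it ever touches the engine.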

3. Agentic Workflows
“Analytics on autopilot”
This is the next evolution—LLM agents orchestrating full analytic flows from question to answer to action.
A user might ask:
“How are my North America retention numbers trending?”
An agent breaks that down, calls the right tools (SQL, Python, Spark), summarizes the output, and maybe even posts it in Slack or kicks off a report.
These agentic patterns power:
- Automated KPI tracking
- Personalized report generation
- Embedded analytics in everyday tools
They turn your data into a living system—always-on, intelligent, and responsive. But they do come with complexity: retries, tool orchestration, and cost/performance management become part of the equation.
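The orchestration pattern above can be reduced to a small loop: a planner decomposes the question into tool calls, each tool runs, and the final result is summarized. Here the planner and both tools are stubs with canned data; in a real agent the plan would come from the LLM and the tools would hit SQL, Python, or Spark:

```python
# Minimal sketch of an agentic loop: plan -> call tools in order -> summarize.
# Tool names, the canned data, and the fixed plan are illustrative assumptions.

def sql_tool(query: str):
    """Stand-in for a SQL engine; returns canned monthly retention numbers."""
    return {"Jan": 0.81, "Feb": 0.79, "Mar": 0.83}

def python_tool(series):
    """Compute the direction of the trend from the retention series."""
    values = list(series.values())
    return "up" if values[-1] > values[0] else "down"

TOOLS = {"sql": sql_tool, "python": python_tool}

def plan(question):
    """Planner stub: a real agent would have the LLM emit this plan."""
    return [
        ("sql", "SELECT month, retention FROM kpis WHERE region = 'NA'"),
        ("python", None),  # None means: operate on the previous tool's output
    ]

def run_agent(question):
    result = None
    for tool, arg in plan(question):
        result = TOOLS[tool](arg if arg is not None else result)
    return f"North America retention is trending {result}"

print(run_agent("How are my North America retention numbers trending?"))
```

Production versions add exactly the complexity the section mentions: retries around each tool call, validation of the plan before execution, and budgets on tokens and runtime.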
The Big Shift: Analytics That Works the Way People Think
We’re not just evolving BI—we’re replacing it.
From dashboards to dialogs. From reports to real-time responses. From centralized tooling to decentralized, intelligent systems.
LLMs are the missing layer that makes this transition possible.
And if you’re already working with Spark, Fabric, Databricks, LangChain, or SQL Server—you’re already part of this shift.