AI and Analytics, from First Principles

If you'd asked me ~1.5 years ago what AI could do for analytics, I'd have said it generates SQL and fixes syntax. That was cool.

Obviously, that's not where we are anymore.


The agent has an environment now

The biggest shift is that agents don't just write SQL anymore. They have a full environment: they can write SQL, run Python, query the database, check their own results, and iterate. Think of it like giving someone a laptop and database access instead of asking them to write queries on a whiteboard.

~1.5 years ago: "What's Q2 revenue?" → SELECT SUM(revenue)... → a number. A whiteboard query. One shot. Done.

Now: write SQL, run Python, query the database, check its own results, adjust logic, iterate and verify. A laptop with database access, not a whiteboard.

An agent can now try to build a model by running exploratory analysis to see if the output makes sense, adjusting its logic, and verifying along the way. It's not just generating code. It's doing the work.
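That loop can be sketched in a few lines of Python. Everything here is a toy: an in-memory sqlite3 database stands in for the warehouse, a hard-coded sanity rule stands in for the model's judgment, and the candidate queries are pre-written rather than generated. Names like `run_query` and `looks_sane` are illustrative, not any real agent API.

```python
import sqlite3

def run_query(conn, sql):
    """The 'run' step: execute SQL against the warehouse, return rows."""
    return conn.execute(sql).fetchall()

def looks_sane(rows):
    """The 'check' step: a trivial stand-in sanity rule that the
    aggregate should exist and be positive."""
    return bool(rows) and rows[0][0] is not None and rows[0][0] > 0

# Toy warehouse: two paid orders and one buggy refund row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (revenue REAL, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(100.0, "paid"), (50.0, "paid"), (-999.0, "refund_bug")])

# The agent's attempts: a wrong first guess (no 'completed' status
# exists, so SUM returns NULL and the check fails), then an adjustment.
attempts = [
    "SELECT SUM(revenue) FROM orders WHERE status = 'completed'",
    "SELECT SUM(revenue) FROM orders WHERE status = 'paid'",
]

result = None
for sql in attempts:
    rows = run_query(conn, sql)
    if looks_sane(rows):  # check the output, otherwise iterate
        result = rows[0][0]
        break

print(result)  # prints 150.0
```

The point isn't the code itself; it's the shape: run, check, adjust, repeat, which is exactly what a one-shot query generator could never do.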

But here's the catch: how good the output is depends almost entirely on what's underneath.


The foundation is human-built

For this whole thing to be solid, you need (almost) every piece of data logic explicitly written in code: dbt models, documented transformations, clean naming conventions. Once the logic is in code, the agent has something reliable to work with. It can reason on top of well-defined building blocks instead of guessing at ambiguous raw tables.

Getting there is the hard part. It requires data people talking to business people to understand what success looks like. It requires talking to product managers (or using the product yourself) to understand features, user flows, and what events matter. And it requires working with engineers to instrument and log the right data points.

All of these conversations eventually translate into raw data in the warehouse, data models built on top of the raw data, and reporting or analysis on top of the models.

The Analytics Stack

Business Context: what does success look like?
Product Understanding: features, user flows, what matters
Engineering Instrumentation: logging the right data points
Raw Data: events, logs, tables in the warehouse
Data Models: dbt, documented transformations, clean naming
Analytics: reports, dashboards, ad-hoc analysis

That's the stack: business context → product understanding → engineering instrumentation → raw data → data models → analytics.


From first principles, what actually requires a human?

This is where it gets interesting.

Engineering and instrumentation: Today, AI agents can read production code. They can understand how data is written, which parts of a codebase need logging, and what schema changes are needed. The gap between "I need an engineer to tell me how this works" and "the agent can just read the code" is closing fast.

Product and user experience: Browser-use agents are real. They can log in, click through flows, and mock the user experience as if they were a human. They don't need someone to walk them through the product. They can see it for themselves.

Business strategy and OKRs: AI already knows the best practices. It knows how to measure business health, what metrics matter for growth vs. retention, and can probably come up with a better measurement framework than 80% of analysts can.

From First Principles: What Still Requires a Human?

Engineering & Instrumentation (AI ~85%): agents read production code, understand schemas, suggest logging
Product & User Experience (AI ~70%): browser-use agents click through flows, mock the user experience
Business Strategy & OKRs (AI ~60%): AI knows best practices, measurement frameworks across industries
Conviction & Influence (deeply human): having a point of view, pushing for it, bringing people along

So from a first-principles perspective, there's little AI fundamentally cannot do. The hard limits are shrinking.


What this means for analysts

I don't know exactly how this plays out. But two things feel clear to me.

Use the tools. Now. The more you use AI in your day-to-day, the more you actually understand what it's good at, where it breaks, and how to get the most out of it. When the day comes that the remaining limits disappear, you want to already be fluent. Not scrambling.

Be more business-focused. We still do business with humans. The ability to have conviction, take a position, argue for it, bring people along, and make decisions under ambiguity: that's still deeply human. AI might know the best practices, but knowing best practices and having the instinct to apply them in your specific context with your specific stakeholders are very different things.


On paper, AI may eventually be better at a lot of what we do. But what it can't easily replicate is having a point of view, the conviction to push for it, and the ability to bring people with you to make that point of view have impact.
