Databricks Lakebase Goes GA: The Neon-Powered Postgres Built for AI Agents

When Databricks announced its $1 billion acquisition of Neon in May 2025, most coverage framed it as another big-cheque AI data deal. A year later, the framing looks small. In February 2026, Databricks shipped Lakebase — a fully managed, Neon-powered serverless Postgres service — into General Availability on AWS, followed by Azure GA at FabCon 2026. Databricks has disclosed that more than 80% of new databases on the platform are now provisioned by AI agents, not humans. If you are building anything where an AI system reads, writes, branches or reasons over structured data, this is the infrastructure story of 2026.

A $1B Acquisition That Became a Product: Lakebase

Databricks closed the acquisition of Neon on the back of a simple observation: its existing Data Intelligence Platform was optimised for analytics, not for the millisecond-scale, transactional, high-concurrency reads and writes that AI agents generate. Lakehouse architectures were built for humans asking one question at a time. Agents don't behave that way. They fire thousands of small queries per second, spin up ephemeral database instances for sandboxed reasoning, and expect sub-10ms latency on operational data.

Neon's architecture — separation of compute and storage, copy-on-write branching, sub-500ms cold-start Postgres instances — was already being abused (in a good way) by AI companies building agent memory layers. Databricks bought the engine and, within nine months, shipped Lakebase: a Postgres-compatible operational database that now sits inside the Databricks Data Intelligence Platform alongside Delta Lake analytics and Unity Catalog governance.

At GA, Lakebase supports Postgres 17 with pgvector, sub-10ms read latency, concurrency of more than 10,000 queries per second, and native integration with Databricks Apps, Jobs and Mosaic AI. Post-integration, storage pricing dropped roughly 80% to $0.35/GB-month, and compute pricing fell 15–25% across tiers. This is not a cautious enterprise rollout. It's priced to win the agent layer.

Why PostgreSQL Is the Substrate for AI Agents

Agents need three things a traditional analytics warehouse cannot give them: transactional writes, durable memory, and cheap forkability.

  • Transactional writes — when an agent books a meeting, places an order, updates a CRM record, or decides on a price, it needs ACID guarantees. Getting that wrong at agentic scale isn't a bug; it's a lawsuit.
  • Durable memory — long-running agents need a place to persist state across sessions, tool calls and hand-offs. Vector stores handle semantic recall; Postgres handles the structured facts (user ID, account status, last order, consent flags) that agents have to be right about.
  • Cheap forkability — agents routinely need isolated environments. 'Fork the production database, let the agent try a plan, discard if it fails' is now a normal pattern. Neon's copy-on-write branching makes this cost negligible; traditional Postgres cloning was minutes and dollars per attempt.
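The first two requirements fit in a few lines. The sketch below uses a hypothetical agent_memory table and a remember() helper (both illustrative names, not a Lakebase API), with SQLite standing in for Postgres so the example is self-contained; the transactional upsert shape (INSERT ... ON CONFLICT ... DO UPDATE) is the same in both engines.

```python
import sqlite3

# SQLite stands in for Postgres so the sketch runs anywhere; the
# ON CONFLICT upsert syntax below is shared by both engines.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE agent_memory (
        user_id        TEXT PRIMARY KEY,
        account_status TEXT NOT NULL,
        last_order     TEXT,
        consent_ok     INTEGER NOT NULL DEFAULT 0
    )
""")

def remember(conn, user_id, status, last_order, consent_ok):
    """Persist an agent's structured facts atomically (ACID write)."""
    with conn:  # commits on success, rolls back on any exception
        conn.execute(
            """
            INSERT INTO agent_memory (user_id, account_status, last_order, consent_ok)
            VALUES (?, ?, ?, ?)
            ON CONFLICT(user_id) DO UPDATE SET
                account_status = excluded.account_status,
                last_order     = excluded.last_order,
                consent_ok     = excluded.consent_ok
            """,
            (user_id, status, last_order, int(consent_ok)),
        )

remember(conn, "u-42", "active", "order-1001", True)
remember(conn, "u-42", "active", "order-1002", True)  # upsert, not a duplicate row
row = conn.execute(
    "SELECT last_order FROM agent_memory WHERE user_id = ?", ("u-42",)
).fetchone()
```

The point is the contract, not the schema: facts the agent must be right about live behind transactional writes, so a failed tool call rolls back rather than leaving half-updated state.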

Postgres also wins on ecosystem. Every LLM in existence has seen millions of tokens of Postgres documentation, SQL examples and schema patterns. Agents write Postgres fluently. They do not write Snowflake's SQL dialect as fluently.

80% of New Databases Are Created by Agents, Not Humans

This is the number that reframes everything. Databricks disclosed at Lakebase GA that the majority of new database instances on the platform are being provisioned by AI agents acting on behalf of humans — not by humans clicking 'create database.' One agent builds the product prototype. Another agent creates a staging branch to test a migration. A third agent spins up a per-user sandbox to try a personalisation strategy, then tears it down.
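The branch-and-discard workflow is easy to picture in miniature. This is not the Neon or Lakebase API — just a toy copy-on-write layer over a dict — but it shows why branching makes sandboxed agent experiments cheap: a branch records only its own writes, and discarding one costs nothing.

```python
from collections import ChainMap

# Toy copy-on-write branch (illustrative sketch, not the Neon/Lakebase API):
# the base state is never copied; a branch is a thin overlay.
production = {"price_plan": "standard", "feature_x": False}

def branch(base):
    """Fork 'base' in O(1): reads fall through, writes land in the overlay."""
    return ChainMap({}, base)

sandbox = branch(production)
sandbox["feature_x"] = True      # agent tries a plan inside the sandbox
sandbox["price_plan"] = "beta"

assert production["feature_x"] is False  # production is untouched

def merge(br):
    """Promote a successful experiment: apply the overlay to the base."""
    br.maps[1].update(br.maps[0])

# Discarding a failed branch is just dropping the overlay:
failed = branch(production)
failed["price_plan"] = "broken"
del failed                       # nothing to clean up

merge(sandbox)                   # the successful plan is promoted
```

The real systems do this at the storage-page level rather than in memory, but the economics are the same: the fork is free until you write, and only the diff is ever paid for.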

If machine-generated infrastructure is already the dominant traffic pattern in 2026, two things follow:

  1. Database pricing and UX have to be redesigned for agents. Billing by the query, sub-second provisioning, and programmatic teardown stop being nice-to-haves. They're the product.
  2. Every application architecture written before 2024 is now legacy. If your stack assumes a human DBA approves schema changes, you will be out-shipped by teams whose agents approve their own.
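As a thumbnail of point 1, this is the shape of an agent-facing database lifecycle: provision, use, tear down, all programmatic and all fast. The client class and its methods below are entirely hypothetical (Lakebase's real API is not shown); what matters is the contract, not the names.

```python
import time
import uuid

class EphemeralDBClient:
    """Hypothetical control-plane client -- illustrative only, not a real API."""

    def __init__(self):
        self._instances = {}

    def provision(self, purpose: str) -> str:
        """Programmatic, sub-second create: returns an instance id."""
        db_id = f"db-{uuid.uuid4().hex[:8]}"
        self._instances[db_id] = {"purpose": purpose, "created": time.time()}
        return db_id

    def teardown(self, db_id: str) -> None:
        """Programmatic destroy -- billing stops here."""
        self._instances.pop(db_id, None)

    def live(self) -> int:
        return len(self._instances)

client = EphemeralDBClient()
db_id = client.provision("per-user personalisation sandbox")
# ... agent runs its experiment against db_id ...
client.teardown(db_id)
```

No ticket, no approval queue, no human in the loop: the whole lifecycle is an API call an agent can make thousands of times a day.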

Agent Bricks, Mosaic AI, and the Full Stack Taking Shape

Lakebase is one layer of a bigger assembly. At Data + AI Summit 2025, Databricks launched Agent Bricks — a declarative way to build production AI agents that auto-generate synthetic evaluation data, auto-benchmark against task-specific LLM judges, and auto-optimise for the cost/quality trade-off the customer picks. By April 2026, the Agent Bricks umbrella had expanded to include visual Classification and Information Extraction agents, multi-agent Supervisor patterns, and tight hooks into MLflow 3.0 (redesigned for GenAI observability) and AI Runtime (managed auto-scaling GPU inference).

Stacked together: Lakebase holds the state, Mosaic AI trains and serves the models, Agent Bricks wraps them into production agents, Unity Catalog governs the access, and Databricks Apps hosts the UI. That's a vertically integrated agent stack, sold as one platform, billed as one platform.

The Competitive Picture: Snowflake Cortex, Google AlloyDB, AWS Aurora DSQL

Databricks is not alone. Snowflake has countered with Cortex Agents and a deepening Postgres story via its Crunchy Data acquisition. Google pushed AlloyDB Omni as a Postgres-compatible option tuned for AI retrieval and vector search. AWS launched Aurora DSQL, a distributed serverless Postgres aimed at the same agent-heavy workloads. Databricks' own Azure GA at FabCon 2026 widens the footprint but does not remove the competition.

The punchline: every major data platform now believes the centre of gravity for AI is operational Postgres, not analytics SQL. That's an enormous validation of the Neon bet — and a warning that Databricks' Lakebase moat is narrower than the press release made it sound. Execution speed and developer experience will decide this, not architecture diagrams.

What This Means for Marketers and Operators

Most readers of this post are not going to write a line of Postgres. Fine. The implication is not technical. It's operational.

If the database layer is being provisioned by agents, the workflow layer sitting on top of it will be too. That means the rest of the stack — content generation, competitor tracking, SEO fixes, outreach, briefs, campaign state — is also on the clock. The teams who win the next 18 months are the ones whose marketing stack behaves the way Lakebase behaves: agentic by default, composable, fast to fork, cheap to run, governed as one thing rather than stitched together from twelve SaaS logins.

That is exactly the gap Anjin was built to fill.

Anjin: The Marketing Operating System for the Agent Era

Anjin is the Marketing Operating System. Not a point tool, not a CRM add-on, not another ChatGPT wrapper. It's the operating layer that sits on top of your marketing — briefs, research, writing, backlink outreach, competitor tracking, SEO fixes, content refresh, brand voice — and runs it the way Lakebase runs databases: through agents, at agent speed.

Databricks built the infrastructure for an agent-native data stack. Anjin builds the agent-native marketing stack that runs on top of it. Both are betting on the same thing: that the next decade of work is not about better dashboards, it's about systems that execute. For founders, agency owners and in-house marketing leaders, the practical question is no longer 'should I use AI in marketing' — it's 'is my marketing stack still organised around humans clicking buttons, or is it organised around agents that ship?'

The £888 Lifetime License — Offer Closing Soon

Lifetime access to Anjin for a one-time payment of £888. Not a subscription. Not a seat. Not a trial. One payment, unlimited use, for as long as Anjin exists.

The average marketing team spends £888 in about three working days on tooling, freelancers and coordination software. You're buying the platform that replaces most of it — once.

This price will not be offered again once we close our early-access cohort.

Claim your £888 Anjin lifetime license →

Founders, agency owners and in-house marketers — this is how you run marketing at AI speed without the team, the burn, or another year of waiting.

Sources: Databricks Lakebase is now Generally Available, What's New in Azure Databricks at FabCon 2026, Databricks Agrees to Acquire Neon, Databricks Launches Agent Bricks, Lakebase: A New Class of Operational Database, InfoQ Lakebase coverage, Futurum Group, Constellation Research, Mosaic AI Announcements at Data + AI Summit 2025
