An a16z Partner Just Mapped the Entire AI Investment Landscape in One Investor Call

Source: a16z | Published: 2026-01-26T12:00:00Z

AI model API costs have plummeted 99% in two years, yet only 40 million out of 2 billion users actually pay — making price discrimination and tiered pricing the biggest wildcard in consumer AI business models.


The six largest publicly traded companies in the US are all tech companies. Seven or eight of the global top ten by market cap are American tech firms. Technology has devoured the entire market — and AI is accelerating the trend.

David George, a growth fund partner at a16z, laid out his panoramic view of AI investing with a set of data points and convictions during an investor briefing.


A $400 Billion Foundation — Built by Someone Else

Annualized capital expenditure from big tech's most recent quarter comes to roughly $400 billion, the vast majority flowing into AI infrastructure and data centers. David believes this number will only grow.

The key is who's footing the bill. The builders are Google, Meta, Amazon, and Microsoft — arguably the most profitable companies in human history. Even if overcapacity emerges, they can absorb it. Every company building applications on top of this infrastructure gets to ride the capex wave without paying for it.


Costs Down 99%, Capabilities Still Doubling

Over the past two years, the cost of calling AI models has dropped by more than 99% — faster than Moore's Law. Meanwhile, frontier model capabilities are roughly doubling every seven months. Collapsing costs plus surging capabilities: this is the best window to build new products.
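A rough comparison makes the "faster than Moore's Law" claim concrete. The sketch below is my back-of-the-envelope arithmetic, not a calculation from the call; the 18-month halving period is the conventional Moore's Law cost framing:

```python
# Illustrative check: a >99% price drop in two years vs. a
# Moore's-Law-style cost halving every ~18 months (assumed figure).

years = 2.0

# AI API pricing: a 99% drop means costs fall to 1/100 of the start.
ai_cost_factor = 0.01                                  # 100x cheaper
ai_annual_decline = 1 - ai_cost_factor ** (1 / years)  # ~90% per year

# Moore's Law framing: cost per unit of compute halves every 1.5 years.
moore_cost_factor = 0.5 ** (years / 1.5)  # ~0.40, i.e. only ~2.5x cheaper

print(f"AI cost after 2 yrs:    {ai_cost_factor:.2%} of start")
print(f"Moore cost after 2 yrs: {moore_cost_factor:.2%} of start")
print(f"AI annual decline:      {ai_annual_decline:.0%}")
```

On these assumptions, API prices fell roughly 40x further over the period than a Moore's Law trajectory would predict.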

The internal view at a16z is that AI will eventually become like electricity or Wi-Fi — you don't visit a friend's house and offer to split a few cents of their power bill. The perceived cost of an AI call will trend toward zero.


Software Captured 1% of GDP. AI Is Targeting 20%

The last cycle of mobile plus cloud computing created roughly $10 trillion in new market value. David argues AI's impact will be far larger, and the logic is simple: US software spending is 1% of GDP, while white-collar wages are 20%. The economic surface area AI can reach dwarfs what software alone could touch.

About 90% of newly created value will flow to end users, with companies capturing only 10%. But David offered two examples to show that even 10% is staggering: the iPhone sells for a thousand dollars, yet consumers' true willingness to pay is far higher — Apple is still an extraordinary business. Google earns only about $200 per user per year, but the value it delivers far exceeds that.


ChatGPT Matched Google's Search Volume in One-Fifth the Time

ChatGPT reached an annualized rate of 365 billion searches within two years; Google needed eleven years to hit the same volume. David sees demand as the critical difference between this wave and the dot-com bubble: AI is built on top of existing internet and cloud infrastructure. There is no new hardware to ship and no networks to lay, so the more than five billion internet users worldwide can access it immediately.

"If you add up all the platforms, more than half the world's internet population has probably used an AI tool by now. Active users are somewhere between 1.5 and 2 billion."

That means the supply-side infrastructure investment is backed by demand signals far clearer and more immediate than anything the early internet had.


2 Billion Users, Only 40 Million Paying

David spent considerable time dissecting AI's consumer business model. About one billion people use ChatGPT, but only 30 to 40 million pay. OpenAI just launched a $3–4/month subscription in India, while demand for its $200–300/month premium tier in the US outstrips what the company can supply.

This hints at something consumer internet has historically struggled to achieve: effective price discrimination. Google and Apple can't easily charge different users different prices, but AI's subscription model naturally enables tiered pricing. Free users may eventually be monetized through something resembling an affiliate model — David concedes the term sounds unglamorous, but believes AI-era commercial recommendations will take this shape.

For context: over the past decade, Facebook and Google increased per-user monetization eightfold. The monetization ceiling for AI companies is likely being systematically underestimated.
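As a quick sanity check (my arithmetic, not a figure from the call), an eightfold increase over ten years implies a compound annual growth rate of roughly 23%:

```python
# What annual growth rate compounds to 8x over 10 years?
multiple = 8.0
years = 10
cagr = multiple ** (1 / years) - 1  # ~0.23, i.e. ~23% per year

print(f"Implied annual growth in per-user revenue: {cagr:.1%}")
```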


Baseball Bats, Deep Research, and Google's Traffic Crisis

David shared a personal anecdote: he needed to buy his son a baseball bat — a task involving a maze of specs and year-over-year value comparisons. He used a deep research product and got results that blew Google search out of the water — no clicking through seven sponsored links, no bouncing between websites.

"I told my son, you have no idea how much research I did. I turned the internet upside down."

The flip side of that experience is already visible in public markets: a wave of listed companies is reporting declines in referral traffic and user engagement, with Groupon among those taking the hit in real time. Fortune 500 companies are broadly grappling with a single question: when consumers can conduct exhaustive purchase research through AI without visiting any website, how do brands reach them?


Energy Is the Bottleneck for the Next Five Years. After That, It's Cooling

The most obvious bottleneck in AI infrastructure today is compute, but David believes chip and infrastructure capacity will naturally scale to meet demand. The real constraint is energy. a16z has invested in nuclear; Three Mile Island may be reactivated; big tech is building data centers near nuclear plants; and natural gas in West Texas is powering large training clusters.

He cited xAI as an example: xAI built what was then the largest data center in a quarter of the industry's standard timeline by buying up all the backup generators across multiple states and poaching workers from other projects. These "unconventional moves" illustrate that construction itself is a massive engineering challenge.

Once energy is solved, the next bottleneck will be cooling — how to keep these chips from, quite literally, boiling the ocean.


Gross Margins Can Be Forgiven. Retention and Acquisition Cannot

There's been plenty of debate around AI company gross margins, particularly the Cursor–Anthropic relationship. David's position: as long as the model layer remains competitive among multiple players, input costs will keep falling, so AI companies deserve more margin leniency than mature SaaS companies. GPT-5's release and Google Gemini's progress in coding are both pressuring Anthropic on pricing.

But if forced to rank three metrics — gross margin, retention, and customer acquisition efficiency — he'd prioritize the latter two. Gross retention above 90% signals real product value. Low acquisition costs with strong organic demand signal the market is pulling the product forward. You can give margins time, but if customers don't love the product, time won't save you.


Medical Scribes Are Sticky. Vibe-Coding Your Own Salesforce Is Not Realistic

Which AI applications have staying power? David's litmus test is integration depth and the accumulation of enterprise-specific rules. Medical scribes are highly sticky because doctors' workflows are already built around them. Customer support systems are sticky because troubleshooting flows and brand voice are deeply embedded. High-end financial analysis, same story.

What's not sticky? Experimental internal tool-building, low-end website prototyping — who wins and where the use-case boundaries lie remain unsettled. As for whether enterprises will vibe-code their own Salesforce replacement, David's answer was blunt: not worth it — it's not a core competency.

"I wish we could vibe-code our way out of our own Salesforce."


The Era of Per-Task Pricing Hasn't Arrived

From perpetual licenses to SaaS per-seat pricing to cloud usage-based billing, the software industry has gone through two business model revolutions. The expected AI-era model is per-task-completion pricing — customer support is the furthest along because "resolving a ticket" is clearly measurable.

But David is cautious. Outside of support, task completion is hard to measure objectively, and customers still default to paying by seat or usage. He drew an analogy to the steam engine: the steam engine wasn't priced based on how many workers it replaced. Once competitive forces kicked in, it settled at a price yielding reasonable returns on capital, with the vast majority of surplus flowing to users. AI will almost certainly follow the same pattern.


High Growth Now Lives Almost Exclusively in Private Markets

In public markets, only about 5% of software and internet companies are expected to grow more than 25% over the next twelve months. High growth has concentrated almost entirely in private markets. The median time from founding to IPO has stretched to fourteen years — and it's still lengthening. Private tech companies valued above $1 billion now total roughly $3.5 trillion in combined valuation, equivalent to 10–12% of the Nasdaq — a decade ago, that figure was just $500 billion.
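The decade-long shift these figures describe can be sanity-checked with quick arithmetic (mine, not David's):

```python
# Cited figures: private unicorn value grew from ~$0.5T to ~$3.5T in a
# decade, and $3.5T is said to be 10-12% of the Nasdaq's market cap.

start, end, years = 0.5, 3.5, 10           # trillions of dollars
cagr = (end / start) ** (1 / years) - 1    # ~0.21, i.e. ~21% per year

# Implied Nasdaq market cap from the 10-12% comparison:
nasdaq_low = end / 0.12                    # ~$29T
nasdaq_high = end / 0.10                   # $35T

print(f"Private unicorn value CAGR: {cagr:.1%}")
print(f"Implied Nasdaq cap: ${nasdaq_low:.0f}T-${nasdaq_high:.0f}T")
```

A roughly 21% annual compounding rate in aggregate private tech value is the quiet assumption behind the "high growth now lives in private markets" thesis.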

David doesn't see this reversing. Private markets have developed mechanisms to cope with extended pre-IPO timelines: more frequent tender offers for employee equity and more flexible secondary trading. He acknowledged that DPI (distributions to paid-in capital) remains an industry-wide challenge, but noted that a16z's growth fund has delivered liquidity through multiple exits.


Disrupting Salesforce Requires Three Conditions Met Simultaneously

Asked which public software companies will be disrupted by AI, David declined to name names but offered a framework: for a startup to beat an incumbent head-on, it needs three things at once — a fundamentally new UI/UX (from "helps you record" to "acts on your behalf"), entirely new data sources (from structured databases to unstructured data), and a disruptive business model.

All three are non-negotiable. a16z hasn't yet found the startup that can truly replace Salesforce, but they've made plenty of investments in the opportunity windows surrounding these systems.
