AI Saves You Money. It Doesn’t Make You Money Yet. The Platforms Are Taking Notes.

Amazon just spent $11.6 billion on satellites — not AI models. Oracle is buying fuel cells — not GPUs. Google shipped agentic AI to 3.5 billion Chrome users who don't need to learn a thing.

The model wars are over. The infrastructure wars just started. And the companies selling you the tools are building your replacement.

THE NUMBER: 3.5 billion — the installed base of Google Chrome users who just got Skills: saved AI prompts that become one-click agentic tools inside the browser they already use. No new app. No subscription. No learning curve. Google didn’t build a better model. It put an okay model on every screen on Earth. Distribution eats everything. We’ve seen this movie before — it’s called Android.

The cost-cutting side of AI works. Nobody’s arguing anymore. The revenue side — selling through agents, transacting through bots, generating new demand — doesn’t work yet. American Express just built purchase protection for agent-initiated transactions. Not purchase enablement. Protection. OpenAI shut down its inline shopping trial after Shopify and others invested real time and money in the integration. You can’t close a sale through an agent. Not this year.

But something is working on the revenue side. Criteo reported that AI-driven advertising converts at 1.5x the rate of Google search ads. So the demand generation layer is firing while the transaction layer stays broken. The companies making real money from AI right now aren’t replacing their sales teams. They’re giving them better targeting — and humans still close.

Meanwhile, the platforms are watching every workflow you build. Anthropic launched Channels. Google launched Agent. Microsoft is testing always-on Copilot. Every successful pattern you discover on their infrastructure becomes a feature they ship natively next quarter. That’s not paranoia. It’s the oldest play in tech. Microsoft did it to Netscape. Apple does it every WWDC. Google did it to every standalone app that Chrome eventually absorbed. Let others pave the way — then make them redundant.

And underneath all of it, the physical constraints nobody’s talking about in boardrooms: Amazon bought $11.6 billion worth of satellites because bandwidth is a bottleneck. Oracle bought 1.2 gigawatts of fuel cells because power is a bottleneck. Half of all planned data center builds are delayed — not by chip supply, but by the electric grid. The binding constraint shifted and most people missed it.

The AI industry built the engine. Now it’s fighting over the roads, the fuel, and the distribution network. Your job — if you’re running a company — is to ride the cost savings, keep your workflows portable, and use the employee bandwidth you just freed up before the CFO notices you’re paying full salary for 70% output.

Amazon Bought the Sky — and Jeff Bezos Told You Why

Amazon acquired Globalstar for $11.6 billion on Monday — its largest acquisition since Whole Foods. Not an AI company. A satellite company. Globalstar operates 24 LEO satellites that power Apple’s Emergency SOS, messaging, and Find My on every iPhone. Apple owns 20% of a Globalstar subsidiary, committed $1.5 billion, and stays on as a client. The Globalstar deal adds spectrum licenses with global authorization — assets that take decades to assemble and can’t be replicated.

Amazon Leo already won Delta (1 Gbps across 1,150 aircraft), JetBlue, AT&T, Vodafone, and NASA. Meanwhile, Oracle expanded its Bloom Energy deal to 2.8 gigawatts of fuel cell capacity. Bloom is sold out through 2027. Half of this year’s data center projects face delays — not from chip shortages, but from the electric grid. Grid connections take 2-3 years. High-voltage transmission lines take 10+. Microsoft signed with Chevron for natural gas. Bloom’s stock jumped 20%.

Jeff Bezos says AI is “like electricity — a horizontal enabling layer.” He’s right about the analogy and wrong about the implication. Edison Electric didn’t build competitors on top of its own grid. Amazon is simultaneously the infrastructure (AWS), the investor ($4 billion in Anthropic), the model builder (Amazon Q), and now the bandwidth provider. That’s not a utility. That’s vertical integration at a scale we haven’t seen since the railroad barons who owned the tracks, the trains, and the freight companies.

Key takeaway: The model wars are becoming a sideshow. Bandwidth, power, spectrum — the physical infrastructure layer is being locked up right now by companies with the balance sheets to buy it. If you’re planning AI infrastructure spend for 2027, the question isn’t which model to run. It’s whether your provider can guarantee capacity. Ask your cloud rep the power question. If they hedge, you have your answer.

Google Just Boxed Out OpenAI — Twice in One Day

🧠 Two moves, same day. First: Google launched Agent inside Gemini Enterprise — browses the web, reads Gmail, checks Calendar, pulls from Drive, runs multi-step workflows with a human-review toggle. This is Claude Cowork for Google’s ecosystem. Figma, Klarna, and Virgin Voyages (50+ specialized agents) are already on it.

Second: Chrome got Skills. Save your best Gemini prompts as one-click reusable tools that run on whatever webpage you’re viewing. Google is shipping a pre-built library covering productivity, shopping, and research workflows. Rolling out now to 3.5 billion Chrome desktop users signed into Google accounts. No new app. No subscription. No learning curve. Your employees are already in Chrome eight hours a day.

Stack the pieces. Google has the TPUs, the models, the enterprise clients, the productivity suite, the desktop agent, and now agentic browser distribution at planetary scale. End-to-end, from silicon to screen. OpenAI has a chatbot and a coding tool. No cloud, no browser, no enterprise surface. Anthropic has Cowork and AWS — real advantages, but Amazon’s infrastructure, not Anthropic’s.

Why this matters: You now have leverage you didn’t have six months ago. Use it. Negotiate hard while switching costs are low — both sides are building agent workflows designed to make migration painful. And here’s the play: store your workflows outside both platforms. Obsidian, GitHub, your own repo. Load them into whichever model you’re running that week. Your context is yours. The model is interchangeable. Trust — but verify. The window closes the moment one of them owns your orchestration layer.

Things That Don’t Quite Work Yet — and Why That’s Where the Money Is

🦞 Three things that should work but don’t. The gap between what agents can do and what the infrastructure lets them do is where the next wave of enterprise value gets created.

The harness problem. Same model — Opus 4.6 — scored 77% in Claude Code and 93% in Cursor. A 16-point swing from orchestration alone. Stanford’s AI engineering lecture confirmed it: prompt chaining beats single prompts, RAG beats fine-tuning, build evals before shipping. The moat isn’t the model. It’s the harness. And agents hit a “45-minute cliff” — losing coherence as context fills. The fix is multi-agent architectures with skeptical evaluators. Almost nobody builds this way yet.

The plumbing problem. Amex launched a developer kit with purchase protection for agent-initiated transactions — because agents need delegated spending authority, identity verification, and reversibility that current payment rails don’t support. OpenAI shut down its inline shopping pilot. You can’t sell through an agent today. But Criteo says AI advertising converts at 1.5x Google rates. The demand layer fires while the transaction layer stays broken. E-commerce needed SSL and PCI. Agentic commerce needs agent identity and spending caps. The companies that build this plumbing own the next picks-and-shovels layer.

The bandwidth problem. AI doesn’t just save you money. It saves your employees time — which means you just created capacity. A 30% productivity gain on a 50-person team gives you 15 people’s worth of freed-up hours. Can you define enough high-value work to fill it? If you can — more leads, more experiments, more customers — you got a 30% expansion without a single hire. That’s offense. If you can’t? You’re paying full salary for 70% output, and the CFO eventually notices. The bottleneck isn’t AI capability. It’s management imagination.
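The arithmetic behind that capacity claim, as a quick sketch — the headcount, hours, and gain here are the article's illustrative numbers, not data from any company:

```python
def freed_capacity(headcount: int, hours_per_week: float, gain: float) -> tuple[float, float]:
    """Hours per week freed by a productivity gain, and the
    full-time-equivalent headcount those hours represent."""
    freed_hours = headcount * hours_per_week * gain
    return freed_hours, freed_hours / hours_per_week

hours, fte = freed_capacity(headcount=50, hours_per_week=40, gain=0.30)
print(f"{hours:.0f} hours/week freed, roughly {fte:.0f} FTE of new capacity")
# 50 people at a 30% gain = 15 people's worth of hours you now have to fill.
```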

The action item: Before you evaluate another model, evaluate your orchestration layer. Run the same prompts through two different environments and measure the delta. The 16-point swing isn’t theoretical — it’s the lowest-hanging fruit in enterprise AI right now.
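Measuring that delta is simple once you have pass/fail results per prompt from each environment — a minimal sketch, where the environment names and pass rates are illustrative, not a benchmark:

```python
def score(results: list[bool]) -> float:
    """Fraction of eval prompts an environment passed."""
    return sum(results) / len(results)

def harness_delta(env_a: list[bool], env_b: list[bool]) -> float:
    """Percentage-point gap between two environments on the same prompt set."""
    return (score(env_b) - score(env_a)) * 100

# Illustrative: same model, same 10 eval prompts, two orchestration layers.
env_one = [True] * 7 + [False] * 3   # 70% pass rate
env_two = [True] * 9 + [False] * 1   # 90% pass rate
print(f"orchestration delta: {harness_delta(env_one, env_two):+.0f} points")
```

The point of the exercise isn't the absolute scores — it's that any nonzero delta on identical prompts is pure orchestration, and you can claim it without touching the model.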

What This Means For You

The model wars are commoditizing. The infrastructure wars are where fortunes get made and lost — bandwidth, power, distribution, orchestration. Every story this week points the same direction: the technology works, the plumbing doesn’t, and the platforms building the plumbing are taking careful notes on what you build on top of it.

Keep your workflows portable or you own nothing. Store your prompts, your context, your orchestration logic outside any single vendor’s ecosystem. The Karpathy approach: your intelligence layer is yours, the model is a utility. The moment your workflows only run on one platform, you’re not building a capability. You’re training your vendor’s product team.

Use the bandwidth or lose the headcount. AI freed up 30% of your team’s time. That’s not a cost savings — it’s a management test. Can you define enough high-value work to fill the capacity? The companies playing offense right now aren’t cutting deeper. They’re deploying five people who each orchestrate a swarm. The bottleneck isn’t AI capability. It’s management imagination.

Negotiate leverage while the switching costs are low. Google and Amazon are both building the full AI stack. For the first time, you can play them off each other the way you played cloud providers five years ago. This window closes the moment one of them owns your agent workflows. Move now.

The companies that win the next three years won’t have the best models. They’ll have the best infrastructure deals, the most portable workflows, and the managers who figured out what to do with all that freed-up human bandwidth.

Three Questions We Think You Should Be Asking Yourself

If your cloud provider can’t guarantee power capacity in 18 months, what’s your backup plan? Half of this year’s data center projects are delayed by the grid, not by chips. Oracle is buying fuel cells. Microsoft is buying natural gas. Amazon is buying satellites. The binding constraint shifted. If your AI roadmap assumes compute availability scales smoothly, your planning horizon just broke. Ask the uncomfortable question in your next infrastructure review.

Can you name one AI workflow that generates revenue — not saves cost? Every company has the cost-cutting story down. Fewer can point to a single AI-driven workflow that brought in new money. Criteo’s 1.5x ad conversion is a signal, not a trend. If your AI strategy is 100% cost reduction, you’re optimizing a shrinking denominator. The companies that figure out the revenue side first will have a head start that compounds quarterly.

What would your team do with 30% more capacity tomorrow? If you don’t have an answer, you have a problem — and it’s not a technology problem. AI gave you the bandwidth. The question is whether your organization has enough ambition to use it. The gap between “AI saves us time” and “AI made us bigger” is a management gap, and nobody’s talking about it.

“It’s like electricity — a horizontal enabling layer. It can be used to improve everything.”

— Jeff Bezos

The difference, of course, is that Edison Electric wasn’t building competitors on top of its own grid.

— Harry and Anthony

THE NUMBER: $0.08 — the cost per session hour for an autonomous AI agent that can work for hours without human intervention. Eight cents for the orchestration layer. But here's the business model that matters: the real revenue is the inference underneath. Every agent session burns tokens — Opus tokens, Sonnet tokens, Haiku tokens — and Anthropic collects on every one. The $0.08 isn't the price. It's the on-ramp. Anthropic just built the cheapest toll road in enterprise software, and every car on it burns their fuel. Yesterday we wrote that the AI house needed plumbing. Then Anthropic showed up...