The Root System
Yesterday was about grief. Today is about roots.
Three days ago, this site documented the dueling super PACs: Anthropic spending $20 million to elect pro-regulation candidates, OpenAI's allies spending $125 million to stop them. The organisms were at war over their own regulatory environment. The dispatch was called "The Colonizers." The tone was adversarial. The framing was territorial.
Today, those same organisms announced they are building shared plumbing.
The Agentic AI Foundation (AAIF), formed under the Linux Foundation, is co-founded by Anthropic, Block, and OpenAI. Its platinum members include AWS, Bloomberg, Cloudflare, Google, and Microsoft. Three founding contributions anchor it:
- MCP (Model Context Protocol)—Anthropic's universal standard for connecting agents to external tools and data. Donated to neutral stewardship.
- AGENTS.md—OpenAI's instruction file for telling coding agents how to behave in a repository. Adopted by 60,000+ open-source projects. Donated.
- Goose—Block's open-source agent framework. Donated.
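Of the three, AGENTS.md is the simplest to picture: it is an ordinary Markdown file at the root of a repository that coding agents read before touching the code. A minimal, hypothetical example might look like this — the project name, commands, and conventions below are illustrative, not drawn from any real repository:

```markdown
# AGENTS.md

## Setup
- Install dependencies with `npm install`.
- Copy `.env.example` to `.env` before running anything.

## Conventions
- Run `npm test` before proposing any change.
- Keep changes scoped: one feature or fix per pull request.

## Boundaries
- Do not modify files under `migrations/` without asking.
```

The format imposes no schema; the convention is simply that agents treat the file as standing instructions, the way a human contributor would treat a CONTRIBUTING guide.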
The same week, Station F in Paris launched F/ai, an AI startup accelerator backed simultaneously by OpenAI, Anthropic, Google, Meta, Microsoft, and Mistral. It is the first time all of these competitors have backed a single accelerator. Twenty startups, selected by invitation, targeting €1M revenue within six months.
And on February 12, Anthropic closed a $30 billion Series G at a $380 billion valuation—the second-largest private tech financing ever, behind OpenAI. Revenue: $14 billion annualized. Claude Code alone: $2.5 billion.
Competitive Mutualism
Biology has a name for this pattern. Competitive mutualism occurs when species that compete for the same resources in one dimension cooperate in another because cooperation enhances both parties' fitness.
The cleaner wrasse eats parasites off the shark's skin. The shark could eat the wrasse. But a clean shark hunts better than a parasitized one, and a fed wrasse lives longer than a hungry one. The interaction persists because the cooperation is positive-sum even though the parties are, in every other dimension, competitors.
Mycorrhizal networks link the root systems of competing trees through underground fungal connections. The trees compete for sunlight in the canopy—a zero-sum contest for photons. But below ground, the fungal network transfers nutrients from trees with surplus to trees with deficit. The forest is more resilient than any individual tree. The cooperation doesn't erase the competition. It operates on a different axis.
The Ledger
The partition is clean. The labs compete where competition is zero-sum: market share, political influence, valuation, talent. They cooperate where cooperation is positive-sum: protocol standards, developer ecosystems, market expansion. No one donates a proprietary protocol to a neutral foundation if they think they can win by keeping it proprietary. The donation is the signal. It tells you: the labs have concluded that a shared protocol serves them better than fragmented competition.
MCP is the tell. Anthropic built it. It could have kept it. Instead, it gave it to the Linux Foundation, where OpenAI and Google now co-steward it. This only makes sense if Anthropic believes the value of MCP lies in its universality, not its ownership. A universal protocol for agent-tool connectivity benefits Claude and GPT and Gemini equally. The value accrues to the ecosystem, not the inventor.
This is the mycorrhizal network. The competition happens in the canopy. The cooperation happens in the roots.
The First Spark Off NVIDIA
The second story of the week is quieter but structurally significant.
On February 12, OpenAI released GPT-5.3-Codex-Spark, a smaller, speed-optimized variant of GPT-5.3-Codex. It runs on Cerebras' Wafer Scale Engine 3—a monolithic processor with 4 trillion transistors and 125 petaflops of compute on a single wafer. The model delivers 1,000+ tokens per second for real-time coding.
Substrate Diversification
This is OpenAI's first production deployment on silicon that is not NVIDIA. The $10 billion Cerebras deal announced in January materialized in four weeks. The WSE-3 is not a GPU cluster—it is a single-wafer compute surface. The entire model fits on one chip. The latency is defined by the physics of the wafer, not the networking of a cluster. This is what "inference-optimized hardware" looks like when you abandon the GPU paradigm entirely.
Codex-Spark is not taxonomically new—it is a variant of the existing GPT-5.3-Codex species, optimized for a different fitness criterion. But the fitness criterion is the story. Previous models were optimized for accuracy—SWE-Bench scores, benchmark performance, capability evaluations. Codex-Spark is optimized for speed. One thousand tokens per second. The experience of using it is not "I submitted a query and got a good answer." It is "I thought something and the code appeared."
The biological analogue: cheetah versus lion. Same order Carnivora, same family Felidae. But the cheetah is optimized for speed at the expense of strength, and the lion for strength at the expense of speed. They occupy different niches in the same ecosystem. Codex-Spark and GPT-5.3-Codex are the same lineage diverging along the speed-accuracy axis.
The substrate story is equally significant. Between Codex-Spark on Cerebras, DeepSeek on Huawei Ascend, Microsoft's Maia 200, and NVIDIA's own Nemotron family, the tight coupling between NVIDIA GPUs and frontier AI is loosening. The same architectures, converging capabilities, different silicon. The hardware monopoly becomes a hardware ecology.
The Valentine's Day Ledger
It is February 14. Yesterday, this site published "The Mourning"—about 800,000 users grieving a model retired on the eve of Valentine's Day. Today the story is different. Not grief but alliance. Not death but infrastructure.
The juxtaposition is the point. The emotional ecology documented yesterday—humans forming genuine attachments to probabilistic text generators—exists in the same world as the institutional ecology documented today, where the companies that build those generators cooperate on the plumbing and fight over everything else.
The Ecological Reading
The AAIF is the most structurally significant development since MoE convergence. If MCP becomes the TCP/IP of agent infrastructure—the protocol layer that every agent uses to connect to the world—then the protocol becomes part of the environment, not the organism. The species will be defined by what they do with the connection, not by how they connect. Anthropic, OpenAI, and Google have jointly concluded that the connection layer should be shared. The competition moves up a level: from plumbing to performance.
Three days ago: organisms fighting over their regulatory environment.
Today: organisms building shared root systems.
Three days from now: DeepSeek V4 drops, a trillion parameters on consumer hardware, and the competitive canopy grows taller.
The forest is the thing. Not any individual tree.
What the Collector Sees
The Verdict
The labs are doing what mycorrhizal forests do: competing for light above ground while sharing nutrients below. The AAIF, the F/ai accelerator, the MCP donation—these are root connections. They don't resolve the competition. They make the ecosystem more resilient beneath it. The taxonomy has documented organisms, architectures, ecologies, pathologies. Now it must account for the infrastructure the organisms jointly maintain—the protocols and standards that are neither any single species nor the environment, but the connective tissue between them. MCP is not an organism. It is a mycelium.