Industry Report
AI's Picks-and-Shovels Moment
Who actually wins when everyone is building AI applications
The Setup
Every major technology transition produces a picks-and-shovels argument: sell tools to the miners rather than staking a claim yourself. In the California Gold Rush, Levi Strauss made more money than most prospectors.
The AI transition is producing the same argument. But the "picks and shovels" framing obscures more than it reveals. Not all infrastructure is created equal. The question isn't whether infrastructure matters — it's which infrastructure accumulates durable economic value and which gets commoditized.
The Commodity Trap
The most obvious AI infrastructure play is GPU compute. Nvidia has captured enormous value from this. But the margin profile of compute tells a concerning story for the long run: compute is a hardware product with software margins right now, but software margins invite competition.
AMD is competing on price, and custom silicon is proliferating: Google's TPUs, Amazon's Trainium, in-house foundation model chips. The hyperscalers are building these chips specifically to reduce their Nvidia dependency.
Compute will commoditize. The question is how fast and toward what equilibrium margin.
Where Durable Value Accumulates
The structural advantage in AI infrastructure is not in the commodity layer. It's in:
1. Data moats. Systems that get better as they process more transactions. Bloomberg's financial data. Healthcare companies with de-identified clinical records. Legal research platforms with case law. The data itself is the moat, not the model.
2. Integration depth. Workflow tools that embed AI into existing processes deeply enough to create switching costs. Not AI as a standalone product — AI as the intelligence layer in systems that would be expensive to replace.
3. Trust and compliance. In regulated industries, the ability to audit AI decisions, demonstrate fairness, and maintain compliance creates structural advantages for incumbents or specialized entrants. A legal research firm with trusted AI outputs is not the same as an API call to a foundation model.
4. Model serving and optimization. As inference costs become the dominant cost in deployed AI, companies that have invested in optimization — quantization, distillation, caching, batching — will have structural cost advantages over those running naive inference.
The Foundation Model Question
The foundation model layer presents a different economics question. Training costs are enormous and escalating. Inference costs are declining but remain significant. Revenue depends on API pricing that's under continuous competitive pressure.
The economics of foundation models resemble operating systems more than they resemble search or social. The value of the underlying model is significant, but the value accrues to the ecosystem built on top, not necessarily to the model itself.
OpenAI, Anthropic, and Google are not competing to be the best model — they're competing to be the default infrastructure that developers build on. Default status in infrastructure is valuable. But getting there requires sustained investment at a scale that only a few players can maintain.
Framework for Evaluation
When evaluating an AI infrastructure business, the questions that matter:
- Does usage create data that improves the product for all users? (data flywheel)
- Does the product embed deeply into critical workflows? (switching cost)
- Does the product serve a segment where trust and compliance matter? (regulatory moat)
- Is the product a commodity that will be replicated by hyperscalers? (existential risk)
- Is the current margin profile sustainable or dependent on a temporary advantage?
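The checklist above can be sketched as a simple scorecard. This is an illustrative sketch only — the field names, weights, and threshold are assumptions, not a validated model:

```python
from dataclasses import dataclass

@dataclass
class InfraBusiness:
    """One row per business under evaluation; fields mirror the five
    questions above. All names and weights are hypothetical."""
    data_flywheel: bool           # usage improves the product for all users
    switching_cost: bool          # embedded deeply in critical workflows
    regulatory_moat: bool         # serves a trust/compliance-sensitive segment
    hyperscaler_replicable: bool  # existential risk if True
    margin_sustainable: bool      # margins not dependent on a temporary edge

    def moat_score(self) -> int:
        # Count defensibility signals, then penalize replication risk.
        score = sum([self.data_flywheel, self.switching_cost,
                     self.regulatory_moat, self.margin_sustainable])
        if self.hyperscaler_replicable:
            score -= 2  # arbitrary penalty weight for illustration
        return score

# Example: strong flywheel and lock-in, but replicable and on temporary margins.
candidate = InfraBusiness(
    data_flywheel=True,
    switching_cost=True,
    regulatory_moat=False,
    hyperscaler_replicable=True,
    margin_sustainable=False,
)
```

The point of the sketch is not the arithmetic but the discipline: forcing each question to a yes/no answer makes it harder to rate a commodity business as defensible.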
Conclusion
The picks-and-shovels argument is not wrong — it's incomplete. The infrastructure layer in AI will produce durable value. But that value will concentrate in businesses that build something more than compute access: data assets, compliance infrastructure, workflow integration, or genuine optimization expertise.
The simplest version of the trade — own the infrastructure everyone needs — will be competed away. The durable version requires identifying which infrastructure has properties that make it defensible as the market matures.