In 1833, Benjamin Day had a crazy idea. What if newspapers didn't have to cost six cents? What if you could sell them for just one penny and make up the difference somewhere else?

This sounds obvious now, but at the time it was radical. Newspapers were expensive because they were written for expensive people—merchants, politicians, the educated elite who could afford six cents and cared about shipping reports and political speeches. Day's New York Sun aimed lower. It featured crime stories, human interest pieces, and local gossip. It was sensational, accessible, and cheap.

The penny press, as historians call it, worked because Day figured out something that seems simple in retrospect: if you make your product accessible to everyone instead of just the wealthy, you can make money by selling something else entirely. In this case, attention. Advertisers would pay to reach all those penny-spending readers who couldn't afford the six-cent papers but might buy soap or patent medicine.

For the next decade, this model dominated American journalism. Newspapers multiplied, circulation exploded, and a genuinely democratic press emerged. Publishers competed to create the most engaging content because engagement translated directly to revenue. The better your stories, the more readers you attracted. The more readers you attracted, the more advertisers would pay.

Then, in the 1840s, Samuel Morse's telegraph changed everything.

The telegraph didn't kill newspapers. But it completely restructured how information moved and who controlled it. Suddenly, speed mattered more than storytelling. A detailed account of a Congressional debate that took hours to write could be beaten by a telegram that said "BILL PASSED 47-23 STOP." The Associated Press, founded in 1846, recognized this shift immediately. Instead of competing to write the best stories, they competed to control the wires.

Telegraph-era journalism compressed information ruthlessly. Complex political narratives became vote tallies. Elaborate descriptions of social events became bare-bones summaries. The human craft of storytelling became secondary to the mechanical efficiency of transmission. The penny press didn't disappear, but it was no longer the center of the information ecosystem.

I think we're living through the same kind of transition right now, except instead of the telegraph disrupting newspapers, AI agents are disrupting the web. Reading Ben Thompson this week, I've been trying to work out how I feel about the future of the web, the internet economy, and the professional and personal world I've been living in for the past decade. And I wanted to take a moment to unravel it all.

The Web Was Always the Penny Press

Think about how the modern internet works. Most content is free because it's funded by advertising. Publishers compete for human attention because human attention translates directly to revenue. The entire ecosystem—from search engine optimization to social media algorithms to content management systems—optimizes for human engagement.

This model assumes a human reader sitting at a screen, clicking through pages, spending time with content, maybe clicking on an ad. Every design choice serves this human user: readable fonts, engaging headlines, intuitive navigation, fast loading times. The economic loop is elegant: compelling content attracts humans, humans attract advertisers, advertising revenue funds more compelling content.

Google became the Associated Press of this system by controlling discovery rather than creation. PageRank organized the chaos of web content and directed human attention where it needed to go. Instead of owning telegraph lines, Google owned the algorithm that decided which information humans would see first.

But AI agents are a new "user" type: they don't browse, they don't click, and they definitely don't look at ads.

How AI Agents Consume Information

When ChatGPT answers a question about restaurants in Seattle, it doesn't visit Yelp, scroll through reviews, compare star ratings, or click on restaurant websites. It accesses some internal representation of restaurant information—maybe derived from web content, but completely divorced from the web's user interface.

When an AI agent needs weather information, it doesn't load Weather.com with its ads and newsletter signup prompts and social media widgets. It queries an API or accesses structured data that contains just the meteorological facts it needs.
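To make that concrete, here's a minimal sketch of what agent-side consumption looks like in code. The endpoint, parameters, and response fields are hypothetical stand-ins, not a real weather API; the point is the shape of the interaction, not the specific service.

```python
# Minimal sketch of agent-side data access. The endpoint, parameters, and
# response fields below are hypothetical stand-ins, not a real weather API.
import requests

def get_weather_facts(city: str) -> dict:
    """Fetch just the meteorological facts: no layout, no ads, no widgets."""
    resp = requests.get(
        "https://api.example-weather.test/v1/current",  # hypothetical endpoint
        params={"city": city, "units": "metric"},
        headers={"Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    # The agent keeps only the fields it needs and discards everything else.
    return {
        "city": city,
        "temp_c": data.get("temp_c"),
        "conditions": data.get("conditions"),
        "wind_kph": data.get("wind_kph"),
    }
```

There is no page to render and nothing to look at, just fields.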

This is like the telegraph all over again. The telegraph stripped away everything that made penny press journalism engaging—the narrative flourishes, the local color, the human interest angles—and transmitted only essential information. AI agents do the same thing to web content. They extract the information and discard everything designed for human engagement.

The carefully crafted prose, the engaging headlines, the user experience design—none of it matters when your reader is a machine. An AI agent gets the same value from a beautifully designed restaurant website as it gets from a spreadsheet of restaurant data. Actually, it probably prefers the spreadsheet.

The Economic Problem

This creates an obvious problem. If AI agents become the primary consumers of web content, the entire advertising-supported model collapses. Why would anyone pay for ads that no one sees? Why would publishers invest in creating human-engaging content if their audience is increasingly non-human?

We're already seeing early signs of this tension. The New York Times is blocking AI crawlers. Other publishers are demanding payment from AI companies for training data. Some content creators are explicitly optimizing for AI consumption rather than human readership.

But these are temporary patches on a fundamentally broken system. You can't maintain an economy built around human attention when most of your consumers aren't human.

The Protocol Economy

AI agents prefer structured data to unstructured content. They work better with APIs than web scraping. They can handle machine-readable formats that would be completely unusable for humans.

This creates opportunities for new types of businesses built around machine consumption rather than human engagement. Instead of creating websites optimized for human visitors, you create databases optimized for AI queries. Instead of monetizing through advertising, you charge directly for API access.

Cryptocurrency suddenly makes sense in this context. Managing wallets and processing microtransactions is annoying for humans but trivial for AI systems. Agent-to-agent payments could enable entirely new economic models based on direct value exchange rather than attention arbitrage.

The restaurant review website becomes less valuable than the restaurant database. The travel blog becomes less valuable than the travel API. The news website becomes less valuable than the structured news feed.
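To make the "restaurant database" idea concrete, here's a minimal sketch of a machine-facing endpoint using Flask. The route, fields, and sample records are illustrative assumptions, not a real service.

```python
# Minimal sketch of a machine-facing "restaurant database" using Flask.
# The route, fields, and sample records are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

# A real service would sit on a proper database; a list stands in here.
RESTAURANTS = [
    {"id": 1, "name": "Sample Oyster Bar", "city": "Seattle",
     "cuisine": "seafood", "rating": 4.6, "price_tier": 3},
    {"id": 2, "name": "Sample Noodle House", "city": "Seattle",
     "cuisine": "noodles", "rating": 4.4, "price_tier": 1},
]

@app.get("/v1/restaurants")
def list_restaurants():
    # Agents filter with query parameters instead of scrolling and clicking.
    city = request.args.get("city")
    cuisine = request.args.get("cuisine")
    results = [
        r for r in RESTAURANTS
        if (not city or r["city"].lower() == city.lower())
        and (not cuisine or r["cuisine"] == cuisine)
    ]
    return jsonify({"count": len(results), "results": results})
```

There's no design, no reviews to skim, no ad slots. The whole product is the data and the query interface, which is exactly what an agent wants and exactly what advertising can't fund.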

New Aggregators

Ben Thompson's aggregation theory explained how internet platforms gained power by controlling the relationship between suppliers and users. Google aggregated websites and users. Facebook aggregated content creators and audiences. Amazon aggregated merchants and shoppers.

But as Thompson has already pointed out, AI agents operate differently. They don't browse through options—they execute specific queries. They don't have preferences in the human sense—they optimize for defined parameters.

This moves aggregation power toward whoever controls the infrastructure that AI agents depend on. The training data. The computational resources. The API standards. The model architectures.

OpenAI's rumored hardware ambitions make perfect sense from this perspective. Controlling the physical interface through which AI agents interact with the world provides leverage over the entire ecosystem. Google's AI integration into search is a defensive move to maintain aggregation power as information consumption patterns change.

The new aggregators won't be the companies that best serve human attention spans. They'll be the companies that control the protocols and infrastructure that AI agents use to access information.

What This Means Strategically

Publishers face a choice: optimize for humans or optimize for machines. These are increasingly incompatible strategies. Content that performs well in human-engagement metrics (time on page, social shares, return visits) may be poorly structured for AI consumption. Content optimized for AI extraction may be boring for human readers.

Google's AI-enhanced search results create a particular tension. If AI-generated summaries satisfy user queries without driving traffic to original sources, they undermine the content ecosystem that makes those summaries possible. It's like the Associated Press summarizing newspaper articles so effectively that nobody reads the original newspapers.

Platform companies need to decide whether they're building for human users or AI agents. These require different interfaces, different monetization models, and different success metrics. Companies that try to serve both may find themselves optimizing for neither.

The biggest strategic opportunity belongs to companies building infrastructure specifically for the AI economy. Not websites that AI agents can scrape, but protocols that AI agents can interact with natively. Not content optimized for human engagement, but data optimized for machine consumption.

The Infrastructure Web

The web isn't dying. It's becoming infrastructure.

Newspapers didn't disappear when the telegraph arrived; they just became less central to how information moved through society. The human-facing web will persist too, but the economic and structural center of gravity is shifting toward systems designed for machine consumption.

This new infrastructure web won't be as colorful or engaging as the human web. It will be more like plumbing—essential, efficient, and largely invisible. Success will depend on reliability, speed, and interoperability rather than creativity, engagement, and virality.

We're moving from an internet designed to capture human attention to an internet designed to feed machine intelligence. The companies that recognize this shift early and position themselves accordingly will build the foundational infrastructure of the AI economy.

What’s at stake now is the kind of infrastructure web we build—one that supports human flourishing or one that prioritizes machine efficiency above all else. The telegraph era of journalism wasn't necessarily better or worse than the penny press era, but it was fundamentally different.

The same will be true of the infrastructure web. It won't be better or worse than the human web that preceded it. But it will be optimized for entirely different purposes, serving entirely different users, operating according to entirely different economic principles.

And most of it will be invisible to the humans whose world it increasingly shapes.


What I'd Build

Agent-Native Paywall and Metering Infrastructure

So that's my take on things. But practically speaking, what can you do with it? Here's my thinking. Publishers like The New York Times or The Economist have spent decades building valuable content, but when GPT or Claude wants to access that information to answer someone’s question, there’s no good way to pay for it. It’s like having a world-class library with no checkout system.

Imagine building the infrastructure that solves this. Content providers could expose their information through APIs that AI agents can actually use—think structured JSON endpoints or rich Markdown with metadata attached. The key part: every time an agent accesses that content, the publisher gets paid.

You’d handle all the messy stuff in the middle. Micro-payments that could be a few cents per API call, whether that’s through traditional payment rails or crypto microtransactions. Usage tracking so publishers can see exactly how their content is being used. Different pricing tiers and access levels depending on what kind of agent is asking and what they’re doing with the information.

It’s Stripe, but instead of processing payments for e-commerce, you’re creating the payment infrastructure for the AI economy. Every time an AI agent needs to quote a news article, access a research database, or pull information from a premium source, your system makes sure everyone gets paid fairly.
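To sketch what that middle layer might look like, here's a toy version of per-call metering. The tiers, prices, API keys, and in-memory ledger are all assumptions for illustration; a real system would need durable storage and an actual payment rail, whether traditional or crypto.

```python
# Toy sketch of agent-native metering on top of Flask. Tiers, prices, API
# keys, and the in-memory ledger are illustrative assumptions; a real
# system would use durable storage and an actual payment provider.
from dataclasses import dataclass
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Price per call in USD cents, by access tier (assumed values).
PRICE_CENTS = {"basic": 1, "premium": 5}

@dataclass
class Account:
    tier: str
    calls: int = 0
    owed_cents: int = 0

# API keys issued to agent operators (illustrative).
ACCOUNTS = {"agent-key-123": Account(tier="basic")}

def meter(api_key: str) -> Account:
    """Authenticate the calling agent and record one billable call."""
    account = ACCOUNTS.get(api_key)
    if account is None:
        abort(401, description="unknown API key")
    account.calls += 1
    account.owed_cents += PRICE_CENTS[account.tier]
    return account

@app.get("/v1/articles/<int:article_id>")
def get_article(article_id: int):
    account = meter(request.headers.get("X-API-Key", ""))
    # Structured content for the agent, plus billing metadata it can track.
    return jsonify({
        "article_id": article_id,
        "body": "structured article text would go here",
        "billing": {"tier": account.tier,
                    "calls": account.calls,
                    "owed_cents": account.owed_cents},
    })
```

The interesting design questions live around this core: how usage gets reported back to publishers, how tiers map to what an agent is allowed to do with the content, and whether settlement happens over traditional rails or the agent-to-agent micropayments mentioned earlier.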

The timing feels right too—we’re at this inflection point where AI agents are becoming capable enough to be truly useful, but the content ecosystem hasn’t figured out how to work with them yet.

Again, read the Thompson piece linked above — he goes into more detail about stablecoins, systems, etc. 
