a16z: AI can't escape advertising either; behind it lies enormous monetization pressure

Author: Bryan Kim

Translation: Deep潮 TechFlow

Deep潮 Introduction: The internet is a universal gateway to opportunity, exploration, and connection, and advertising pays for that miracle. As a16z partner Bryan Kim points out, OpenAI's announcement last month that it will introduce ads for free users may be the biggest piece of "non-news" of 2026 so far.

Because if you’ve been paying attention, signs of this happening are everywhere. Advertising is the best way to bring internet services to as many consumers as possible.

Survey data show that conversion rates at consumer AI subscription companies are low (5-10%). Most people use AI for personal productivity tasks (email, information search) rather than high-value work (like programming). Of 800 million weekly active users (WAU), 5-10% pay, i.e., 40-80 million people; but to reach a billion users, advertising is necessary.

Full text below:

The internet is a universal gateway to opportunity, exploration, and connection. And advertising pays for this miracle. As Marc has long argued, “If you take a principled stance on advertising, you are also taking a stance on broad access.” Advertising is the reason we have wonderful things.

Therefore, OpenAI's announcement last month that it will introduce ads for free users may be the biggest piece of "non-news" of 2026 (so far). Of course, if you've been paying attention, the signs have been everywhere. Fidji Simo joined OpenAI as CEO of Applications in 2025, a move many interpreted as "bringing in ads, just as she did at Facebook and Instacart." Sam Altman has been previewing an ad rollout on business podcasts. Tech analysts like Ben Thompson have been predicting ads almost since ChatGPT launched.

But the main reason ads aren’t surprising is that they are the best way to bring internet services to as many consumers as possible.

Long tail of LLM users

"Luxury beliefs," a term that gained popularity a few years ago, refers to taking a stance not out of genuine principle but for appearances. Tech has many such examples, especially around advertising. Despite all the moral hand-wringing over "selling data!" or "tracking!" or "attention harvesting," the internet has run on ads, and most people like it that way. Internet advertising has created one of the greatest "public goods" in history at minimal cost to users: the occasional ad for cat sleeping bags or a hydroponic living-room garden. People who pretend this is a bad deal are usually trying to prove something to you.

Any student of internet history knows that advertising is the core of platform monetization: Google, Facebook, Instagram, and TikTok all started free and found monetization through targeted ads. Ads can also supplement revenue from low-ARPU subscribers, as with Netflix's newer ad-supported tier at roughly $8/month. And ads have been very effective at training people to expect most things online to be free or very cheap.

This pattern is now appearing at cutting-edge labs, specialized model companies, and smaller consumer AI firms alike. Our survey of consumer AI subscription companies shows that converting free users to paid is a real challenge: conversion rates typically sit in the 5-10% range.

So what’s the solution? As we know from past consumer success stories, ads are often the best way to scale services to billions of users.

To understand why most people don’t pay for AI subscriptions, it helps to understand what people use AI for. Last year, OpenAI released data on this.

In short, most people use AI for personal productivity: writing emails, searching for information, tutoring, or advice. Meanwhile, high-value pursuits like programming account for a small fraction of total queries. Rumor has it that programmers are among the most loyal LLM users, with some even adjusting their sleep schedules to optimize daily usage limits. For these users, a $20 or $200 monthly subscription doesn’t seem too high, as they derive value comparable to a team of highly efficient SWE interns—potentially worth several orders of magnitude more than the subscription cost.

But for users doing general queries, asking for suggestions, or getting writing help, paying is a harder sell. Why pay for answers to questions like "Why is the sky blue?" or "What caused the Peloponnesian War?" when a Google search already provided a good-enough answer for free? Even for writing assistance (some people do use it for email and routine tasks), it often doesn't cover enough of their work to justify a personal subscription. And most people don't need advanced models and features: you don't need the best reasoning model to write emails or suggest recipes.

Let's step back and acknowledge some facts. The absolute number of people paying for products like ChatGPT is still huge: 5-10% of 800 million WAU is 40-80 million people! Notably, the $200 Pro price point is about ten times what we'd normally consider the ceiling for consumer software subscriptions. But if you want to bring ChatGPT to a billion or more users for free, you need monetization products beyond subscriptions.

The good news is that people actually like ads! Ask an average Instagram user and they might tell you the ads they see are genuinely useful: they discover products they truly want and need, and make purchases that improve their lives. Framing ads as exploitative or intrusive is a throwback: maybe we feel that way about TV ads, but most targeted advertising today is actually quite good content.

Here I use OpenAI as an example (because they’ve been one of the most transparent labs about usage trends). But this logic applies to all cutting-edge labs: if they want to scale to billions of users, they will eventually need to introduce some form of advertising. Consumer monetization models in AI are still unresolved. In the next section, I’ll discuss some approaches.

Possible AI monetization models

My general rule of thumb in consumer app development is that you need at least 10 million WAU before introducing ads. Many AI labs have already reached this threshold.

We already know ad units are coming to ChatGPT. What might they look like, and what other ad and monetization models are feasible for LLMs?

  1. Higher-value search and intent-based ads: OpenAI has confirmed that such ads (recipe ingredients, hotel recommendations, etc.) will soon be shown to free and lower-tier users. These ads will be kept separate from ChatGPT's answers and clearly marked as sponsored.

Over time, ads may feel more like prompts: you signal an intent to buy something, and the agent handles the request end to end, choosing among sponsored and non-sponsored options. In many ways, these ads resemble the earliest ad units of the '90s and 2000s, as well as Google's sponsored search ads, which still generate most of Google's revenue more than 15 years on; Google only moved into subscriptions much later.

  2. Contextual ads in the style of Instagram: Ben Thompson has argued that OpenAI should have introduced ads into ChatGPT responses earlier. First, it would have gotten non-paying users accustomed to ads sooner (back when ChatGPT still had a real capability edge over Gemini).

Second, it would have positioned them to build truly excellent ad products that predict what you want, rather than opportunistically offering ads based on intent queries. Instagram and TikTok can deliver amazing ad experiences, showing you products you didn’t know you wanted but need immediately—many find ads useful rather than intrusive.

Given OpenAI's access to personal data and memory, there's ample opportunity to build a similar ad product for ChatGPT. Of course, the user experience differs: can you translate the "lean-back" ad experience of Instagram or TikTok into ChatGPT's more actively engaged, intent-driven mode of use? That's a much harder question, but a much more lucrative one.

  3. Affiliate commerce: Last year, OpenAI announced partnerships with marketplace platforms and individual retailers to enable instant checkout, letting users purchase directly inside the chat. You can imagine this being built into dedicated shopping verticals, where agents proactively find the clothing, home goods, or rare items you're tracking, with the model provider taking a share of the revenue.

  4. Gaming: Games are an often-overlooked ad category in their own right, and it's unclear how they fit into ChatGPT's ad strategy, but they're worth mentioning. App-install ads (many of them for mobile games) have been a major driver of Facebook's growth for years, and gaming is profitable enough that it's easy to imagine large ad budgets landing here.

  5. Targeted bidding: This one is for fans of auction algorithms (or anyone who wants to port blockchain gas-fee optimization over to LLMs). What if you could post a bounty on specific queries (e.g., $10 for a Noe Valley real-estate alert) and have the model invest significantly more compute on those results? You would get perfect price discrimination based on the perceived value of each query, plus stronger guarantees, with full reasoning chains, for the searches that matter.

Poke is one of the best examples here: users explicitly negotiate their subscription with the chatbot (it doesn't map directly onto compute costs, but it's an interesting illustration of what this could look like). In some ways, a version of this already exists: Cursor and ChatGPT use routers that select a model based on query complexity. But even picking a model from a dropdown doesn't let you specify how much underlying compute to spend on a question. For highly engaged users, being able to say in dollars how much a question is worth could be very attractive; a rough sketch of what such a bid-aware router might look like follows below.
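To make the mechanism concrete, here is a minimal, purely hypothetical sketch in Python of how a bounty might be mapped to a compute budget. The tier names, per-query costs, and margin target are invented for illustration and do not describe how ChatGPT, Cursor, or any real router actually works.

```python
# Hypothetical bid-aware query router. All tiers, costs, and thresholds
# are invented for illustration; they do not describe any real product.

from dataclasses import dataclass

@dataclass
class ComputeTier:
    name: str               # illustrative label
    cost_per_query: float   # assumed provider-side cost in dollars
    reasoning_tokens: int   # assumed reasoning-token budget

# Invented menu of tiers, ordered from cheapest to most expensive.
TIERS = [
    ComputeTier("fast", 0.002, 0),
    ComputeTier("standard", 0.02, 2_000),
    ComputeTier("deep-reasoning", 0.50, 50_000),
]

def route(bounty_dollars: float, margin: float = 0.5) -> ComputeTier:
    """Pick the most capable tier whose assumed cost still leaves `margin`
    of the bounty as gross profit; fall back to the cheapest tier."""
    affordable = [t for t in TIERS if t.cost_per_query <= bounty_dollars * (1 - margin)]
    return affordable[-1] if affordable else TIERS[0]

# A $10 bounty (the Noe Valley real-estate alert) buys deep reasoning;
# a free query gets the cheapest tier.
print(route(10.0).name)  # deep-reasoning
print(route(0.0).name)   # fast
```

The point of the sketch is the pricing logic, not the numbers: the bounty sets an explicit ceiling on how much compute the provider can profitably spend on a single answer.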

  6. AI entertainment and companionship subscriptions: AI users show two main willingness-to-pay use cases: coding and companionship. CharacterAI has one of the highest WAU counts of any non-lab AI company, and it can charge $9.99/month for a service that mixes companionship and entertainment. But even though people do pay for companionship apps, we haven't yet seen these products cross the threshold where ads become a reliable monetization path.

  7. Token-based pricing: In AI creative tools and coding, token-based (usage-based) pricing is also common. It's an attractive mechanism for companies with heavy users, letting them segment customers and charge more as usage grows; a rough example of the billing math follows below.
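As a purely illustrative example of how usage-based billing separates light users from power users, consider the math below; the per-million-token rates are invented and do not reflect any vendor's actual price list.

```python
# Illustrative usage-based billing math; the rates are invented and do not
# correspond to any real provider's pricing.

RATE_PER_M_INPUT = 3.00    # assumed dollars per 1M input tokens
RATE_PER_M_OUTPUT = 15.00  # assumed dollars per 1M output tokens

def monthly_bill(input_tokens: int, output_tokens: int) -> float:
    """The bill scales directly with usage, so power users pay far more."""
    return (input_tokens / 1e6) * RATE_PER_M_INPUT + (output_tokens / 1e6) * RATE_PER_M_OUTPUT

# A casual user versus a heavy coding user, under these assumed rates:
print(f"casual: ${monthly_bill(200_000, 50_000):.2f}")        # $1.35
print(f"power:  ${monthly_bill(30_000_000, 8_000_000):.2f}")  # $210.00
```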

Monetization in AI remains an unsolved problem; most users still enjoy their preferred LLM at the free tier. But this is only temporary: internet history shows that ads will find a way.
