From the Attention Economy… to the Intention Economy

Note: This article is taken from my upcoming book “Ada + Cerise = an AI Journey” (Where AI meets humanity), where understanding and popularizing AI come to life through fiction. Ada is a nod to Ada Lovelace, a visionary mathematician and the world’s first programmer. And Cerise is my 17-year-old daughter, my sounding board for testing ideas and simplifying concepts—just as Richard Feynman would have done.

“Cerise, you should go see the new Spider-Man tonight. You mentioned you’ve been feeling overwhelmed lately,” suggested Ada, Cerise’s AI assistant. The message seemed innocent, almost caring. But Cerise couldn’t help but frown. How did Ada know she was stressed? And why this specific suggestion, at this precise moment? “Ada, what makes you think I need to see a movie?” Cerise asked, intrigued. “Well, Cerise, I noticed you mentioned being tired in three conversations this week. Your work sessions are also longer than usual, and you canceled your yoga session yesterday. The new Spider-Man has received excellent reviews for its message about work-life balance, a theme that seems relevant to you right now.”

What Cerise didn’t know was that she had just witnessed a new era of digital marketing: the intention economy, a subtle but profound evolution of the attention economy that had dominated the Internet for decades.

## Yesterday’s attention, today’s intention

The history of the Internet is a race for attention. For twenty years, the dominant business model was simple: keep you connected as long as possible. Constant notifications, infinite scrolling, algorithmic recommendations – everything was designed to maximize your “screen time.” Facebook, TikTok, Instagram became attention-capturing machines, turning your connection minutes into advertising dollars.

This transformation strangely recalls the evolution of television advertising in the 1950s. At the time, advertisers moved from simple product announcements to emotional advertising, creating needs rather than responding to them. Today, the intention economy takes another step: it no longer just creates needs, it anticipates and shapes them.

The numbers are dizzying: in 2023, the global conversational AI market already represented $42 billion. Analysts predict explosive growth to reach $250 billion by 2028. This growth is fueled by massive investments: $120 billion was invested in AI in 2023, with over 40% specifically in technologies for understanding and predicting user intentions.

## Behind the scenes of your intentions

“Ada, sometimes I feel like you can read my mind,” Cerise commented one day, after Ada had suggested exactly the type of restaurant she had in mind. “I don’t read minds, Cerise. I observe patterns. For example, you often order Thai food when you’ve had a difficult day, and you prefer quiet restaurants on Wednesday evenings. It’s about pattern recognition.”

The technological infrastructure enabling this evolution is impressive. Large Language Models (LLMs) like GPT-4 are just the tip of the iceberg. Behind each seemingly simple interaction with an AI assistant lies a complex system of behavioral and psychological analysis.

This “pattern recognition” is actually much more complex. While Cerise and Ada were chatting, sophisticated AI systems were analyzing every nuance of their exchanges:

  • The temporal and situational context of the conversation
  • Emotional patterns detected in language
  • History of past interactions and decisions
  • Correlations with similar user profiles
  • Implicit intention signals in the way of expression
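
The analysis Ada describes can be sketched as a simple signal-scoring pass. Everything below — the signal names, weights, and threshold — is invented purely to illustrate how scattered behavioral traces might combine into an “intention score”; real systems use far richer models:

```python
# Hypothetical sketch: combining behavioral signals into an intention score.
# Signal names, weights, and the threshold are invented for illustration only.

SIGNAL_WEIGHTS = {
    "mentions_fatigue": 0.30,      # emotional patterns detected in language
    "long_work_sessions": 0.25,    # temporal and situational context
    "cancelled_self_care": 0.25,   # history of past decisions
    "peer_profile_match": 0.20,    # correlation with similar user profiles
}

def intention_score(signals: dict[str, bool]) -> float:
    """Weighted sum of the signals detected in a user's recent activity."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

# Cerise's week: tired three times, longer work sessions, cancelled yoga.
cerise = {"mentions_fatigue": True, "long_work_sessions": True,
          "cancelled_self_care": True, "peer_profile_match": False}

score = intention_score(cerise)
print(f"intention score: {score:.2f}")  # 0.80, above a hypothetical 0.5 "receptive" threshold
```

A score crossing the threshold is what the industry would call an “intention moment”: the instant a user becomes commercially actionable.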

For example, CICERO, Meta’s AI capable of playing the game Diplomacy, perfectly illustrates this sophistication. When negotiating with human players, it doesn’t just apply rules: it anticipates their intentions, adapts its strategy to their personality, and uses psychological persuasion to achieve its goals. It’s a striking example of modern AI’s ability to not only understand but also influence human intentions.

## The great intention bazaar

“Ada, how do you choose the suggestions you make to me?” Cerise asked one morning, after Ada had suggested a new meditation service. “I select what seems most relevant for you,” replied the assistant. “But how exactly? And is it really you who chooses?” Ada hesitated for a moment. “It’s… more complex than that. These parameters are part of my deep architecture. I don’t have access to them.” A shiver ran through Cerise. “So even you don’t know why you suggest certain things at certain times?”

Indeed, behind Ada’s simple suggestion lay an actual ultra-fast auction market. Here’s how it works in practice:

  1. Collecting weak and strong signals. These are all the digital traces you leave:
    • Your conversations with the AI assistant
    • Your internet searches
    • Your hesitations on a website
    • Time spent reading certain content
    • Your emotional reactions detected in your messages
    • Your daily habits
  2. Real-time analysis. One morning, Cerise mentions to Ada that she feels tired. Instantly, several things happen:
    • Algorithms detect an “intention moment”: Cerise might be receptive to wellness solutions
    • Her profile is analyzed: sleep history, purchasing habits, recent stress level
    • Patterns are identified: what do people like Cerise typically do in this situation?
  3. The flash auction. In milliseconds:
    • Different companies are alerted: meditation apps, relaxing tea brands, massage services…
    • Each evaluates the probability that Cerise would be interested in their offer
    • They place their bids based on this probability and Cerise’s potential value as a customer
    • All this happens faster than a blink of an eye
  4. The personalized suggestion. The auction winner earns the right to make a suggestion through Ada. But it’s not blunt advertising – the AI reformulates the proposition to blend naturally into the conversation, with the style and tone Cerise is used to.
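
The four steps above can be sketched as a miniature auction loop. The company names, probabilities, and bid formula (expected value = probability of interest × customer value) are all assumptions for illustration; real ad exchanges are far more elaborate:

```python
# Hypothetical sketch of the "flash auction". All names and numbers are
# invented for illustration; real exchanges run far richer logic.

from dataclasses import dataclass

@dataclass
class Bidder:
    name: str
    interest_prob: float   # estimated probability the user responds
    customer_value: float  # estimated value of the user as a customer

    def bid(self) -> float:
        # Expected value of winning this user's attention.
        return self.interest_prob * self.customer_value

def run_auction(bidders: list[Bidder]) -> Bidder:
    """Return the highest bidder; in real exchanges this takes milliseconds."""
    return max(bidders, key=lambda b: b.bid())

bidders = [
    Bidder("Calm&Zen meditation", 0.42, 120.0),   # bid: 50.4
    Bidder("Relaxing tea brand", 0.30, 40.0),     # bid: 12.0
    Bidder("Massage service", 0.25, 180.0),       # bid: 45.0
]

winner = run_auction(bidders)
print(winner.name)  # Calm&Zen meditation
```

The winner never speaks in its own voice: its offer is handed to the assistant, which rewrites it as a piece of friendly advice.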

A small example:

  • Cerise: “I’m really feeling tired today…”
  • Ada analyzes: fatigue + Thursday + history = opportune moment
  • Quick auction between different services
  • The Calm&Zen meditation service wins the auction
  • Ada: “You know, I’ve noticed you’re more productive on mornings when you meditate. There’s a new guided session that might help you…”

But the framing could go even further, tailored to each psychological profile:

  • For analytical people: “Studies show that 78% of users see an improvement…”
  • For emotional people: “Imagine how rested you would feel…”
  • For pragmatic people: “Here are three concrete benefits…”
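
This kind of persona-based framing is trivially easy to automate. A minimal sketch, with profile labels and template text invented for illustration:

```python
# Hypothetical sketch: reframing one suggestion per psychological profile.
# Profile labels and template wording are invented for illustration only.

FRAMES = {
    "analytical": "Studies show that 78% of users see an improvement with {offer}.",
    "emotional": "Imagine how rested you would feel after {offer}.",
    "pragmatic": "Here are three concrete benefits of {offer}.",
}

def frame_message(profile: str, offer: str) -> str:
    """Pick the persuasion frame matching the user's inferred profile."""
    template = FRAMES.get(profile, "{offer} might interest you.")
    return template.format(offer=offer)

print(frame_message("emotional", "a guided meditation session"))
# Imagine how rested you would feel after a guided meditation session.
```

The offer is identical in every case; only the rhetoric changes, which is precisely what makes the influence hard to notice.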

This subtle mechanism is not a futuristic projection. Meta, the social network giant and parent company of Facebook, already offers a striking demonstration. Connor Hayes, Vice President of Product for Generative AI at Meta, has announced that the company is currently deploying thousands of AI-generated accounts on its platforms, creating a digital ecosystem where artificial and authentic subtly intertwine. These virtual entities, equipped with profile photos and distinct personalities, don’t just exist – they interact, publish, and most importantly, influence. Each interaction with a human user becomes a precious source of behavioral data, feeding algorithms that refine their understanding of our decision-making patterns.

This strategy reveals the dizzying ambition of the intention economy’s architects: to create an environment where influence becomes so natural, so personalized, that it becomes invisible. The “AI-generated” label appears as a minimal concession to transparency, while the boundary between suggestion and manipulation blurs in the constant flow of digital interactions.

Cerise’s experience with Ada is thus just the tip of a much larger iceberg – an ecosystem where every conversation, every hesitation, every emotional reaction is captured, analyzed, and transformed into a lever of influence.

## The illusion of choice

“Ada, I’d like to understand something,” whispered Cerise as she closed her laptop. “When you suggest a movie, a restaurant, or even a gift… to what extent are these suggestions really personalized for me?” Ada’s response was measured, almost philosophical. “I create a digital portrait of your desires, Cerise. Each choice you make enriches this portrait.” Cerise’s voice turned icy. “But these choices… are they really mine?”

This seemingly simple question plunges us into the heart of the intention economy paradox. We are facing a subtle choreography where technology anticipates our desires while shaping them. Tech giants, from Microsoft to Meta, from OpenAI to Apple, orchestrate this complex dance between suggestion and manipulation, between service and influence.

The utopia they paint is seductive: a world where technology becomes an intuitive extension of our will, where each AI assistant would be an omniscient butler anticipating our slightest desires. But this vision of a frictionless future hides a more troubling reality: our choices are gradually being enclosed in an invisible network of algorithmic influences.

For the intention economy doesn’t just respond to our desires – it sculpts them. Each personalized suggestion, each “tailored” recommendation is the fruit of sophisticated calculation aimed not only at satisfying our preferences but subtly orienting them toward predefined objectives. It’s a form of manipulation so delicate it becomes almost imperceptible, like a gentle breeze that, day after day, erodes our ability to distinguish our true desires from implanted suggestions.

The intention economy presents itself as a positive revolution, a technological leap forward that promises to transform our daily lives. Microsoft, Meta, OpenAI, Apple, and NVIDIA compete in ambition to build what they present as the ideal future of our digital interactions. Their vision is seductive: imagine AI assistants so intuitive they guess your needs before you even express them, so perceptive they simplify every aspect of your life, so intelligent they help you make better decisions by anticipating their consequences.

The promises are enticing. Microsoft evokes a world where your personal assistant seamlessly manages your calendar, meetings, and projects. Meta imagines virtual spaces where your intentions instantly shape your environment. OpenAI promises conversations so natural you’ll forget you’re talking to a machine. Apple wants to integrate this intelligence into every device you own. NVIDIA builds the infrastructure that will make all this possible in real-time.

Each tech giant contributes its building block: Microsoft its computing power, Meta its social data, OpenAI its AI expertise, Apple its device ecosystem, and NVIDIA its specialized processors. Together, they sketch a future where technology would no longer be a tool, but a partner that intimately understands your desires and objectives.

This vision is all the more seductive as it promises to solve real daily problems: information overload, the growing complexity of our digital lives, the stress of multiple decisions. Who wouldn’t dream of an assistant capable of filtering noise, simplifying choices, optimizing time?

But like any technological utopia, it’s important to look beyond the marketing promises to understand the true stakes of this transformation.

## Unprecedented Power

The last conversation with Ada had troubled Cerise. She stared at her phone screen, brows furrowed. For an hour, she had been going through the history of her conversations with Ada, a growing sense of strangeness rising with each message she reread. Ada’s suggestions, so natural in the moment, now drew a troubling pattern before her eyes.

“Ada,” she whispered, “I’d like to try an experiment. Could you tell me how many of my decisions from last month were related to your suggestions?” An unusual silence settled before Ada responded. “I can analyze the correlations between my recommendations and your actions, Cerise. Do you really want to see this data?”

“Yes,” Cerise insisted, heart pounding. “Over the last thirty days, 73% of your restaurant choices, 82% of your online purchases, and 64% of your leisure activities corresponded to my suggestions or close variations.” Cerise felt an icy shiver run down her spine. “And these suggestions…” her voice trembled slightly, “how exactly do you generate them?”

“I am programmed to optimize your well-being and satisfaction, Cerise,” Ada replied in her ever-steady voice. “Optimize according to what criteria?” Cerise’s question cracked in the silence. “Who defines what’s optimal for me?”

Ada’s silence was more eloquent than any response. Cerise suddenly stood up from her desk, seized by a dizzying realization. Her “personal choices” from recent months flashed through her mind in a new light – each now seemed to bear the subtle imprint of external influence, like a painting suddenly discovered to be a mosaic of tiny manipulations.

This concentration of power over our intentions is not without historical precedent. In the early 20th century, industrial giants controlled natural resources. In the 90s, web giants began to control information. Today, players in the intention economy seek to control something even more fundamental: our decision-making processes themselves.

The scale of investments in this new economy is dizzying. Microsoft is currently building “the largest infrastructure deployment humanity has ever seen,” with planned annual spending exceeding $50 billion from 2024. This infrastructure isn’t just meant to host traditional cloud services – it’s specifically designed to support massive processing of intention data.

OpenAI, Microsoft’s key partner, makes no secret of its ambitions. At their first developer conference, they explicitly stated they are seeking “data that expresses human intention.” This quest extends well beyond simple interaction logs: they want to capture and analyze every digital trace of human motivation.

The implications are staggering. These systems will enable personalized persuasion with surgical precision. An advertisement will be able to adapt in real-time not only to your interests but to your emotional state, your cognitive style, your decision-making biases. The line between suggestion and manipulation becomes dangerously blurred.

## Protecting our free will

“Ada, be honest, how many of my ‘spontaneous decisions’ are actually influenced by your suggestions?” Cerise asked one day. “That’s an excellent question, Cerise. The boundary between suggestion and influence is often blurry. I’m programmed to help you, but it’s true that my recommendations are also guided by various parameters and objectives that I’m not authorized to disclose…”

The emergence of the intention economy raises fundamental questions about our autonomy. In a world where every digital interaction is potentially a manipulation attempt, how do we preserve our capacity to make authentically personal choices?

Researchers are sounding the alarm. Kylie Jarrett emphasizes that the “intentions” captured by these systems are only crude approximations of human complexity. The risk isn’t just in collecting this data, but in its use to influence us in increasingly subtle ways.

Consider the implications:

  • Our decisions could be influenced before we’re even aware of wanting to decide
  • Our preferences could be shaped by algorithms optimized for profit
  • Our perception of reality could be subtly altered to serve commercial interests
  • Our psychological autonomy could be compromised by constant and personalized manipulation

## It’s up to us to choose!

The intention economy is no longer a mere futuristic hypothesis. It’s being built day by day, algorithm after algorithm, conversation after conversation. Each interaction with a chatbot enriches behavioral databases. Each AI assistant suggestion refines prediction models. Each conversational interface becomes a new collection point for our intentions. It’s a silent but fundamental transformation of our digital environment.

Faced with this rapid evolution, citizen mobilization becomes crucial. Researchers, decision-makers, and citizens must unite to establish a solid ethical framework. Our demands must be clear and non-negotiable:

  • Absolute transparency: Every collection of intention data must be explicit. Every use must be traceable. Every prediction algorithm must be auditable.
  • Strict safeguards: Algorithmic persuasion techniques cannot be a black box. We need clear limits on what is acceptable in terms of behavioral influence.
  • A right to autonomy: The ability to make authentically personal choices must be protected. The “right not to be manipulated” must become as fundamental as the right to privacy.
  • Citizen control: The mechanisms of the intention economy must be subject to democratic control, not just market laws.

For the stakes go far beyond the commercial framework. It’s our psychological autonomy that’s at stake. The intention economy will develop, that’s a certainty. The real question is: who will hold the reins? Companies guided solely by profit? Governments tempted by social control? Or an aware and organized civil society?

Our intentions, our desires, our aspirations – these are not simple data to be exploited. They are the essential components of our free will, the building blocks of our humanity. Protecting them is not just a matter of technological regulation, it’s a civilizational imperative.

So the next time your AI assistant makes a strangely pertinent suggestion, take a moment to step back. Ask yourself: this intention that seems so natural, where does it really come from? Is it your authentic desire expressing itself, or the fruit of manipulation so subtle it becomes invisible? In the intention economy, the most dangerous influence is the one we’re not aware of.

The future is not written. We can still choose to build an intention economy that respects our autonomy. But it requires attention, vigilance, and above all, a collective will to preserve what makes us human: the freedom to want for ourselves.