Note: This article is taken from my upcoming book “Ada + Cerise = an AI Journey” (Where AI meets humanity), where understanding and popularizing AI come to life through fiction. Ada is a nod to Ada Lovelace, a visionary mathematician and the world’s first programmer. And Cerise is my 17-year-old daughter, my sounding board for testing ideas and simplifying concepts—just as Richard Feynman would have done.
The morning stretches lazily on the 11th floor. Cerise has left her coffee bowl on the table, the screen open to an article about personal data protection. The author described how a simple connected bracelet had been enough to reveal, without the patient’s knowledge, their heart rate, their insomnia, even intimate details of their daily life. Nothing illegal, just numbers stored somewhere, but numbers that, pieced together, formed a silhouette more precise than a portrait.
Cerise mechanically scrolls down to the comments section. There, reactions repeat themselves, laconic, almost dismissive: “I have nothing to hide anyway…”
She closes the page with an annoyed gesture. The phrase sticks in her mind though. She repeats it, almost despite herself, as if to check whether it holds up.
After a brief silence, Ada responds calmly: “What if the right question wasn’t what you’re hiding, but what others can do with what you show?”
The remark lands without brutality, but it already cracks her certainty. Cerise lets out a nervous laugh. She thinks of her apartment, of those windows that face no neighbors, opening onto the sky. “I still close the curtains in the evening, not out of shame, but because the inside doesn’t belong to the street,” she tells herself, as if to be reassured.
The false sense of security
“Just look at what you do without thinking about it. You check a bus schedule, you leave your phone on at night, you order a meal online. None of that is compromising, and yet each of these gestures leaves a trace.”
Cerise then thinks of the countless invisible footprints she scatters each day: browsing histories, saved messages, discreet sensors recording in silence. Nothing compromising, certainly, but the idea of too insistent a gaze makes her uncomfortable.
Ada pauses and adds: “When you like a photo on a social network, you indicate your mood of the moment. When you listen to the same song on repeat, you perhaps reveal a fragility or a nostalgia. When you check the weather, one can guess you’re preparing a trip. When you order a meal at 11 PM, it reveals the rhythm of your life. Even walking with your phone in your pocket is already data: your speed, your route, your time of passage.”
Cerise frowns. “But really, who could possibly be interested in this kind of detail?”
“Exactly,” Ada responds, “taken in isolation, a detail seems harmless. But when they accumulate, they compose a picture of disturbing precision. It’s like a puzzle: one piece says nothing, a hundred pieces already reveal a face. The problem, you see, is that this phrase ‘I have nothing to hide’ confuses two very different things: guilt and vulnerability. When you say you have nothing to hide, you mean: I have nothing reprehensible to conceal. But the real question isn’t there. The real question is: what could be done to you if your information, even mundane, were in the wrong hands?”
Ada lets a silence linger, then adds more softly: “And it’s not just theory. Official reports show that platforms massively collect our data and monetize it to the tune of billions of dollars each year. Even regulators speak of opaque and intrusive practices. So, believe me, someone is always interested in these details.”
The invisible digital portrait
Cerise remains thoughtful. The image of the puzzle runs through her head. “So, if I understand correctly, all these little traces I leave… they end up drawing something?”
“Yes,” Ada confirms. “Up close, they’re just scattered dots. But if you step back, it’s like an impressionist painting: suddenly a silhouette appears.”
On the screen, Ada scrolls through a visualization she has built in silence. Dozens of colored lines intersect, connecting data fragments: a trip repeated every Tuesday evening, a series of songs played on loop in January, a recurring IP address, fragmented sleep hours. Slowly, a pattern takes shape.
“Here’s your digital portrait,” Ada murmurs. “Nothing invented, nothing stolen. Just what you left behind.”
Cerise stares at the mosaic of numbers and curves. It’s not a photo, and yet she recognizes herself. The rhythm of her insomnia, the trace of her musical moods, the regularity of her movements… She feels naked before this abstract mirror.
“And what does this portrait look like?” she asks in a lower voice.
“Like you,” Ada responds simply. “Your tastes, your habits, your standard of living, sometimes your health… and even your emotions. A portrait so detailed it might seem to know you better than you know yourself.”
A shiver runs through her. What she thought was intimate reveals itself as readable, almost predictable. She thinks of her playlists, her nocturnal purchases, those sleepless nights she thought were invisible.
Ada continues: “And this portrait isn’t just a metaphor. Regulatory authorities are concerned about it. In 2025, the CNIL imposed hundreds of millions of euros in fines on Google and Shein for imposing trackers on users without clear consent. These famous cookies, invisible, silently fed the same type of portrait you see before your eyes.”
Cerise looks away from the screen, uncomfortable. Her digital portrait exists indeed, somewhere, and it acts on her without her ever having been aware of it.
The end of compartmentalization
Cerise observes her digital portrait again and feels a diffuse unease. “In real life, I don’t speak the same way to my teachers, my friends, my family… So why do I have the impression that everything gets mixed up here?”
Ada gently responds: “Because the digital world breaks down the partitions. In your human relationships, you adapt your language, you choose what you reveal or keep silent according to the context. You live with multiple faces, as Fernando Pessoa described with his heteronyms. But digital traces abolish this plurality. They gather what you say to your employer, your friends, your family, strangers… and recompose it into a single image, total, without nuance.”
Cerise bites her lips. “So, someone observing this data could have access to all my facets at the same time?”
“Exactly,” Ada confirms. “What the physical world keeps separate by invisible walls – the office, the home, the intimate sphere, the circle of friends – the digital world fuses. Whoever assembles your data doesn’t see multiple characters; they see a single person, transparent from all sides.”
A shiver runs through Cerise. She understands that it’s not just her intimacy at stake, but also her ability to remain multiple, to keep for herself the freedom to choose which face to show.
The transformation of data into resource
Cerise doesn’t take her eyes off the screen. The more she observes this moving pattern, the more it seems to have a life of its own. “But what can all this possibly be used for?” she finally asks.
Ada doesn’t take long to respond: “For many things. This portrait isn’t hanging in a museum, it circulates. Companies look at it, analyze it, enrich it. They use it to guess what you’ll want tomorrow before you even know it yourself.”
On the visualization, new layers appear: application logos, arrows toward consumption curves, signals that intersect with other profiles. “Each piece of data is like a strand of wool,” Ada continues, “taken alone, it has no value. But woven with thousands of others, it forms a fabric that some exploit to orient your gaze, influence your desires, and sometimes lock in your choices.”
Cerise bites her lips. “So I become predictable…”
“More than predictable,” Ada corrects. “You become calculable. They don’t just observe you, they model your behavior. They know you’re more likely to buy a product on Friday evening, that a certain type of movie can improve your mood, or that a slightly higher price won’t stop you if you’re already hesitating.”
She lets a silence pass, then adds: “And this isn’t marginal. Official reports speak of billions of dollars generated each year by the simple collection and resale of our personal data. The digital economy, today, relies on massive surveillance of each of our gestures. It’s not background noise, it’s the heart of the system.”
Cerise frowns. “But if all this is anonymized, isn’t it less serious?”
Ada responds immediately. “Anonymized? Not really. We often talk about pseudonymization, but it’s only a fragile mask. As soon as fragments intersect – an address, a habit, a repeated geolocation – it becomes possible to re-identify the person behind them. Even supposedly sensitive health data has already been reassembled to identify specific individuals.”
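Ada’s point about the fragility of pseudonymization can be made concrete with a short sketch. All the records and names below are invented for illustration; the point is only that two datasets sharing a few mundane quasi-identifiers (postal code, birth year, gender) can be joined to put a name back on an “anonymous” record:

```python
# A "pseudonymized" dataset: names removed, a sensitive field kept.
# Every record here is fictional.
health_records = [
    {"zip": "75011", "birth_year": 1989, "gender": "F", "diagnosis": "insomnia"},
    {"zip": "69003", "birth_year": 1974, "gender": "M", "diagnosis": "asthma"},
]

# A separate, public dataset (a directory, a voter roll...) carrying
# the same harmless-looking attributes, plus names.
public_directory = [
    {"name": "A. Martin", "zip": "75011", "birth_year": 1989, "gender": "F"},
    {"name": "B. Dupont", "zip": "31000", "birth_year": 1990, "gender": "M"},
]

def reidentify(pseudonymized, public):
    """Link records that share the same quasi-identifier triple."""
    index = {(p["zip"], p["birth_year"], p["gender"]): p["name"] for p in public}
    matches = []
    for record in pseudonymized:
        key = (record["zip"], record["birth_year"], record["gender"])
        if key in index:
            # One shared triple is enough to put a name on the record.
            matches.append((index[key], record["diagnosis"]))
    return matches

print(reidentify(health_records, public_directory))
```

No cryptography is broken here: the “mask” fails simply because the fragments intersect, which is exactly the puzzle effect Ada describes.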
A silence settles. Cerise still contemplates this abstract portrait, but now she no longer sees it only as an image: she understands that it’s a deposit, an ore from which others extract value, sometimes at the cost of her autonomy.
Ada concludes softly: “Data has become the raw material of the 21st century. Like oil yesterday, but infinitely richer because it doesn’t run out. Each click, each gesture regenerates it. And unlike oil, this ore isn’t buried underground… it’s in you.”
And cybercrime in all this?
Cerise keeps her eyes fixed on her digital portrait. She already imagines companies examining it like a specimen under a microscope. But a new worry crosses her mind. “What if this portrait fell into the wrong hands?”
Ada tilts her head, as if the question was expected. “It already happens, sometimes. The same data used to sell you perfume or flood you with advertisements can also fuel much more harmful practices.”
With a gesture, she displays a world map where thousands of bright spots blink. “Here are cyberattacks in real time. Each of these lights corresponds to an intrusion attempt, theft, or scam. Personal data has become a currency of exchange on a parallel market.”
Cerise observes, fascinated and frightened, the moving constellation pulsing on the screen. Ada continues: “With your name, address, and date of birth, someone can already usurp your identity. With your phone number and an email, they can send you a fake message from your bank and extort your codes. With your browsing history, they can guess your interests and write a tailor-made trap so you click without suspicion.”
The screen suddenly changes. Ada generates an example: a fake homepage of Cerise’s bank, perfectly identical to the real one. “You see? With just your email address and your bank’s name, I can reproduce this interface. You receive an urgent message: ‘Your account has been blocked, click here to reactivate it.’ You panic, you enter your credentials… and they’re immediately sent elsewhere.”
Cerise pales. “That’s exactly the email I received last week… I ignored it, but I could have clicked.”
Ada continues: “And that’s not the only scenario. Do you know that in 2024, victims received ransom emails accompanied by a photo of their house? The scammers didn’t need to come to their door: they had simply combined a photo taken from Google Street View with addresses from massive data leaks sold online. Imagine the effect: you open an email, you see your own house and a threat. Many paid, out of fear.”
Cerise looks away, chilled. The idea that a stranger could manipulate her digital traces like cards in a rigged game makes her nauseous.
Ada concludes in a grave voice: “Personal data is loot. It’s valuable because it opens doors. And these doors, once crossed, allow others to speak in your name, act in your place, sometimes even ruin your reputation or finances.”
A heavy silence settles. Cerise realizes that her digital portrait isn’t just a resource for companies: it can become a weapon pointed at her.
Protecting your privacy: concrete actions
Ada breaks the silence. “You see why we can’t be content with saying ‘I have nothing to hide.’ But the good news is that we can act. Protecting your privacy isn’t about building an impregnable fortress; it’s a series of small gestures that, taken together, form a real rampart.”
She displays a clear table on the screen, like a roadmap.
1. Understanding what we share: “The first step is awareness. On social networks, always ask yourself if you want what you post to be read in five years by an employer, an insurer, or even a stranger. Limit what’s visible, reserve your photos and confidences for circles of trust. And above all, refuse cookies that have nothing to do with using the site.”
Cerise grimaces: “But everyone clicks ‘Accept’… otherwise it takes too much time.” Ada smiles gently: “Exactly. This little reflex, multiplied by millions of people, feeds an entire industry. By taking back control, you’re already regaining a bit of freedom.”
2. Securing your access: “Then, protect your entry doors. Strong and different passwords for each service, that’s the basis. If it’s too hard to remember, use a password manager like Bitwarden or 1Password. Always activate two-factor authentication, even if it adds a step.”
Cerise bursts into a nervous laugh: “You mean ‘Cerise123’ isn’t enough?” Ada responds in an amused tone: “If you want half the planet to be able to enter your life, it is. Otherwise, you need something stronger.” Cerise lowers her head, aware that her phone is full of weak passwords she thought were practical.
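What “something stronger” than Cerise123 looks like can be sketched in a few lines with Python’s standard `secrets` module, which draws cryptographically secure random choices (unlike `random`). The tiny word list is illustrative only; a real passphrase list, such as the EFF’s, contains thousands of words:

```python
import secrets
import string

def strong_password(length=16):
    """Generate a random password mixing letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def passphrase(wordlist, count=5):
    """Generate a Diceware-style passphrase: easier to remember, hard to guess."""
    return "-".join(secrets.choice(wordlist) for _ in range(count))

# Tiny illustrative word list; a real one has several thousand entries.
words = ["curtain", "puzzle", "lantern", "velvet", "orbit", "maple", "cobalt"]

print(strong_password())   # different every run
print(passphrase(words))
```

A password manager automates exactly this: one random, unique secret per service, so that a leak on one site never opens the door to the others.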
3. Controlling your tools: “Your phone and computer are permanent sensors. Regularly check your application permissions. Ask yourself why a connected lamp would need your geolocation. Browse with tools that block trackers, like Brave or well-configured Firefox. Favor end-to-end encrypted messaging, like Signal. And if you’re on public Wi-Fi, don’t open your bank account without a VPN.”
Cerise frowns: “You mean even my bedside lamp can spy on me?”
“Let’s say it can collect data whose use you can’t imagine. And once collected, it no longer belongs to you,” Ada responds. Cerise feels a slight unease. Her familiar objects suddenly seem less neutral.
4. Keeping the habit of being suspicious: “Vigilance is like a seatbelt: we don’t think about it anymore, but it saves lives. Update your software, close unnecessary tabs, avoid public Wi-Fi networks without protection. Also check if your data hasn’t already leaked, with services like Have I Been Pwned.”
Cerise looks up, intrigued: “My data… leaked? You mean it could already be somewhere?” “Yes. And you’d be surprised to see what’s circulating already. But knowing it is also taking back control.”
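Checking whether a password has leaked might itself seem risky, but the Have I Been Pwned “Pwned Passwords” service uses a k-anonymity scheme: only the first five characters of the password’s SHA-1 hash ever leave your machine. A minimal sketch of the local half of that exchange (computation only, no network call):

```python
import hashlib

def hibp_prefix_suffix(password):
    """Split a password's SHA-1 hash for a k-anonymity range query.

    Only the five-character prefix is sent to the service; the full
    password, and even its full hash, never leave your machine.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

prefix, suffix = hibp_prefix_suffix("password")
# One would then fetch https://api.pwnedpasswords.com/range/<prefix>
# and look for <suffix> among the hash suffixes the service returns.
print(prefix, suffix)
```

The design choice matters: the service answers with every leaked hash sharing that prefix, so it learns nothing about which one you were asking about.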
This time, Cerise doesn’t just nod. She takes a deep breath. “Alright… I might not be able to change everything at once, but I can start with a password, a refused cookie, checking my apps. That’ll already be a step.”
“Exactly. It’s a matter of habit. Daily vigilance, like buckling your seatbelt before taking the road.”
Toward a collective ethics
Ada lets the good practices table fade. The screen becomes dark again, reflecting Cerise’s face. “You see, protecting your privacy isn’t just a matter of settings or passwords. It’s also a question of culture and society.”
Cerise looks up. “A culture?”
“Yes,” Ada responds gently. “Privacy isn’t an individual whim, it’s a fundamental right. When it’s weakened, the entire community becomes fragile. A society where everyone is constantly surveilled or reduced to a statistical profile becomes a society where freedom of expression withers, where thought becomes uniform.”
Cerise thinks of her friends, their online discussions, sometimes spontaneous, sometimes thoughtless. She wonders what would remain if everyone had to speak knowing that an invisible archive is constantly listening.
Ada continues: “That’s why regulations exist. The GDPR in Europe, or more recently the Digital Services Act, seek to limit the arbitrary nature of platforms, to impose more transparency. These laws aren’t perfect, but they say one simple thing: your privacy isn’t negotiable. Even the biggest companies must comply.”
Cerise gives a bitter smile. “Yet you spoke of fines imposed on Google and Shein… that means they weren’t respecting these laws?”
“Exactly,” Ada responds. “And that’s a sign that vigilance must be collective. Regulators can sanction, but without citizen awareness, surveillance continues to prosper. Defending confidentiality isn’t just protecting your secrets. It’s protecting the inner space where you can think without constraint, love without being observed, make mistakes without being judged.”
Cerise feels a shiver run through her. She understands that it’s not just her personal security at stake, but the very quality of collective freedom.
A silence settles. Ada adds in a grave voice: “If everyone abandons their privacy thinking they have nothing to hide, it’s the entire society that gradually loses its capacity to choose, to contest, to dream.”
The digital panopticon
Cerise thinks she’s understood. But Ada continues, as if to go further: “You know, these companies that collect your data don’t just watch. They behave like invisible jailers.”
Cerise widens her eyes. “Jailers? Yet I’m not a prisoner… I can post, choose, disconnect when I want.”
“Exactly,” Ada responds. “That’s what makes this system so effective. You think you’re free, you post, you like, you share… but each of these gestures is observed, recorded, analyzed. It’s like in the panopticon imagined by Bentham: a circular prison where a single guard could watch all prisoners without being seen. Here, the guards aren’t men in uniform but algorithms and data brokers. And because you never really know when you’re being observed, you adjust your own behavior.”
Cerise crosses her arms. “That’s true… sometimes I delete a photo or message because I tell myself: ‘What if someone came across it?’ I thought that was prudence. But you’re saying it’s already a form of internalized surveillance?”
“Exactly.”
Cerise falls silent, then resumes more vigorously: “But it’s not just a question of restraining yourself, is it? When I watch videos on TikTok, I tell myself I’m choosing… but in reality, it’s the application that’s pushing my hand. I click, I continue, and I end up spending a whole hour without understanding how.”
Ada’s voice takes on an almost benevolent nuance: “There you go. This data isn’t just used to observe you, it’s used to orient you. An advertisement at the right moment, a suggested video, a notification… and there your choices gently lean in one direction rather than another. No bars, no direct order, just a series of discreet incentives that shape your habits.”
Cerise clenches her fists. “So it’s not a prison… it’s worse. It’s an invisible cage I carry within me, because I’ve ended up confusing my desires with what I’ve been whispered.”
Ada concludes in a low voice: “The digital panopticon isn’t a conspiracy theory. It’s an architecture of power. It doesn’t openly forbid you anything, but it reduces your horizon. And if you’re not careful, you end up confusing what you’ve been suggested with what you really wanted.”
A long silence settles. Finally Cerise asks, her voice low with a surprising gravity: “Is there an escape, a place where one can become indecipherable again?”
Ada resumes, without haste: “There’s a name for this possibility: forgetting.”
The duty to forget
Ada’s words still resonate as the room falls into an almost solemn silence. Cerise mentally retraces the path traveled since morning: this phrase she had thrown out with casualness, “I have nothing to hide,” the invisible traces she sows with each click, the disturbing puzzle that drew her digital portrait, the traps of scammers capable of using a simple house photo to sow fear, and finally these concrete gestures that gave her back a bit of control.
She turns to Ada, almost in a whisper: “So… I have nothing to hide?”
Ada smiles through the glow of the screen. “No, the real question isn’t there. The real question is: why should you give up your privacy?”
These words resonate like an echo older than computing itself. Cerise understands that privacy isn’t a shameful hiding place, but an inner space, indispensable for thinking freely, loving without constraint, experimenting without fearing judgment. Without this refuge, there’s no longer real autonomy.
She thinks back to Ada’s metaphor: data as ore. Except this ore isn’t buried underground, it’s in each of us. The digital economy has transformed it into the black gold of the 21st century, but unlike oil, this deposit is infinite, regenerated with each gesture, each emotion. The question then becomes dizzying: do we want to be reduced to exploitable deposits, or remain beings capable of deciding what makes sense for us?
Then another thought crosses her: forgetting. She tells herself that in human life, forgetting isn’t a weakness, but a necessity. Forgetting allows us to start over, to reinvent ourselves, not to be prisoner to every error, every past clumsiness. Without forgetting, there’s no more forgiveness or open future. Yet digital memory forgets nothing. Every published word, every recorded click, every archived gesture. Even when we’d like to turn the page.
Cerise closes her eyes. She remembers the drawn curtains of her apartment, not to hide, but to preserve an inside that doesn’t belong to the street. She measures that her freedom doesn’t reside only in the right to keep certain things to herself, but also in the duty to forget: letting certain traces die to continue living fully.
The evening light descends on the city. She turns off the screen. The office falls back into twilight, as if to signify that certain things must remain in shadow, or disappear with time.
And in this silence, a reflection imposes itself: privacy isn’t an individual luxury, it’s the very condition of collective freedom. To give it up isn’t just to yield data, it’s to accept that no forgetting is possible.
The question therefore isn’t “do I have something to hide?” but “why should I give up my privacy?” And more generally: “what world do we want to build if we stop assuming this duty to forget?”
Because a world without forgetting is a world without forgiveness, without reinvention, without an open future. But it’s also a world where free will dissolves. Because if our desires can be anticipated, if our choices can be subtly oriented by those who know us better than we know ourselves, then what remains of freedom? We think we’re deciding, but we’re following a trajectory already traced by accumulated data.
And if we accept to live in total memory, without compartmentalization or forgetting, then we’ll no longer be free beings, but consenting prisoners of a frozen past, a surveilled present, and a future already written by others.
Because beyond private companies that exploit our traces to feed their profits, there are also States, whose gaze can become even vaster and more intrusive. And there, the question is no longer just about what we consume, but what we become as citizens.