Or the slow death of our digital utopias
Not so long ago, going online felt like a small daily enchantment. You’d open your computer the way you’d crack open a window onto somewhere new. The web resembled an open frontier, a still poorly mapped territory where you could lose yourself with delight. You’d stumble across an old school friend on a somewhat clunky Facebook, type a few words into Google and, in return, the world would unfold before you, almost docile, through a few links whose relevance made it feel like a universal library had just landed in your living room. That internet wasn’t perfect, but it carried within it a quiet promise, that of a low-key utopia, plugged into a simple electrical outlet.
For a brief moment, this digital world still felt like an extension of our intimate space. Computers held our music, our documents, our photos, like a wooden box where you store what matters. It was the era when computing remained personal: our archives lived at home, in a corner of the hard drive, not in some distant cloud. Tariq Krim described this interlude as the “analog internet”, a fragile moment when we truly owned what we created. You could still sense the continuity between our physical and digital lives, as if the keyboard were just a natural extension of our shelves and notebooks. This continuity didn’t vanish overnight; it dissolved slowly, almost soundlessly.
Then, without us noticing, the scenery gradually shifted. The same window now opens onto a saturated landscape. News feeds resemble loud shopping malls, where “recommendations” jostle to capture our attention, burying the rare updates from friends. Search results have slowly stopped leading us toward knowledge, guiding us instead toward omnipresent merchandise packaged as pseudo-information.
Today, internet users have become reserves of attention to be managed, segmented, monetized.
This diffuse shift has a name that speaks volumes: ENSHITTIFICATION.
This word, coined by writer Cory Doctorow, describes how a platform ends up poisoning what it had initially made desirable.
This term isn’t just another insult in the digital lexicon—it’s a remarkably effective analytical tool for understanding how so many services we loved have transformed into “useless but indispensable garbage”.
To understand how we got here, we first need to dissect the mechanism itself.
Anatomy of a programmed rot
Enshittification isn’t a failure—it’s a well-oiled mechanism, patient, unfolding in three acts like a play whose final scene is written from the moment the curtain rises. Cory Doctorow described it as a cycle, almost a law of thermodynamics applied to platforms.
- First comes attraction, a phase that often resembles a gift. The platform shows itself generous, almost lavish. Everything is smooth, efficient, free or nearly so. You feel like you’ve found the perfect tool, one that simplifies everything without asking for anything in return. If this abundance seems improbable, it’s because it is: it’s loss-funded, supported by investors who are buying time, waiting for network effects to do their work. The more users flock in, the more unthinkable it becomes to leave. This is when the service stops being a choice to become a habit, then a small dependency. The entrance door closes silently behind us.
- When the critical mass of users is reached, exploitation begins. The platform turns toward those who will pay to reach this captive audience: advertisers, sellers, creators. To seduce them, it subtly reconfigures the experience. Recommendations become more insistent, sponsored content slides to the forefront. On Amazon, it’s no longer the best products that appear, but those whose sellers have paid for podium access. On Facebook, the feed becomes a succession of ads whose logic escapes the visitor. Same for LinkedIn. The user is no longer at the heart of the system—they become the fuel that powers a commercial machine.
- When businesses themselves have become dependent on the platform to exist, comes extraction. The power dynamic shifts brutally. Amazon raises its commissions. Meta stifles organic reach and makes you pay to reach the audiences you’d already won. Every gesture, every bit of visibility, every interaction becomes a line item. The entire ecosystem loses quality, but the platform, protected by the absence of real competition, reaps most of the value. Everyone pays, except them.
After some time, a fourth stage arrives: death.
It comes slowly, like accumulating fatigue. When a viable alternative appears, or when the degradation becomes unbearable, users flee. Commercial clients follow. The platform collapses, emptied of its substance, sometimes after prospering for years on an experience it had itself destroyed. But the agony can last a long time, long enough for millions of people to suffer daily through a mediocre service without being able to truly escape it.
If this entire cycle holds together so well, it’s because the market itself contributes to it. Information asymmetry, negative externalities, the progressive degradation of a common good like information quality—all of this forms a kind of structural trap.
It’s not a moral drama, even if we might be tempted to see it that way. It’s even, from a certain viewpoint, implacable economic rationality: a digital version of what economists call “penetration pricing,” that old strategy of selling at a loss to kill the competition, then raising prices once the monopoly is secured. Paul Krugman recently described this mechanism in what he calls his “general theory of enshittification.” Except that in the digital world, the exit costs are immense.
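The penetration-pricing logic can be sketched with toy numbers (all figures below are invented for illustration, not taken from any real platform): sell below cost while competitors exist, then raise prices once they are gone.

```python
# Toy model of penetration pricing (all numbers invented for illustration).
# Phase 1: sell below cost to drive out competitors; phase 2: monopoly pricing.

COST_PER_UNIT = 10.0

def cumulative_profit(price_phase1, price_phase2, years_phase1, years_phase2, units_per_year):
    """Cumulative profit across both phases, given a constant unit cost."""
    loss_phase = (price_phase1 - COST_PER_UNIT) * units_per_year * years_phase1
    gain_phase = (price_phase2 - COST_PER_UNIT) * units_per_year * years_phase2
    return loss_phase + gain_phase

# Five years sold at a loss, then five years at monopoly prices.
phase1_only = cumulative_profit(8.0, 8.0, 5, 0, 1_000_000)   # attraction alone: deep in the red
both_phases = cumulative_profit(8.0, 16.0, 5, 5, 1_000_000)  # extraction pays back the subsidy

print(f"Phase 1 alone: {phase1_only:,.0f}")
print(f"Both phases:   {both_phases:,.0f}")
```

The subsidy only makes sense if the exit costs keep customers captive long enough for phase 2 to run, which is exactly what digital lock-in guarantees.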
This asymmetry gives platforms unprecedented power. Leaving Facebook means giving up years of memories. Leaving Amazon means disappearing from the market. This type of lock-in used to take time. Years, sometimes. Time for habits to form, for alternatives to disappear, for dependency to settle in slowly.
But something has changed. As digital tools infiltrated organizations and their protocols, this logic began to seep through them, as if by capillary action. And above all, one seemingly innocuous technical element transformed this slow mechanism into an almost instantaneous phenomenon: APIs.
An API is a small opening a platform provides so that another service can talk to it automatically. Imagine a kind of window through which two machines exchange very short messages, like: “show this ad,” “classify this user,” “send me data on their behavior.” What a human would do in several seconds, they do in a fraction of a second, without noise or witnesses.
We celebrated these little windows as a revolution in simplicity. They allow systems that would otherwise never understand each other to connect. They automate repetitive tasks. They orchestrate millions of simultaneous actions. In short, they become the invisible pipes that support digital effervescence.
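The kind of machine-to-machine exchange described above can be sketched as a toy in-process service trading short JSON messages (the “endpoint” and its message fields are invented for illustration, not any real platform’s API):

```python
import json

# Toy sketch of an API exchange: two machines trading short messages.
# The actions and fields here are invented for illustration.

def platform_api(request: str) -> str:
    """A pretend platform endpoint: receives a JSON message, answers instantly."""
    msg = json.loads(request)
    if msg["action"] == "classify_user":
        return json.dumps({"user_id": msg["user_id"], "segment": "high_value"})
    if msg["action"] == "show_ad":
        return json.dumps({"status": "served", "slot": msg["slot"]})
    return json.dumps({"error": "unknown action"})

# What a human would do in seconds, machines do in microseconds, silently:
reply = platform_api(json.dumps({"action": "classify_user", "user_id": 42}))
print(reply)  # {"user_id": 42, "segment": "high_value"}
```

Nothing in this exchange has a slot for the human on the other end: the messages count clicks and segments, never weariness.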
But these pipes rest on a seductive, almost naive assumption: that a digital interaction would be neutral, without fatigue, without impact. A click would weigh nothing, an ad would have no effect, a poorly targeted recommendation would only cause slight annoyance. In reality, each interaction carries a tiny but very real human cost. The intrusive ad irritates. The absurd suggestion tires. The impression of being reduced to a series of metrics gnaws away at trust.
APIs have made these costs invisible, and above all, they’ve externalized them. Algorithms see neither weariness, nor loss of connection, nor psychological wear. They only count what can be measured. And what can’t be measured doesn’t exist.
Technical efficiency then becomes an amplifier, a kind of fuel that accelerates the process instead of containing it.
Digital contagion
When machines can degrade the experience of millions of people in a fraction of a second, it’s no longer just an ergonomics problem. It’s a change of scale, almost a change of state. And this logic, once installed, never stays confined to its first territories. It spreads, like a way of thinking that has become reflexive. Enshittification then overflows from the platforms that invented it to contaminate entire swaths of digital, then the rest of the economy.
Once you understand this mechanism, you recognize it everywhere. It’s dizzying at first, then overwhelming. What seemed like isolated dysfunctions (a service degrading here, a platform becoming greedy there) suddenly reveals a common pattern, almost a grammar of deterioration. Enshittification isn’t an accident that repeats itself—it’s a method that replicates.
The list of affected sectors now resembles a Prévert-style inventory, but without the poetry.
- Amazon started as a giant bookstore that promised abundance. Today, it’s a market for sponsored products where visibility is bought and where sellers sometimes see up to half their revenues disappear into commissions and mandatory services.
- Audible, which reigns over audiobooks, locks each title behind mandatory DRM. Result: leaving the platform means losing your library. It’s no longer a high exit cost—it’s a soft but implacable form of dispossession. Digital should liberate; here it serves as a padlock.
- Netflix, after conquering the world thanks to a simple, ad-free offer, is progressively introducing what its original model promised to eliminate. The captive customer finances the mutation of the service they had precisely chosen for the opposite.
- Uber and Airbnb, once symbols of low-cost comfort, have become more expensive, less reliable, more opaque. The initial advantage wasn’t a model but a temporary subsidy.
- Online retail giants like Shein, Temu or AliExpress push the logic even further: a daily avalanche of cheap products, renewed by the minute, with no regard for the environment or quality. Here, enshittification doesn’t just degrade the experience—it transforms our very relationship with objects. Everything becomes disposable, including attention.
- Generative artificial intelligence, whether ChatGPT, Claude, Gemini or their derivatives, extends this logic to an unprecedented level. What was supposed to facilitate access to knowledge becomes an engine of saturation: automated articles, standardized images, content produced assembly-line style. The mechanism described by Steve Bannon, “flood the zone with shit,” finds an industrial dimension here. Overabundance no longer just clutters the page—it blurs the entire horizon: nothing really stands out anymore.
- The contamination even reaches less visible but economically critical domains. The online market research industry, analyzed by JD Deitch, offers a clear example. These platforms, tasked with providing respondents for surveys, followed exactly the same triptych. They first attracted participants with simple rewards, then promised businesses abundant and cheap data. Finally, they sought margins by automatically routing the same respondents to dozens of studies, until exhausting them. Tired, rushed, some start answering anything just to get their meager compensation.
The result is a silent collapse. The pool of reliable respondents dries up. Data becomes hazardous. Decisions relying on this data degrade in turn. And yet, the system holds up, because no one individually has an interest in challenging it. It’s a tragedy of the commons applied to the very foundations of commercial knowledge.
When Enshittification attacks reality…
What makes the situation truly concerning is that this logic no longer stops at screens. Now it infiltrates our daily lives, as if a way of thinking (extract first, optimize later, never repair) had found a path into the physical world. Enshittification then becomes less a technological drift than a social diagnosis.
… expertise becomes just another commodity!
This migration of the problem first touches what we thought protected: professions based on experience, judgment, encounter. Antonio Casilli and Bernard Stiegler had anticipated the contours through a simple and chilling idea: the proletarianization of knowledge. In their perspective, digital doesn’t relieve humans—it strips them. Professional gestures are broken down, simplified, packaged into protocols designed to be applied quickly and uniformly.
Expert work then transforms into click work. The doctor, the researcher, the engineer, the teacher: each becomes the operator of a system that dictates which box to check. The heart of their profession (interpretation, nuance, decision-making) fades behind performance indicators designed to meet platform objectives rather than field needs. It’s enshittification applied to humans: attract talent, squeeze them like a lemon, then reduce their expertise to a standardized protocol.
And this isn’t a theoretical abstraction. This mechanism takes concrete form, sometimes brutally, in two domains we thought beyond reach: medicine and science.
… the programmed decomposition of general medicine accelerates!
In the editorial published in January 2025 in the British Journal of General Practice, Dr. Euan Lawson unfolds a narrative that resembles less a medical analysis than an institutional autopsy. The title “The enshittification of general practice” had all the makings of a provocative gesture. But the twenty-eight months of investigation presented behind that title give the opposite impression: the word is almost too weak.
Lawson and his team observed, week after week, twelve practices across the United Kingdom, from rural England to densely populated suburbs. Everywhere, the symptoms were the same, as if an administrative virus had circulated from one practice to another.
The first shock goes back to 2004. With the introduction of the Quality and Outcomes Framework (QOF), general practitioners shifted into a new universe: that of the codified consultation. Everything, absolutely everything, gets translated into indicators. Intuition, relationship, clinical examination, delicate discussion: all become boxes to check, each with its scale, its thresholds, its penalties. Lawson speaks of an “act of voluntary surrender.” By accepting this system, doctors handed over their professional autonomy to an accounting bureaucracy convinced that care quality can be measured like assembly-line productivity.
The result is a slow but profound transformation of clinical practice. The general practitioner, once the pivot of the care relationship, finds themselves caught in a double vise. On one side, successive governments multiply requirements: each year, new targets, new forms, new obligations are added to an already hypertrophied system. On the other, digital tools meant to help (triage platforms, enriched electronic records, teleconsultations) add an additional layer of friction. The study shows that for certain tasks, it now takes five separate screens to accomplish what a doctor previously did in a few minutes with a paper notebook. Technical progress reverses medical progress: it gives the appearance of efficiency while undermining the essence of the therapeutic relationship.
Then comes the staff shortage, the true catalyst of the crisis. To “free up doctor time,” the system massively redistributes tasks. Simple consultations, once the natural breathing rhythm of the practice, are entrusted to clinical assistants, specialized nurses or pharmacists. On paper, it’s logical. In practice, it’s a time bomb. Deprived of these simple cases, which structured the day’s rhythm and made it possible to maintain an overall view of patients, the doctor finds themselves permanently exposed to complex cases: poly-pathologies, uncertain diagnoses, fragile social situations.
Lawson calls this phenomenon the complexity loop. As easy cases disappear, each consultation becomes longer, more demanding, riskier. The more complexity the doctor treats, the more exhausted they become. The more exhausted, the higher the risk of error. The higher the risk, the more protocols supervisors add. And so on. A spiral with no exit.
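The complexity loop is a feedback spiral, and a toy simulation makes its shape visible (all parameters below are invented for illustration, not estimated from Lawson’s data): remove the easy cases, and accumulated fatigue pushes error risk upward.

```python
# Toy simulation of the "complexity loop" (all parameters invented for illustration):
# as easy cases are removed, average consultation difficulty rises,
# exhaustion accumulates faster than it recovers, and error risk climbs with it.

def simulate(share_easy_cases: float, weeks: int = 20) -> float:
    """Return the error-risk level after `weeks`, given the share of easy cases kept."""
    exhaustion = 0.0
    error_risk = 0.0
    for _ in range(weeks):
        avg_difficulty = 1.0 - 0.6 * share_easy_cases   # fewer easy cases -> harder average day
        exhaustion = 0.8 * exhaustion + avg_difficulty  # fatigue accumulates, partially recovers
        error_risk = 0.05 * exhaustion                  # risk scales with accumulated fatigue
    return error_risk

print(f"Half the easy cases kept: risk {simulate(0.5):.3f}")
print(f"All easy cases removed:   risk {simulate(0.0):.3f}")
```

The point of the sketch is qualitative, not numerical: the system settles at a permanently higher fatigue level once the “breathing” cases are gone, which is exactly the spiral Lawson describes.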
The study also points to another paradox: digital triage tools, introduced with the promise of smoothing pathways, actually create new forms of inefficiency. Some patients multiply entries into the system, triggering unnecessary consultations. Others, conversely, are directed to circuits that ignore their actual situation. Algorithmic optimization, conceived as a filter, acts as additional fog.
And behind this structural degradation appears a colossal human cost. A report from the Institute of Health Equity published in January 2024 estimates that one million premature deaths in the United Kingdom since 2010 can be attributed to austerity policies. Lawson is explicit: these choices aren’t accidents. They stem from a managerial vision of care, where every minute must be optimized, every act accounted for, every expense justified by an indicator.
General medicine, once the cornerstone of the British health system, thus finds itself reduced to an impoverished mechanism. Enshittification doesn’t produce degraded news feeds or mediocre search results here. It produces professional burnout, delayed diagnoses, fragmented care. It produces shortened lives. It’s the same process as digital platforms, but this time applied to a domain where the consequence is measured in years of existence.
… and science drowns in paper mills and zombie journals
If medicine bends under protocols, science suffers a more insidious form of degradation. In a 2024 editorial, Toomas Timpka describes a landscape that researchers now see daily: that of scientific publication transformed into a market, saturated with dubious articles, zombie journals and clandestine workshops capable of producing fake research assembly-line style.
The story begins, however, with a beautiful idea: open access. Publish freely, share knowledge, break down paywalls. Then journals slid, almost imperceptibly, toward a logic of volume. The number of publications becomes a commercial argument. Deadlines accelerate. Peer review grows lighter. Editorial highways appear, where publication depends more on the fees paid than on rigor.
It’s in these cracks that paper mills proliferate. Their trade: manufacturing articles that imitate science. Plausible graphics, coherent tables, well-formatted references, invented but credible results. For a rushed reader, the illusion holds. For a detection algorithm, it passes.
The arrival of generative AI transformed these artisanal workshops into industrial chains. What took several days is now written in hours. In 2023, more than ten thousand articles were retracted worldwide, eight thousand from a single publisher, Hindawi. A hemorrhage that led to the closure of several journals and swallowed millions.
But the real danger lies elsewhere. These AI models train on authentic scientific literature, digest its style, recompose plausible texts… then reinject their fakes into the database. The contamination becomes circular: the more fake articles exist, the more models produce them, and the more they produce, the more they pollute the literature. It’s epistemic cannibalism, a self-feeding circle.
And retracted articles never truly disappear. They remain indexed, cited, picked up in analyses, like toxins that continue to move through the scientific food chain long after their appearance.
The trajectory is identical to digital platforms: an initial promise, a progressive shift toward quantity, then increasingly aggressive extraction. The result isn’t the disappearance of science, but its slow drowning, in an ocean of plausible but false texts that imitate rigor while betraying its spirit.
When lucidity becomes a trap, and how to escape it
At this point, you might think collective lucidity would be the first step toward a surge. The phenomenon is documented, discussed, dissected. Enshittification has become a conference keyword, a panel topic, an analysis column. Everything seems ripe for a groundswell.
And yet, nothing changes.
Ben Hunt put a name to this inertia: the Common Knowledge Problem. The problem isn’t just that everyone knows. It’s that everyone knows that everyone knows. This additional layer of consciousness creates a strange atmosphere where total visibility of the disaster no longer leads to action, but to a form of social torpor. We talk about the problem instead of solving it. We confuse diagnosis and treatment. Discourse becomes a substitute for engagement.
To this cognitive paralysis is added an economic lock. In a system based on extraction and permanent competition, any isolated resistance attempt is immediately punitive:
- the Amazon seller who refuses to pay loses visibility,
- the scientific journal that strengthens its review loses its authors,
- the study platform that treats its respondents better loses its clients,
- the user who leaves Facebook loses their social connections.
Everyone sees what should be done collectively, but everyone is penalized if they act alone. It’s a prisoner’s dilemma on a planetary scale.
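The trap can be written down as a toy payoff table (numbers invented for illustration; strictly speaking this structure is closer to a coordination game than a pure prisoner’s dilemma, since everyone would gladly follow a successful exodus):

```python
# Toy payoff table for leaving a dominant platform
# (all numbers invented for illustration).

# Payoff to one actor given (my_choice, what_everyone_else_does).
PAYOFFS = {
    ("stay",  "stay"):   1,  # degraded service, but still connected
    ("leave", "stay"):  -5,  # lone defector: loses audience, sales, contacts
    ("stay",  "leave"):  0,  # left behind on an emptying platform
    ("leave", "leave"):  3,  # coordinated exit: everyone lands together elsewhere
}

# Acting alone is punished...
assert PAYOFFS[("leave", "stay")] < PAYOFFS[("stay", "stay")]
# ...even though collective departure beats collective captivity.
assert PAYOFFS[("leave", "leave")] > PAYOFFS[("stay", "stay")]
print("Individually rational: stay. Collectively rational: leave together.")
```

Interoperability, discussed below, attacks the worst cell of this table: it shrinks the penalty for leaving alone, which is what makes coordination possible at all.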
So how do we break this circle?
The answer doesn’t lie in awareness (it’s already there) but in creating real room for maneuver. The first, most structural one, is what Cory Doctorow has been defending for years: making departure possible. In a word, interoperability.
Being able to leave a service without leaving your digital life behind. Export your messages, contacts, archives. Continue a conversation elsewhere. Return to users what platforms have methodically transformed into captivity. Interoperability isn’t a technical gadget: it’s a soft but formidably effective form of counter-power.
To this foundation can be added concrete levers, often neglected because they’re neither spectacular nor media-friendly:
- real, usable, standardized portability,
- a return to an internet where intelligence resides at the edges, not the center,
- anti-monopoly law applied without trembling,
- sustainable public funding for open infrastructures.
These are slow, technical, sometimes thankless battles. But they’re the ones that, patiently, crack the extraction structures that make enshittification inevitable.
On an individual scale, gestures remain possible:
- migrate to Signal, Mastodon, or free services,
- support projects that care for digital commons,
- pay for virtuous services,
- choose partial desertion over total resignation.
On a collective scale, we must continue to nourish what resists: Framasoft, Wikimedia, Nextcloud, Mastodon. Not because they’re perfect, but because they endure. And enduring, in the economy of enshittification, is already a form of victory.
It would be tempting to believe that this patient work of alternatives, interoperability and common structures will never be enough to reverse the system’s inertia. And yet, it’s precisely in this modest, continuous, sometimes almost invisible effort that a decisive part of the digital future is being played out. Enshittification prospers where everything is equivalent, where exhaustion replaces vigilance, where resignation slips into the interstices of daily life. It advances less by force than by habit, less by domination than by the progressive abandonment of all demands.
Resisting this slope requires neither heroism nor a grand upheaval. It simply requires not letting wear dictate the shape of our world. Accepting that certain trajectories demand to be slowly straightened, with a constancy that platforms haven’t anticipated and that their economic models don’t integrate. Nothing obliges us to settle for diminished tools, impoverished services, interfaces made hostile by calculation. Nothing prevents preferring quality to ease, attention to automation, continuity to capture.
This choice is far from abstract. It plays out in how we direct our uses, in the initiatives we support, in the infrastructures that deserve protection. It weaves through these discreet spaces where communities refuse to let digital be reduced to a trap, in these projects that hold firm despite pressure, in these fragments of internet that continue to carry another idea of progress.
Enshittification won’t disappear on its own, and nothing indicates it will retreat everywhere with the same intensity. But it doesn’t constitute an insurmountable fate either. As long as there exist places where rigor, transmission and cooperation aren’t slogans but practices, as long as there remain actors to defend a certain idea of what a digital space worthy of the name could be, programmed degradation can never fully impose itself.
We sometimes forget that the digital world hasn’t always had this formlessness, this sense that every gesture leaves a trace stored elsewhere, in a space we no longer control. There was a time when our digital lives resembled a workshop: folders we organized, photos we kept, music we carried. Nothing was perfect, but it all still belonged to us, with the simplicity of an object held in hand.
Remembering this analog age has nothing nostalgic about it. It’s a way of recalling that another relationship with digital is possible, based on continuity, mastery and a certain respect for time. Enshittification prospers when everything becomes interchangeable, when our traces dissolve into flows that no longer have an interior. Restoring an anchor, even fragile, even partial, means reopening a space where we can breathe and think differently.
It may not be a brilliant victory, nor a sudden reversal, but a way of preserving the possibility of another future. A future that wouldn’t be dictated solely by extraction, but by a choice, even minuscule, repeated day after day: that of not resigning ourselves to mediocrity. And in this stubborn, almost artisanal vigilance, lies perhaps the only way to recall that digital can still be a matter of our choice, and not merely our acquiescence.
References
For meticulous minds, lovers of figures and sleepless nights verifying sources, here are the links that nourished this article. They recall a simple thing: information still exists, provided we take the time to read it, compare it and understand it. But in the near future, this simple gesture may become a luxury, because as texts generated entirely by AI multiply, the real risk is no longer disinformation, but the dilution of reality in an ocean of merely plausible content.
American Dialect Society (2024). 2023 Word of the Year is “enshittification”. (https://www.americandialect.org/2023-word-of-the-year-is-enshittification)
Buschini, P. (2023, October 23). Think as a Service or the proletarianization of knowledge. buschini.com.
Casilli, A. (2019). Waiting for robots: An inquiry into click work. Seuil.
European Commission. “The Digital Markets Act: ensuring fair and open digital markets”. (https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-markets-act-ensuring-fair-and-open-digital-markets_en)
Deitch, J.D. (2024). The Enshittification of Programmatic Sampling: How Buyers and Sellers Can Navigate Market Failures for Better Data and Healthier Panels. E-book. An in-depth analysis of market failures in the online research industry. (https://cal.com/jddeitch/sample)
Doctorow, C. (2023, January 23). The ‘Enshittification’ of TikTok. WIRED. (https://www.wired.com/story/tiktok-platforms-cory-doctorow/)
Doctorow, C. (2024, January). ‘Enshittification’ is coming for absolutely everything. Financial Times. (https://www.ft.com/content/6fb1602d-a08b-4a8c-bac0-047b7d64aba5) [Doctorow uses the terms “enshittocene” and “enshittificatory”]
Doctorow, C. (2023, April). As Platforms Decay, Let’s Put Users First. Electronic Frontier Foundation. (https://www.eff.org/deeplinks/2023/04/platforms-decay-lets-put-users-first) [On the end-to-end principle and technical solutions]
Framasoft. “Dégooglisons Internet”. (https://degooglisons-internet.org/)
Haugen, F. (2021). The “Facebook Papers”. Revelations widely covered by media such as The Wall Street Journal (“The Facebook Files”).
Hunt, B. (Various). Epsilon Theory. Analyses on market narratives and the concept of “Common Knowledge”. (https://www.epsilontheory.com/)
Institute for Local Self-Reliance (2022). Amazon’s Monopoly Tollbooth. (https://ilsr.org/amazons-monopoly-tollbooth/)
Institute of Health Equity, University College London (2024, January). Health inequalities, lives cut short. Cited by Lawson (2025). (https://www.instituteofhealthequity.org/resources-reports/health-inequalities-lives-cut-short)
Krugman, P. (2025, July). The General Theory of Enshittification. Paul Krugman Substack.
Lawson, E. (2025, January). The enshittification of general practice. British Journal of General Practice, 75(750), 1-48. DOI: https://doi.org/10.3399/bjgp25X740361. This editorial applies Doctorow’s concept to the British health system.
Le Grand Continent (2024). How Trump wants to reshape Europe into a far-right vassal through digital means. (Analysis based on a LinkedIn post referencing this article).
Liverpool, L. (2023). AI intensifies fight against ‘paper mills’ that churn out fake research. Nature, 618(7964), 222-223. doi:10.1038/d41586-023-01780-w
Masnick, M. (2023, July 5). It Turns Out Elon Is Speedrunning The Enshittification Learning Curve, Not The Content Moderation One. Techdirt. (https://www.techdirt.com/2023/07/05/)
Payne, R., Dakin, F., MacIver, E., et al. (2024). Challenges to quality in contemporary, hybrid general practice: a multi-site longitudinal case study. British Journal of General Practice. doi:10.3399/BJGP.2024.0184. [Case study cited by Lawson on the degradation of medical practices]
Stiegler, B. (2015). The Automatic Society, 1. The Future of Work. Fayard.
Stone, B. (2025, June 3). Enshittification: What is it and why is it coming for artificial intelligence?. brianwstone.com.
Krim, T. (2025). How we lost analog internet. (https://www.cybernetica.fr/comment-nous-avons-perdu-linternet-analogique/)
Timpka, T. (2024, August 27). The “enshittification” of online information services obligates rigorous management of scientific journals. Journal of Science and Medicine in Sport.
Van Noorden, R. (2023). More than 10,000 research papers were retracted in 2023, a new record. Nature, 624, 479-481.
Wikipedia. Cory Doctorow. (https://en.wikipedia.org/wiki/Cory_Doctorow)
Wikipedia. Merdification. (https://fr.wikipedia.org/wiki/Merdification)
Wright, J.C. (2023). Stakeholder Management in Change Initiatives: Reddit Changes Its API Pricing. London: SAGE. [Case study on Reddit’s enshittification]
Wu, T. (2018). The Curse of Bigness: Antitrust in the New Gilded Age. Columbia Global Reports.
Zuckgraf, R. (2024, January). Airbnb Was Supposed to Save Capitalism. Instead, It Just Devolved Into Garbage. Jacobin. (https://jacobin.com/2024/01/airbnb-big-tech-hotels-travel-sharing-economy-capitalism)
