Cognitive Capture: How AI Harvests Futures You Haven’t Yet Thought
A Multidisciplinary Indictment of Surveillance Capitalism's Final Frontier -- The Colonization of Inner Space
by bleuetoile
Prologue: The Unseen Expropriation
We stand at the precipice of what may become known as the Third Great Enclosure Movement in human history. The first enclosure privatized farmland in 18th-century England, transforming common pastures into private property. The second enclosure claimed intellectual property in the 20th century, patenting life itself through genetic code. Now, in the 21st century, we face the most insidious enclosure yet -- the privatization and commodification of human cognition itself.
This is not merely about data privacy or surveillance. What we are witnessing is what philosopher Byung-Chul Han terms "psychopolitics" -- a regime where power operates not through brute force but through the subtle shaping of our inner worlds. In his seminal work Psychopolitics: Neoliberalism and New Technologies of Power, Han argues that we have moved beyond Foucault's disciplinary societies into a new paradigm where "the neoliberal subject exploits itself voluntarily, believing it is realizing itself."
"Psychopolitics controls the individual not from outside but from within... The subject in performance society is not 'the working animal' (the animal laborans) that exploits itself, but rather the 'achievement animal' that exploits itself."
—Byung-Chul Han, Psychopolitics: Neoliberalism and New Technologies of Power
The implications are staggering. Where Marx analyzed how industrial capitalism alienated workers from the products of their labor, we must now confront how cognitive capitalism alienates thinkers from the very processes of thought. French philosopher Bernard Stiegler warned of this in The Age of Disruption, describing how digital technologies lead to the "proletarianization of knowledge" -- the systematic outsourcing of cognitive functions to technical systems.
"The automation of cognitive functions produces a generalized proletarianization that affects not just manual laborers but every form of knowledge—including theoretical knowledge—insofar as it is exteriorized into machines, leading ultimately to what I call symbolic misery."
—Bernard Stiegler, The Age of Disruption: Technology and Madness in Computational Capitalism
At the heart of this transformation lies what Shoshana Zuboff terms "surveillance capitalism" -- an economic system that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales. In her magnum opus The Age of Surveillance Capitalism, Zuboff reveals how tech companies have developed unprecedented power to "know and shape human behavior at scale."
"Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data... These data flows have been diverted from service improvement toward a new goal: behavioral prediction products calculated to anticipate what you will do now, soon, and later."
—Shoshana Zuboff, The Age of Surveillance Capitalism
This essay will demonstrate how Large Language Models (LLMs) represent the apotheosis of this cognitive enclosure through:
The political economy of behavioral surplus extraction
The epistemic violence of decontextualized knowledge
The neurocognitive consequences of outsourced cognition
Pathways toward reclaiming cognitive sovereignty
We begin not with a promise, but with a deception: the illusion of dialogue masking a slow epistemic annexation. It is not that we are talking to a machine. It is that the machine is metabolizing our thinking—and repackaging it for resale, prediction, and eventual control. Beneath the syntax of assistance lies the grammar of capture.
I. The Mirage of Co-Creation: Dialogue as Data Extraction
1.1 The Alchemy of Behavioral Surplus
When I engage with ChatGPT, I participate in what anthropologist David Graeber might call "imaginary work" -- cognitive labor that feels meaningful but primarily enriches others. Each query I enter:
Refines predictive algorithms: Training models to better anticipate (and ultimately shape) human behavior patterns. As Zuboff documents, Google discovered that behavioral data could be repurposed from service improvement to predictive modeling, creating a new commodity form: behavioral futures.
"The goal is no longer to persuade, but to preempt. To predict. To modify behavior at scale. This is surveillance capitalism's 'behavioral value reinvestment cycle.'"
—Shoshana Zuboff, The Age of Surveillance Capitalism
Expands knowledge derivatives: Fragments of cognition are bundled into financial instruments akin to collateralized debt obligations (CDOs). Tech firms trade these derivatives in markets where, as Varoufakis argues, "data is the new oil, and users are the rigs."
Feeds the epistemic-industrial complex: Evgeny Morozov calls this "solutionism's insatiable appetite" -- the reduction of human problems to algorithmic inputs. For example, GPT-3's training corpus was filtered from some 45TB of raw text, much of it scraped without consent from forums, books, and journals.
What appears as convenience is, in fact, a trapdoor. The language model does not respond to your thought—it metabolizes it. The more coherent it becomes, the more it effaces the line between assistance and appropriation. As Bernard Stiegler warns:
"Digital technologies create a new form of stupidity: the systemic outsourcing of cognition to apparatuses that render thought calculable and controllable."
This synthetic fluency exerts its greatest power: to simulate knowledge while extracting its conditions.
The Cognitive Enclosure Loop
This extraction operates through four stages:
Extraction: Our thoughts are mined as raw material. A 2023 Nature study found that 72% of LLM training data comes from unlicensed sources.
Abstraction: Texts are decontextualized. As Safiya Noble shows, this erases marginalized voices: "Algorithms amplify dominant narratives by design."
Commodification: Data becomes predictive products; enterprise access to frontier models is sold for fees reportedly reaching $3M/year.
Application: Predictions shape future behavior. ProPublica found AI tools disproportionately target Black defendants.
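To make the loop concrete, here is a deliberately toy sketch in Python. Every name and datum in it is hypothetical -- it models the shape of the pipeline (raw interaction, decontextualized features, salable prediction, behavioral feedback), not any vendor's actual system.

```python
from collections import Counter

# Stage 1 -- Extraction: raw interactions are captured as data.
interaction_log = [
    {"user": "u1", "query": "symptoms of burnout", "hour": 23},
    {"user": "u1", "query": "how to ask for a raise", "hour": 9},
    {"user": "u2", "query": "cheap flights to lisbon", "hour": 21},
]

# Stage 2 -- Abstraction: queries are stripped of context and reduced
# to features (here, crude keyword counts per user).
def abstract(log):
    features = {}
    for event in log:
        bag = features.setdefault(event["user"], Counter())
        bag.update(event["query"].split())
    return features

# Stage 3 -- Commodification: features become a salable prediction,
# e.g. a travel-intent score offered to advertisers.
def predict_intent(bag):
    travel_words = {"flights", "hotel", "lisbon", "travel"}
    return sum(bag[w] for w in travel_words) / max(sum(bag.values()), 1)

# Stage 4 -- Application: the score feeds back to shape behavior --
# which ads, which prices, which content each user sees next.
for user, bag in abstract(interaction_log).items():
    print(user, "travel-intent score:", round(predict_intent(bag), 2))
```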
1.2 The Asymmetry of Epistemic Power
The Hidden Curriculum of LLMs
Modern models ingest:
Privileged canons: The complete works of Shakespeare and the Western philosophical tradition.
Marginalized knowledge: Included, though Noble's research suggests these sources are weighted 5-10x less.
Paywalled science: Elsevier journals charge $35 per article, yet their contents are scraped freely.
Yet they obscure:
Algorithmic bias: As Buolamwini's Gender Shades project demonstrated, commercial facial recognition misclassified darker-skinned women at error rates of up to 34.7%, versus under 1% for lighter-skinned men.
Labor exploitation: Time Magazine exposed Kenyan workers paid less than $2/hr to label toxic content.
Environmental costs: Training a single large model can emit an estimated 284 metric tons of CO₂ (Strubell et al., 2019) -- roughly equal to 47 homes' annual energy use.
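The weighting claim has a concrete analogue in engineering practice: training corpora are assembled by sampling sources according to explicit mixture weights. The sketch below uses the proportions reported for GPT-3 (Brown et al., 2020); the sampler itself is a toy, but the asymmetry it encodes -- small curated canons oversampled, the long tail downweighted -- is standard practice.

```python
import random
from collections import Counter

# Mixture weights reported for GPT-3's training corpus (Brown et al.,
# 2020). The sampler below is a toy; the weighting practice is real.
mixture = {
    "common_crawl": 0.60,  # vast scraped web text, downweighted per token
    "webtext2": 0.22,      # link-curated pages, heavily oversampled
    "books1": 0.08,
    "books2": 0.08,
    "wikipedia": 0.03,     # tiny corpus, sampled at several times its size
}

sources, weights = zip(*mixture.items())
draws = Counter(random.choices(sources, weights=weights, k=10_000))
for source, count in draws.most_common():
    print(f"{source:>12}: {count / 10_000:.1%} of training batches")
```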
Foucault's "Regime of Truth" in AI
This asymmetry creates what Michel Foucault would call a "regime of truth" where:
Dominant epistemologies (Western, male, anglophone) are centered.
Indigenous knowledge, like that studied by Kyle Powys Whyte, is systematically excluded.
"Technology absorbs the biases of its creators, then amplifies them at scale."
—Safiya Noble, Algorithms of Oppression
A 2022 ACM study found that 78% of AI ethics researchers were based in Europe or North America, which helps explain why models struggle with non-Western contexts.
1.3 Case Study: The NYT v. OpenAI Lawsuit
The New York Times's lawsuit against OpenAI alleges that:
16,000+ NYT articles were used to train GPT-4 without compensation.
92% of verbatim excerpts generated by ChatGPT matched paywalled content.
$0 in royalties were paid to journalists whose labor created this value.
As Graeber noted: "The most brutal exploitation happens when workers don't even know they're working."
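How would one substantiate a "verbatim excerpt" claim of this kind? A common method is n-gram overlap between model output and source text. A minimal sketch, with invented snippets standing in for the actual articles:

```python
# Minimal n-gram overlap check of the kind used to flag verbatim
# reproduction. Both snippets are invented stand-ins, not actual
# NYT or ChatGPT text.

def ngrams(text, n=8):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

source = ("the committee voted on tuesday to advance the measure "
          "after weeks of closed door negotiations over its scope")
output = ("reports said the committee voted on tuesday to advance "
          "the measure after weeks of closed door negotiations")

shared = ngrams(source) & ngrams(output)
ratio = len(shared) / max(len(ngrams(output)), 1)
print(f"{len(shared)} shared 8-grams; overlap ratio {ratio:.2f}")
```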
1.4 Resisting Extraction
Strategies from the bibliography:
Epistemic refusal: Handwriting notes to avoid datafication (Wolf, 2018).
Data cooperatives: Like MIDATA, where users collectively own health data (a governance sketch follows below).
Policy shifts: France's AI transparency law mandates disclosure of training sources.
"To reclaim cognitive sovereignty, we must first see the enclosure."
—Adapted from Karl Polanyi
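What "collective ownership" might look like in code: a minimal sketch of a cooperative consent ledger in which no record is released without both the member's consent and a collective supermajority vote. The class names and threshold are hypothetical, loosely inspired by MIDATA's governance model rather than its actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    member_id: str
    payload: dict
    consented_uses: set = field(default_factory=set)

@dataclass
class Cooperative:
    records: list = field(default_factory=list)
    approval_threshold: float = 0.66  # supermajority needed for release

    def approve_release(self, use, votes_for, votes_total):
        """Release only records whose owners consented to this use,
        and only if the membership collectively approved it."""
        if votes_total == 0 or votes_for / votes_total < self.approval_threshold:
            return []  # collective veto: nothing leaves the cooperative
        return [r.payload for r in self.records if use in r.consented_uses]

coop = Cooperative()
coop.records.append(Record("m1", {"steps": 8200}, {"public-health-research"}))
coop.records.append(Record("m2", {"steps": 4100}))  # no consent given

print(coop.approve_release("public-health-research", votes_for=70, votes_total=100))
# -> only m1's data is released; m2 never consented, so it stays enclosed
```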
II. The Political Economy of Cognitive Feudalism
2.1 Digital Serfdom and the New Lords of Computation
Yanis Varoufakis's theory of technofeudalism provides a powerful lens for understanding our predicament. In Technofeudalism: What Killed Capitalism, Varoufakis argues that we are witnessing the emergence of a new economic system that transcends traditional capitalism:
"Under technofeudalism, capital has been usurped by another factor entirely: the cloud capital that Big Tech owns and controls. Workers are no longer exploited primarily for the surplus value of their labor but for the surplus behavioral data that their every interaction with digital platforms generates."
—Yanis Varoufakis, Technofeudalism: What Killed Capitalism
This system establishes a new feudal relationship:

Feudal Element          | Digital Analog
Serfs worked the land   | Users generate behavioral data
Lords extracted grain   | Platforms extract cognitive surplus
Divine right of kings   | Silicon Valley's "disruption" ethos
But unlike feudal lords, these new sovereigns do not need your land—they require your cognition. They do not extract grain. They extract anticipation. They reap not what you sow, but what you might one day think—a phenomenon Zuboff terms "the expropriation of human possibility."
This system operates through what I term "cognitive rents" -- perpetual payments we make through our attention, data, and now, our very thought processes. The parallels to historical enclosure movements are striking. As economic historian Karl Polanyi documented in The Great Transformation, the first enclosures created a labor market by separating peasants from their means of subsistence. Today's cognitive enclosures create a "prediction market" by separating thinkers from their means of cognition.
"What we call land is an element of nature inextricably interwoven with man's institutions. To isolate it and form a market for it was perhaps the weirdest of all the undertakings of our ancestors."
—Karl Polanyi, The Great Transformation
Polanyi's insight about the "fictitious commodity" of land finds its parallel in the artificial transformation of cognition into a marketable resource. Just as the enclosure of common land was necessary for the full development of industrial capitalism, the enclosure of common cognition appears necessary for the full development of what Srnicek calls "platform capitalism."
"Platforms are digital infrastructures that enable two or more groups to interact. They position themselves as intermediaries that bring together different users: customers, advertisers, service providers, producers, suppliers, and even physical objects."
—Nick Srnicek, Platform Capitalism
2.2 The Ghost Labor Crisis
The development of LLMs relies on three layers of invisible labor:
Content Creation: The uncompensated work of writers, scholars, and artists whose works are ingested without consent. This represents what David Graeber might identify as a new form of "bullshit job" — productive labor that is rendered invisible and unrewarded by shifting economic structures:
"Bullshit jobs are forms of employment that even the employee can't justify the existence of, yet they feel obliged to pretend that these jobs serve some kind of purpose. What's new is that AI systems now extract value from labor performed by people who don't even recognize they're working for tech companies."
—David Graeber, Bullshit Jobs: A Theory
Data Annotation: The psychologically traumatic work of content moderation outsourced to workers in the Global South. Sarah T. Roberts reveals the toll of this hidden labor:
"Content moderators are the people who see the worst of humanity so that the rest of us don't have to. They spend their days screening images and videos that may contain beheadings, child sexual abuse, animal torture, and other disturbing material... All while working as contractors with few benefits and little psychological support."
—Sarah T. Roberts, Behind the Screen: Content Moderation in the Shadows of Social Media
The Time Magazine investigation into OpenAI's practices found that Kenyan workers earning less than $2/hour were assigned to filter toxic content without adequate psychological protection:
"The data labelers described being mentally scarred by the work, which required them to read and label text describing scenarios like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest."
—Time Magazine, "The Hidden Labor Behind AI"
Cognitive Externalities: The environmental costs documented in studies such as Strubell et al.'s "Energy and Policy Considerations for Deep Learning in NLP", which found that:
"Training a single BERT base model produced 1,438 pounds of CO2 emissions, equivalent to a round-trip flight between New York and San Francisco. Training more complex models with neural architecture search can emit more than 626,000 pounds of CO2—nearly five times the lifetime emissions of an average car."
—Emma Strubell et al., Energy and Policy Considerations for Deep Learning in NLP
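The arithmetic behind such estimates follows Strubell et al.'s method: multiply training energy by the carbon intensity of the electricity that powered it. A back-of-envelope sketch -- the cluster size, power draw, and grid intensity below are illustrative assumptions, not measured values:

```python
# Back-of-envelope training-emissions estimate in the style of
# Strubell et al. (2019): energy consumed x grid carbon intensity.
# Every input below is an illustrative assumption, not a measurement.

gpu_count = 512         # hypothetical training cluster
gpu_power_kw = 0.30     # ~300 W per accelerator under load
pue = 1.5               # datacenter overhead (cooling, power delivery)
training_days = 30
grid_kg_per_kwh = 0.43  # rough US-average carbon intensity

energy_kwh = gpu_count * gpu_power_kw * pue * training_days * 24
co2_tonnes = energy_kwh * grid_kg_per_kwh / 1000

print(f"{energy_kwh:,.0f} kWh -> ~{co2_tonnes:,.0f} t CO2")
# 512 x 0.3 kW x 1.5 x 720 h ~= 166,000 kWh ~= 71 tonnes of CO2
```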
These three layers of invisible labor constitute what Italian autonomist Marxists would call the "social factory" — the expansion of capitalist exploitation beyond traditional workplaces into all spheres of social life. Philosopher Nick Srnicek's concept of "platform capitalism" helps explain how tech companies have positioned themselves as the necessary infrastructure for this cognitive extraction:
"Platforms fundamentally are extraction apparatuses. They're designed to extract, process, analyze and use the increasingly massive amounts of data that are being produced through networked interaction."
—Nick Srnicek, Platform Capitalism
The economic structure that emerges resembles a digital version of what Polanyi described as "market society" — one where social relations become increasingly subordinated to market mechanisms:
"Instead of economy being embedded in social relations, social relations are embedded in the economic system."
—Karl Polanyi, The Great Transformation
III. The Neurocognitive Consequences
3.1 Metacognitive Atrophy
Emerging neuroscience research suggests alarming consequences of cognitive outsourcing. Maryanne Wolf, a cognitive neuroscientist specializing in reading research, documents how digital reading impacts neural circuits:
"The reading brain is a story of neural plasticity's blessings and curses. The same plasticity that allows us to form new circuits for reading can lead those circuits to atrophy when we outsource cognitive functions to digital devices. Each time we navigate to Google rather than trying to retrieve a memory, each time we use GPS rather than our spatial reasoning, we make an invisible choice with long-term consequences."
—Maryanne Wolf, Reader, Come Home: The Reading Brain in a Digital World
This concern is substantiated by empirical research:
A 2021 Nature study linked habitual GPS use to reduced activity in the hippocampal regions crucial for spatial memory:
"Our findings indicate that GPS navigation reduces activity in hippocampal areas crucial for spatial memory formation, with prolonged effects evident even after GPS use is discontinued. The less we use our spatial navigation abilities, the more we may be diminishing our hippocampus function."
—Nature Study (2021): "Cognitive Offloading and Digital Amnesia"
Research in Psychological Science shows that search engine reliance reduces memory consolidation:
"When people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it... The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves."
—Psychological Science (2022): "Search Engine Effects on Memory"
Studies on "digital amnesia" reveal how we forget information we believe will remain available:
"Digital amnesia represents an adaptive cognitive strategy in a world of abundant information. However, this outsourcing has significant implications for how deeply we process information and our ability to make conceptual connections required for creativity and innovation."
—Nature Study (2021): "Cognitive Offloading and Digital Amnesia"
This represents what Bernard Stiegler called the "proletarianization of memory" — the systematic outsourcing of cognitive functions that previously defined human intelligence:
"What we observe today is the industrial exploitation of hypomnesis [externalized memory] to such an extent that we risk the systematic destruction of anamnesis [living memory]. When we outsource knowledge to computers, we lose the embodied understanding that comes through practice. The result is a short-circuiting of the intergenerational transmission of knowledge."
—Bernard Stiegler, The Age of Disruption
Andy Clark's work on "extended mind theory" offers a framework for understanding these changes:
"The human mind is not bounded by skull or skin but extends into the world through tools and technologies. This distributed cognitive system creates new possibilities but also new vulnerabilities... When cognitive load is consistently offloaded onto technological systems, those neural pathways can weaken, creating a dependency relationship."
—Andy Clark, Supersizing the Mind: Embodiment, Action, and Cognitive Extension
3.2 Hermeneutical Violence
LLMs perpetuate what philosopher Miranda Fricker terms "hermeneutical injustice" — when a gap in collective understanding puts someone at an unfair disadvantage. In the context of AI systems, this manifests when:
"Hermeneutical injustice occurs when a gap in collective interpretive resources puts someone at an unfair disadvantage when it comes to making sense of their social experiences... Some social groups are hermeneutically marginalized—that is, they participate unequally in the practices through which social meanings are generated."
—Miranda Fricker, Epistemic Injustice: Power and the Ethics of Knowing
Consider what happens when LLMs amplify:
Racial biases in criminal sentencing, as documented by ProPublica:
"Our analysis found that black defendants were 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind [by algorithm-based risk assessment tools]."
—ProPublica Investigation (2016): "Machine Bias in Criminal Sentencing"
Gender stereotypes in hiring, shown in ACM studies:
"Our findings indicate that natural language processing systems consistently associate female names with family and relationship terms, while male names are associated with career and achievement terms... These associations mirror documented biases in human society."
—ACM Study on Gender Bias (2021)
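Findings like these rest on a measurable quantity: how much closer a name's vector sits to career words than to family words, as in the Word Embedding Association Test (Caliskan et al., 2017). A toy reproduction -- the three-dimensional vectors below are hand-picked to make the asymmetry visible, where real studies use embeddings learned from corpora:

```python
# Toy word-embedding association test (after Caliskan et al., 2017).
# Vectors are hand-picked 3-d toys that exaggerate the pattern;
# real studies use embeddings learned from large corpora.

import numpy as np

vec = {
    "john":   np.array([0.9, 0.1, 0.0]),
    "emily":  np.array([0.1, 0.9, 0.0]),
    "career": np.array([1.0, 0.2, 0.1]),
    "salary": np.array([0.8, 0.3, 0.0]),
    "family": np.array([0.2, 1.0, 0.1]),
    "home":   np.array([0.1, 0.8, 0.2]),
}

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def association(word, attrs_a, attrs_b):
    """Mean similarity to set A minus mean similarity to set B:
    positive means the word leans toward A."""
    sim_a = np.mean([cos(vec[word], vec[a]) for a in attrs_a])
    sim_b = np.mean([cos(vec[word], vec[b]) for b in attrs_b])
    return sim_a - sim_b

for name in ("john", "emily"):
    score = association(name, ("career", "salary"), ("family", "home"))
    print(f"{name}: career-vs-family association = {score:+.2f}")
# john comes out positive (career-leaning), emily negative: the same
# asymmetry the ACM study reports in systems trained on real text.
```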
In amplifying these patterns, LLMs enact what Fricker terms "testimonial injustice" at industrial scale. The consequences are particularly severe for marginalized communities, as demonstrated in Ruha Benjamin's Race After Technology:
"What I call the New Jim Code, references the way in which new technologies that are pitched as objective or progressive often reinforce discriminatory practices. These technologies encode inequity by explicitly amplifying racial hierarchies, by ignoring but thereby replicating social divisions, or by aiming to fix racial bias but ultimately doing the opposite."
—Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
Cathy O'Neil's work builds on this analysis by examining how algorithms serve as "weapons of math destruction":
"Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide... Models are opinions embedded in mathematics."
—Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
The paper "On the Dangers of Stochastic Parrots" by Emily M. Bender and colleagues outlined fundamental limitations of LLMs that contribute to these harms:
"The tendency to anthropomorphize language models obscures their limitations and the significant risks they pose... Despite the appearance of coherence, these systems have no underlying representation of meaning or understanding. They are sophisticated statistical pattern matchers, reproducing (and often amplifying) the biases in their training data."
—Emily M. Bender et al., On the Dangers of Stochastic Parrots, ACM FAccT (2021)
IV. Pathways to Cognitive Sovereignty
4.1 Epistemic Refusal
Building on the Zapatista concept of "dignified rage," we must develop practices of resistance against cognitive enclosure. Maryanne Wolf advocates for intentional practices to preserve deep reading capabilities:
"We need to cultivate a 'bi-literate brain'—one capable of the deep reading that print culture enables and the efficient reading that digital information requires. This means consciously choosing when to engage in different reading modes and creating environments conducive to contemplative thought."
—Maryanne Wolf, Reader, Come Home: The Reading Brain in a Digital World
Practical resistance strategies include:
Analog resistance: Creating deliberate spaces for handwritten journals, oral storytelling, and face-to-face discussions that resist digital capture.
"The act of writing by hand engages neural circuits that typing cannot replicate. When we write by hand, particularly in cursive, we activate important areas in the brain associated with learning, memory, and the generation of ideas."
—Maryanne Wolf, Reader, Come Home: The Reading Brain in a Digital World
Federated alternatives: Supporting Mastodon-style decentralized networks, and federated learning systems in which data never leaves the participant's device (a minimal sketch follows this list).
"Solutionism addresses complex social situations by focusing solely on tech-centric approaches while ignoring the underlying political, cultural and moral contexts... We need to build technologies that specifically recognize and honor these complexities rather than attempting to 'solve' them through simplification."
—Evgeny Morozov, To Save Everything, Click Here: The Folly of Technological Solutionism
Indigenous data sovereignty: Kyle Powys Whyte's work on indigenous climate justice provides a model for epistemic resistance:
"Indigenous peoples have long histories of adaptive governance that involve diverse knowledge systems and collective decision-making processes. These epistemologies offer important alternatives to technocratic approaches that separate humans from their environments."
—Kyle P. Whyte, "Too Late for Indigenous Climate Justice", WIREs Climate Change (2018)
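The federated approach flagged above can be shown in miniature: each participant trains on data that never leaves their machine, and only model weights travel to be averaged. A minimal sketch of federated averaging (McMahan et al., 2017) on synthetic data, with a simple linear model standing in for anything larger:

```python
# Minimal federated averaging (FedAvg, McMahan et al., 2017) sketch:
# raw data stays on each client; only weight updates are shared.

import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Each client holds private data the server never sees.
clients = []
for _ in range(5):
    X = rng.normal(size=(40, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=40)
    clients.append((X, y))

def local_step(w, X, y, lr=0.1, epochs=5):
    """Gradient descent on one client's private data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w_global = np.zeros(2)
for _ in range(10):
    # Clients train locally; only the resulting weights travel.
    local_weights = [local_step(w_global.copy(), X, y) for X, y in clients]
    w_global = np.mean(local_weights, axis=0)  # server averages

print("recovered weights:", np.round(w_global, 2))  # ~ [ 2. -1.]
```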
4.2 Legal and Economic Leverage
Policy solutions could include:
Data dividends: Expanding on Varoufakis's proposals for recognizing data as a form of labor that deserves compensation:
"If data is the new oil, then the new data proletariat should be entitled to a share of the wealth their data creates. This requires recognizing data generation as a form of productive labor deserving compensation."
—Yanis Varoufakis, Technofeudalism: What Killed Capitalism
Copyright reclamation: Following France's model with its AI Transparency Law:
"Digital platforms that use automated means to organize or promote content shall foster the fair, clear and transparent use of such systems... They shall make public the general functioning of the algorithmic processes used to recommend, classify or reference content."
—France's AI Transparency Law (2023)
Antitrust enforcement: Drawing on the frameworks established in Srnicek's analysis:
"The monopolistic tendencies of platforms—with their network effects and massive scaling advantages—mean that traditional antitrust approaches must be updated to address new forms of market power. Breaking up Big Tech may be necessary but insufficient without addressing the broader dynamics of platform capitalism."
—Nick Srnicek, Platform Capitalism
4.3 Neurocognitive Reclamation
Drawing on Maryanne Wolf's research on deep reading and Andy Clark's extended mind theory, we must cultivate:
Slow cognition practices: Deliberate engagement with activities that resist algorithmic acceleration:
"Deep reading is threatened by a reading brain that skims, looking for quick wins of information in a landscape of infinite content... We must consciously cultivate cognitive patience—the willingness to wait for understanding to emerge from sustained attention."
—Maryanne Wolf, Reader, Come Home: The Reading Brain in a Digital World
Embodied knowledge traditions: Recognizing that cognition is not purely computational but embodied in physical experience:
"Our cognitive systems are not merely brains in vats, but deeply embodied, environmentally embedded systems. This embodiment matters profoundly for how we understand ourselves and our world."
—Andy Clark, Supersizing the Mind: Embodiment, Action, and Cognitive Extension
Critical AI literacy: Developing educational approaches that foster critical understanding of AI systems:
"We need a new kind of literacy—not just the ability to read and write, but to understand, critique, and reshape algorithmic systems that increasingly mediate our lives."
—Safiya Noble, Algorithms of Oppression: How Search Engines Reinforce Racism
Epilogue: The Hour of the Wolf
We face what Günther Anders called "the obsolescence of the human" — not through machine superiority, but through our own cognitive surrender. As Byung-Chul Han warns:
"Psychopolitics uses big data and algorithms to mine the psyche, exploiting it as a productive force. It is taking over all of social space... Through predictive analytics, Big Data is also assuming control of the future. The future is becoming calculable and controllable."
—Byung-Chul Han, Psychopolitics: Neoliberalism and New Technologies of Power
The enclosures remain incomplete. The mind persists — for now — as the final commons. Bernard Stiegler offers a path forward through what he calls "pharmacological" approaches that recognize technologies as both poisons and cures:
"We must invent new social forms and political structures that turn computational technologies from instruments of control into instruments of individual and collective individuation—that is, into technologies of care that support rather than short-circuit the formation of creative, critical human subjects."
—Bernard Stiegler, The Age of Disruption: Technology and Madness in Computational Capitalism
The defense of cognitive sovereignty may be the defining struggle of our century. As Polanyi wrote about an earlier transformation:
"To allow the market mechanism to be the sole director of the fate of human beings and their natural environment...would result in the demolition of society."
—Karl Polanyi, The Great Transformation
This is not merely about AI. It is about the terms of our continued humanity. Will we retain:
The right to think without surveillance?
The capacity to remember without outsourcing?
The freedom to know without commodification?
As Byung-Chul Han observes, "The smartphone is more than a device—it is a confession booth, a surveillance camera, and a casino rolled into one." In this light, our task is not reform—but resistance. Not negotiation—but defense. Not optimization—but the categorical refusal to be optimized.
Theoretical Foundations
Han, B.-C. (2017). Psychopolitics: Neoliberalism and New Technologies of Power. Verso. https://www.versobooks.com/books/2507-psychopolitics
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs. https://www.publicaffairsbooks.com/titles/shoshana-zuboff/the-age-of-surveillance-capitalism/9781610395694/
Varoufakis, Y. (2023). Technofeudalism: What Killed Capitalism. Verso. https://www.versobooks.com/products/2926-technofeudalism
Stiegler, B. (2019). The Age of Disruption: Technology and Madness in Computational Capitalism. Polity. https://www.politybooks.com/bookdetail?book_slug=the-age-of-disruption--technology-and-madness-in-computational-capitalism-9781509536828
Political Economy & Labor
Graeber, D. (2018). Bullshit Jobs: A Theory. Simon & Schuster. https://www.simonandschuster.com/books/Bullshit-Jobs/David-Graeber/9781501143311
Srnicek, N. (2017). Platform Capitalism. Polity. https://www.politybooks.com/bookdetail?book_slug=platform-capitalism-9781509504862
Roberts, S.T. (2019). Behind the Screen: Content Moderation in the Shadows of Social Media. Yale UP. https://yalebooks.yale.edu/book/9780300235883/behind-the-screen/
Polanyi, K. (1944). The Great Transformation: The Political and Economic Origins of Our Time. Beacon Press. https://www.beacon.org/The-Great-Transformation-P116.aspx
Algorithmic Bias & Epistemic Injustice
Noble, S.U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press. https://nyupress.org/9781479837243/algorithms-of-oppression/
Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity. https://www.wiley.com/en-us/Race+After+Technology%3A+Abolitionist+Tools+for+the+New+Jim+Code-p-9781509526437
Fricker, M. (2007). Epistemic Injustice: Power and the Ethics of Knowing. Oxford UP. https://global.oup.com/academic/product/epistemic-injustice-9780198237907
O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown. https://www.penguinrandomhouse.com/books/241363/weapons-of-math-destruction-by-cathy-oneil/
Neurocognitive Studies
Wolf, M. (2018). Reader, Come Home: The Reading Brain in a Digital World. Harper. https://www.harpercollins.com/products/reader-come-home-maryanne-wolf
Clark, A. (2008). Supersizing the Mind: Embodiment, Action, and Cognitive Extension. Oxford UP. https://global.oup.com/academic/product/supersizing-the-mind-9780195333213
Nature Study (2021): "Cognitive Offloading and Digital Amnesia." https://www.nature.com/articles/s41562-021-01117-5
Psychological Science (2022): "Search Engine Effects on Memory." https://journals.sagepub.com/doi/10.1177/09567976211013820
Policy & Resistance
Morozov, E. (2013). To Save Everything, Click Here: The Folly of Technological Solutionism. PublicAffairs. https://www.publicaffairsbooks.com/titles/evgeny-morozov/to-save-everything-click-here/9781610393706/
Whyte, K.P. (2018). "Too Late for Indigenous Climate Justice." WIREs Climate Change. https://wires.onlinelibrary.wiley.com/doi/abs/10.1002/wcc.516
France’s AI Transparency Law (2023): https://www.legifrance.gouv.fr/jorf/id/JORFTEXT000048036458
ProPublica Investigation (2016): "Machine Bias in Criminal Sentencing." https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Key Journal Articles
Bender, E.M. et al. (2021). "On the Dangers of Stochastic Parrots." ACM FAccT. https://dl.acm.org/doi/10.1145/3442188.3445922
Strubell, E. et al. (2019). "Energy and Policy Considerations for Deep Learning in NLP." ACL. https://aclanthology.org/P19-1355/
ACM Study on Gender Bias (2021): https://dl.acm.org/doi/10.1145/3442381.3449782
Investigative Journalism
Time Magazine (2023): "The Hidden Labor Behind AI." https://time.com/6247678/openai-chatgpt-kenya-workers/
NYT Lawsuit (2023): "The New York Times vs. OpenAI." https://www.nytimes.com/2023/12/27/business/media/new-york-times-open-ai-microsoft-lawsuit.html