
Surveillance Capitalism

Behavioral Prediction, Data Extraction, and the Commodification of Experience

Critical Theory Wiki Contributors

Introduction

Surveillance capitalism is an economic formation in which private capital unilaterally claims human experience as free raw material for translation into behavioral data. This data is then analyzed, packaged, and sold as “prediction products” to businesses seeking to anticipate and influence human behavior. The concept, introduced by Shoshana Zuboff in a 2015 article and developed at length in her 2019 book The Age of Surveillance Capitalism, describes not simply increased surveillance under capitalism but a fundamentally new mutation of capitalism itself.

Where industrial capitalism commodified labor and nature, surveillance capitalism commodifies lived experience and personal behavior. Where traditional capitalism sold products to customers, surveillance capitalism’s actual customers are businesses buying predictions about user behavior—users themselves are not customers but sources of raw material. Where earlier capitalisms faced resistance from organized labor, surveillance capitalism operates largely unopposed, having emerged in regulatory voids and cloaked itself in technological inevitability.

Surveillance capitalism was pioneered by Google, Facebook, Amazon, Microsoft, and other tech giants, but extends far beyond them into insurance, healthcare, retail, finance, and, increasingly, every other sector. It represents what Zuboff calls a “rogue mutation” of capitalism that threatens democracy, autonomy, and human freedom through unprecedented asymmetries of knowledge and power. Understanding surveillance capitalism is essential for grasping how contemporary capitalism accumulates value, exercises power, and shapes subjectivity in the 21st century.

Key Figures

Related Thinkers:

  • Shoshana Zuboff (1951-present) - Foundational theorist, The Age of Surveillance Capitalism (2019)
  • Michel Foucault (1926-1984) - Biopolitics, disciplinary power
  • Nick Srnicek (1982-present) - Platform Capitalism (2016)
  • Karl Marx (1818-1883) - Alienation, commodity fetishism
  • Evgeny Morozov (1984-present) - Tech solutionism critique

📖 Essential Reading: Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (PublicAffairs, 2019)

Historical Emergence

The Dot-Com Crash and Google’s Invention (2000-2003)

Surveillance capitalism was invented by Google in response to investor pressure following the dot-com crash of 2000-2001. Google had revolutionary search technology (the PageRank algorithm) but struggled to monetize it. Initial plans—licensing to corporations, basic keyword advertising—generated insufficient revenue. Investors demanded profitability or threatened to withdraw funding.

Between 2001 and 2003, Google engineers discovered that the vast quantities of user data they accumulated—searches, click patterns, locations, behaviors—could be analyzed to predict which ads users would click. This “behavioral surplus”—data beyond what’s needed to improve the service—became valuable raw material. By analyzing surplus, Google could offer advertisers unprecedented targeting: ads shown to specific users at the specific moments when they’re most likely to click.

This innovation was revolutionary. Previously, advertising was somewhat indiscriminate—you advertised to demographics or contexts hoping to reach interested customers. Google offered pinpoint behavioral targeting based on actual user data. Click-through rates soared; advertisers paid premiums; Google’s revenue exploded. The company discovered a new vein of profit: surveillance.

Crucially, this occurred without users’ knowledge or consent. Google unilaterally decided that data generated by users’ interactions would be appropriated as a corporate asset. Users thought they were searching; Google realized they were also producing raw material for a new economic process.

Facebook’s Expansion (2006-2012)

Facebook, founded in 2004, initially had no settled business model. Mark Zuckerberg resisted advertising, fearing it would alienate users. But as Facebook scaled to hundreds of millions of users, investor pressure mounted. Facebook couldn’t simply copy Google’s search advertising—social networking required a different approach.

Facebook’s innovation was social surveillance. By analyzing not just individual behaviors but social networks—who you friend, what you like, what friends post, how you interact—Facebook could make even more precise behavioral predictions. Your behavior could be predicted not just from your own data but from your network’s patterns. Facebook monetized social life itself.

The “Like” button (2009) and “Open Graph” protocol (2010) extended Facebook surveillance across the internet. Like buttons on external websites tracked users even when they weren’t on Facebook. Third-party apps accessed users’ and their friends’ data. Surveillance expanded from Facebook’s platform to much of the web.

Facebook’s mobile apps, especially after the company’s 2012 shift to mobile-first, added location tracking, creating continuous surveillance of users’ physical movements. Combined with behavioral data, this enabled extraordinarily valuable behavioral prediction—not just what users think or feel but where they go, what they do offline, and how digital information translates into physical behavior.

The Surveillance Capitalism Ecosystem (2010s)

By the mid-2010s, surveillance capitalism had become the dominant business model for digital platforms:

  • Amazon: Tracks purchases, browsing, voice commands (Alexa), reading (Kindle), viewing (Prime Video), and increasingly physical retail (Whole Foods, Amazon Go)
  • Microsoft: Enterprise surveillance through Office 365, LinkedIn, Windows telemetry
  • Apple: Initially positioned as a privacy alternative but increasingly surveils through the App Store, Apple Pay, Health, and location tracking
  • Twitter: Behavioral surveillance optimized for real-time prediction
  • TikTok: Algorithmic surveillance of attention patterns

Beyond tech giants, surveillance capitalism spread:

  • Insurance: Telematics tracking driving; wearables monitoring health; IoT devices surveilling homes
  • Retail: Facial recognition, behavioral tracking, purchase prediction
  • Finance: Credit scoring based on behavioral data; algorithmic trading
  • Healthcare: Patient monitoring, prediction of medical compliance
  • Smart cities: Ubiquitous sensors tracking movement, behavior, interactions

The 2016 Election and Cambridge Analytica Scandal

Facebook’s role in the 2016 U.S. election and the Brexit referendum revealed surveillance capitalism’s political dangers. Cambridge Analytica, a political consulting firm, harvested data from up to 87 million Facebook users without their consent, using it for targeted political advertising and attempted psychological manipulation.

The scandal exposed that surveillance infrastructure built for selling products could easily be weaponized for political manipulation. The same systems predicting consumer behavior could predict political preferences and target individualized propaganda. Behavioral surplus—initially appropriated for commercial purposes—proved valuable for authoritarian politics.

This was not an aberration but a logical extension. Surveillance capitalism’s business model is predicting and influencing behavior. Whether the desired behavior is buying products or voting certain ways differs only in application. The infrastructure is inherently dual-use: commercial and political control.

GDPR and Regulatory Response (2018-Present)

Growing awareness of surveillance capitalism’s risks generated regulatory responses. The EU’s General Data Protection Regulation (GDPR, in force since 2018) established:

  • Right to access personal data
  • Right to delete personal data
  • Right to object to processing
  • Requirements for clear consent
  • Heavy penalties for violations

California followed with the California Consumer Privacy Act (CCPA, 2018), strengthened by the California Privacy Rights Act in 2020. Other jurisdictions enacted similar legislation. Yet surveillance capitalists adapted, using consent theater (dense privacy policies, deceptive interfaces) to maintain data extraction while claiming compliance.

More fundamentally, individual consent frameworks don’t address structural asymmetries. Even if users could meaningfully consent (most can’t—services are too essential, terms too complex), individual action can’t challenge systemic power relations. Surveillance capitalism requires collective political response, not just individual consumer choice.

COVID-19 and Surveillance Expansion (2020-2023)

The pandemic dramatically expanded surveillance capitalism. Contact tracing apps, vaccine passports, remote monitoring, and work-from-home surveillance normalized unprecedented data collection. Public health emergencies justified surveillance that would previously have provoked resistance.

Zoom, Microsoft Teams, Slack, and other collaboration platforms introduced workplace surveillance—monitoring productivity, attention, and behavior during remote work. Students faced proctoring software monitoring them during remote exams. Delivery workers’ movements were tracked constantly to verify pandemic safety protocols.

This revealed surveillance capitalism’s adaptability. Crisis provides opportunities for expansion that become normalized. “Temporary” emergency measures become permanent infrastructure. Public health justifications provide cover for commercial and political surveillance.

Core Mechanisms

Behavioral Surplus Extraction

Behavioral surplus is data generated by users beyond what’s needed to provide services. When you search Google, some data is needed to return results. But Google also captures:

  • Your location
  • Previous searches
  • Time spent on results
  • Pages you visit afterward
  • Connected devices
  • Voice patterns (if using voice search)
  • And thousands more data points

This surplus has no service value for you—you’d get the same search results without Google capturing it. But it has immense value for prediction. Aggregated across billions of users over years, behavioral surplus becomes raw material for machine learning systems predicting future behavior with uncanny accuracy.

Extraction is automatic, continuous, and asymmetric. You can’t opt out without abandoning essential services. You typically don’t know what’s captured, how it’s used, or who accesses it. The relationship is fundamentally unequal—total visibility of you, total opacity of the surveillance apparatus.
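
To make the split concrete, here is a schematic sketch (hypothetical field names and values, not any platform’s real logging schema) of a single search event divided into the data needed to serve the user and the surplus captured alongside it:

    # Hypothetical sketch: one search event, split into service-necessary
    # data and behavioral surplus. All field names are illustrative only.
    search_event = {
        # Needed to answer the query:
        "query": "best running shoes",
        "language": "en",
        # Behavioral surplus -- no service value, high prediction value:
        "precise_location": (40.7128, -74.0060),
        "prior_queries": ["knee pain", "marathon training plan"],
        "dwell_time_ms": 14250,                   # time spent on results
        "click_trail": ["result_3", "back", "result_1"],
        "linked_devices": ["phone-a", "laptop-b", "smart-tv-c"],
        "local_time": "02:13",                    # late-night use signals state of mind
    }

    service_fields = {"query", "language"}
    surplus_fields = set(search_event) - service_fields   # everything else

The point of the sketch is the asymmetry: the service-necessary slice is tiny and fixed, while the surplus fields are open-ended and invisible to the user.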

The Prediction Imperative

Surveillance capitalism’s economic imperative is improving behavioral prediction. More data enables better predictions; better predictions command higher prices from customers (advertisers, insurers, etc.) seeking to influence behavior.

This creates relentless pressure to:

  1. Capture more data: Expand surveillance to new domains (homes via IoT, bodies via wearables, thoughts via interfaces)
  2. Analyze more deeply: More sophisticated machine learning, more granular behavioral patterns
  3. Intervene more directly: Move from predicting behavior to modifying it (what Zuboff calls “actuating”)

Traditional capitalism’s imperative is profit through production or trade. Surveillance capitalism’s imperative is certainty through prediction and control. The goal isn’t making better products but making better predictions about—and eventually control over—people.

Prediction Products and Behavioral Futures Markets

Surveillance capitalists sell “prediction products”—forecasts about what users will do, think, feel, or want. These are sold in behavioral futures markets—commodified human futures.

Examples:

  • Google sells predictions about which ads users will click
  • Facebook sells predictions about users likely to buy certain products
  • Insurance companies buy predictions about who will have accidents or health problems
  • Political consultants buy predictions about persuadable voters
  • Employers buy predictions about employee behavior

These markets are opaque. Users don’t know predictions are being made about them, what predictions say, who buys them, or how they’re used. Unlike traditional commodity markets, suppliers (users) receive no compensation, have no rights, and often don’t know they’re supplying raw material.

Economies of Action

Zuboff distinguishes surveillance capitalism’s “economies of action” from traditional “economies of scale.” Rather than reducing production costs through volume, surveillance capitalism seeks to reduce uncertainty about behavior.

Every data point slightly reduces uncertainty. Billions of data points across millions of people create probabilistic knowledge enabling behavior prediction and modification. The more comprehensive the surveillance, the less uncertain behavior becomes. Perfect surveillance approaches perfect prediction—and perfect control.
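
One way to gloss this in information-theoretic terms (our formalization, not Zuboff’s): let B be a future behavior and D_1, …, D_n the captured data points. Conditioning on an additional observation can never increase the entropy of B:

    H(B \mid D_1, \ldots, D_n, D_{n+1}) \le H(B \mid D_1, \ldots, D_n)

Each data point weakly reduces residual uncertainty about behavior; “perfect prediction” is the limiting case in which the conditional entropy approaches zero.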

This drives expansion beyond digital platforms into the physical world. Smart speakers listen in homes; wearables monitor bodies; smart cities track movement; vehicles report to manufacturers and insurers. The goal is comprehensive surveillance of lived experience, making all of life legible to capital.

Instrumentarianism and Behavioral Modification

Surveillance capitalism’s terminal stage is what Zuboff calls instrumentarianism—not just predicting behavior but modifying it. Prediction shades into control: knowing what you will do next is a short step from making you do it.

This occurs through:

  • Subtle nudges: Interface design steering behaviors (dark patterns, defaults, timing)
  • Algorithmic curation: Controlling information flows to shape perceptions
  • Gamification: Reward systems training desired behaviors
  • Social proof: Showing what “everyone else” is doing to induce conformity
  • Dynamic pricing: Personalized prices exploiting individual willingness to pay

Facebook’s notorious emotional contagion experiment (conducted in 2012, published in 2014) demonstrated this. Without the knowledge or consent of nearly 700,000 users, Facebook manipulated news feeds to show more positive or negative content, measurably shifting the emotional tone of users’ own subsequent posts. This proved behavioral modification at scale was possible—and that platforms would attempt it.

Contemporary Manifestations

Personalized Surveillance Advertising

Surveillance advertising is more sophisticated and invasive than ever. Systems track users across devices, websites, and physical locations. They analyze not just what you click but how you move your mouse, how long you pause, and, in patented and experimental systems, facial expressions (via webcams), voice patterns, and biometric data.

Real-time bidding systems instantly auction access to individual users—when you load a webpage, automated systems bid for your attention based on predictions about you. This happens hundreds of times daily without your knowledge. You are constantly being evaluated, priced, and auctioned.
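
A toy model of such an auction is sketched below (simplified second-price logic; the bidder names, click-through predictions, and prices are all invented, and real exchanges, such as those implementing the IAB’s OpenRTB protocol, are far more elaborate):

    # Toy real-time bidding auction: bidders price a single ad impression
    # based on their predicted probability that this user will click.
    # Second-price rule: the winner pays the runner-up's bid.
    def run_auction(user_features, bidders):
        bids = []
        for name, predict_ctr, value_per_click in bidders:
            ctr = predict_ctr(user_features)       # behavioral prediction
            bids.append((ctr * value_per_click, name))
        bids.sort(reverse=True)
        (win_bid, winner), (second_bid, _) = bids[0], bids[1]
        return winner, second_bid                  # winner pays second price

    bidders = [
        ("shoe_retailer", lambda u: 0.08 if "running" in u["interests"] else 0.01, 2.50),
        ("insurance_co",  lambda u: 0.03, 4.00),
        ("streaming_app", lambda u: 0.05 if u["hour"] > 21 else 0.02, 1.80),
    ]

    user = {"interests": ["running", "travel"], "hour": 23}
    winner, price = run_auction(user, bidders)
    print(winner, round(price, 3))   # the user is priced and sold in milliseconds

Even this toy version shows the structure: the commodity being auctioned is not ad space but a prediction about a specific person at a specific moment.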

“Free” services aren’t free—you pay with comprehensive surveillance of your life. The transaction is deliberately obscured through complexity, speed, and legal terms no one reads. The appearance of free choice masks fundamentally coercive extraction.

Algorithmic Management and Worker Surveillance

Surveillance capitalism extends into labor. Platforms like Uber, Amazon, and DoorDash use algorithmic management—automated systems monitoring, evaluating, and disciplining workers based on continuous data capture.

Amazon warehouse workers are tracked by systems measuring productivity second by second. Drivers are monitored by in-vehicle cameras recording continuously. Call center workers are evaluated by AI analyzing tone, language, and emotion. Office workers face productivity-monitoring software tracking keystrokes, website visits, and time away from the computer.

This represents a return to scientific management’s time-and-motion studies—but automated, continuous, and comprehensive. Workers become transparent to capital while management algorithms remain opaque to workers. The power asymmetry is absolute.

Social Credit and Reputation Systems

China’s social credit initiatives exemplify surveillance capitalism’s fusion with state power. Though in practice a patchwork of regional pilots and blacklists rather than a single unified score, they continuously monitor citizens through pervasive cameras, digital transactions, social media, and IoT devices. Behaviors are scored; low scores can restrict access to services, travel, employment, and education.

While presented as uniquely Chinese authoritarianism, Western countries develop parallel systems through ostensibly private means: credit scores, background checks, tenant screening, hiring algorithms, insurance risk assessment. These aren’t comprehensive single systems but networked private surveillance creating similar effects—behavioral control through ubiquitous monitoring and scoring.

Predictive Policing and Carceral Surveillance

Law enforcement increasingly uses predictive policing—algorithms analyzing data to predict where crimes will occur or who will commit them. This extends surveillance capitalism’s logic to criminal justice: extract data, make predictions, preemptively intervene.

The systems disproportionately target communities of color, encoding and amplifying existing biases. They create feedback loops: algorithms predict crime in over-policed Black neighborhoods, more policing generates more arrests, and more arrests train algorithms to predict more crime there. Surveillance becomes a self-fulfilling prophecy justifying continued surveillance.
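
The loop can be made concrete with a toy simulation (invented numbers; the quadratic weighting is merely a stand-in for any allocation policy that concentrates patrols on predicted hotspots):

    # Toy predictive-policing feedback loop. Both districts have the SAME
    # true offense rate; District A merely starts with more patrols.
    true_rate = 0.05                    # identical in both districts
    patrols = {"A": 55.0, "B": 45.0}    # small initial disparity

    for year in range(10):
        # Arrests reflect where police look, not differences in true crime.
        arrests = {d: true_rate * patrols[d] * 100 for d in patrols}
        # "Retrain" on arrest data, then chase predicted hotspots
        # (squaring over-weights the higher-predicted district).
        weights = {d: arrests[d] ** 2 for d in arrests}
        total = sum(weights.values())
        patrols = {d: 100 * weights[d] / total for d in weights}
        print(year, {d: round(p, 1) for d, p in patrols.items()})
    # Patrols converge on District A even though true rates never differed.

Within a few iterations nearly all patrols concentrate in District A, and the arrest data the system generates can never reveal that the underlying rates were equal.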

Educational Surveillance

Remote learning dramatically expanded student surveillance. Proctoring software monitors students during exams—tracking eye movements, recording audio/video, analyzing behavior, flagging “suspicious” activity. Learning management systems track every interaction: time spent, pages visited, assignment performance.

This generates behavioral data used to predict student outcomes, automate grading, and individualize instruction. It trains students to accept constant monitoring as normal, creating subjects comfortable with—or resigned to—surveillance. Educational surveillance is simultaneously immediate commercial application and long-term normalization project.

Health and Insurance Surveillance

Wearables, health apps, and telemedicine generate enormous health data. Insurance companies use this for risk assessment, dynamic pricing, and behavioral modification. “Voluntary” wellness programs require monitoring in exchange for premium discounts—coercive opt-ins marketed as empowerment.

Genetic testing companies (23andMe, Ancestry) extract genetic data ostensibly for ancestry information but monetize it through pharmaceutical partnerships and potential insurance applications. Your genetic information becomes an asset owned by corporations, used in ways you can’t control or even know about.

Smart Home and IoT Surveillance

Smart speakers (Alexa, Google Home), thermostats (Nest), doorbells (Ring), TVs, and appliances continuously generate data. These devices monitor presence, movements, conversations, routines, preferences, and visitors. Data flows to manufacturers, partners, and potentially law enforcement.

Amazon’s Ring creates privatized neighborhood surveillance networks. Users surveil each other, feeding data to Amazon and police through information-sharing partnerships. Community surveillance becomes a gamified consumer product.

Automotive Surveillance

Connected vehicles report to manufacturers: location, driving behaviors, braking patterns, speed, maintenance needs. This data is sold to insurance companies for risk assessment and dynamic pricing. It flows to law enforcement in investigations. Manufacturers use it to design future vehicles and services.

The shift to electric vehicles and autonomous driving will intensify automotive surveillance. Tesla already captures extensive data including camera feeds of vehicles’ surroundings. Autonomous vehicles will require comprehensive sensing of environments—surveillance extending from occupants to everyone near vehicles.

Financial Surveillance

Fintech companies (PayPal, Venmo, Cash App, crypto platforms) monitor every transaction. They analyze spending patterns, social connections (who you pay/receive from), and behavioral data to assess creditworthiness, detect fraud, and target advertising.

China’s Alipay and WeChat Pay achieve near-total financial surveillance—essentially all transactions flow through these platforms, making every purchase trackable. Western countries haven’t reached this level, but the trajectory is clear: toward comprehensive financial surveillance enabling both state and corporate control.

Resistance and Alternatives

Privacy Regulations and Rights

GDPR, CCPA, and similar regulations establish privacy rights, but face limitations:

  • Consent theater: Complex policies, deceptive design, false choices maintain surveillance under guise of consent
  • Individual focus: Privacy rights frame surveillance as individual problem solvable through personal choices, ignoring collective dimensions
  • Compliance gaming: Platforms minimize compliance while maintaining core surveillance business model
  • Enforcement gaps: Under-resourced regulators face massive, wealthy corporations with armies of lawyers

More fundamental privacy frameworks are needed: collective data rights, democratic governance of data, stringent limits on collection/use regardless of consent, reversal of burden (companies must prove necessity rather than individuals proving harm).

Breaking Up Big Tech

Antitrust enforcement targeting tech giants’ monopoly power would disrupt surveillance capitalism’s economies of scale. Smaller companies couldn’t amass comprehensive data profiles or achieve surveillance capitalism’s scope.

Yet this faces challenges:

  • Network effects: Breaking up platforms might simply create smaller monopolies or lead to eventual recombination
  • Surveillance model: Unless the business model itself changes, smaller companies would still pursue surveillance
  • Political resistance: Tech giants’ enormous wealth and political power make effective antitrust difficult

Breaking up surveillance capitalists is necessary but insufficient without also prohibiting surveillance business models.

Data Dignity and Data as Labor

Proposals to compensate individuals for data (“data dignity” or “data as labor” movements) recognize that behavioral surplus is labor—unpaid work generating value for capital. Just as industrial workers organized for fair wages, data workers should be compensated for value they create.

Jaron Lanier’s Who Owns the Future? (2013) proposed micropayments for data contributions. “Data dividend” proposals suggest platforms pay users for their data. Eric Posner and Glen Weyl’s Radical Markets (2018) proposed a “data as labor” framework.

Critics argue this legitimates surveillance rather than challenging it. Paying for data doesn’t address surveillance’s intrinsic harms: asymmetric power, behavioral control, erosion of autonomy. Commodifying personal data may worsen rather than solve problems. Moreover, individual data has minimal value—power comes from aggregation.

Public Digital Infrastructure

An alternative is public, democratically governed digital infrastructure that does not depend on surveillance: public search engines, social media, and cloud storage funded through taxes or subscriptions rather than data extraction.

This could provide essential digital services while respecting privacy. It would eliminate profit motive driving ever-expanding surveillance. Democratic governance could ensure platforms serve public rather than corporate interests.

Challenges include:

  • Political will: Neoliberal hostility to public ownership makes this difficult
  • Capture risks: State-owned platforms could enable government surveillance
  • Innovation concerns: Whether public institutions can develop competitive alternatives

Yet historical precedent exists: public libraries, postal services, broadcasting. Digital infrastructure could be reconceived as public utility rather than commercial extraction apparatus.

Platform Cooperativism and Data Cooperatives

Worker- and user-owned platform cooperatives offer an alternative model. Rather than venture-capital-funded surveillance platforms extracting profit, cooperatives would be democratically governed by members, prioritizing their interests over growth and surveillance.

Data cooperatives would collectivize data governance—members pool data but control how it’s used through democratic processes. This addresses individual powerlessness (your data alone has minimal value) through collective action while avoiding corporate appropriation.

Examples remain small-scale, facing challenges competing with surveillance capitalists’ resources and network effects. Success requires supportive policy: public investment, procurement preferences, regulatory advantages.

Privacy-Preserving Technologies

Technical solutions include:

  • Encryption: End-to-end encryption preventing platform access to content
  • Privacy-preserving computation: Federated learning and differential privacy (sketched at the end of this subsection) enabling analysis without access to raw data
  • Decentralization: Peer-to-peer protocols, federated systems avoiding centralized surveillance
  • Anonymization: Tor, VPNs, private browsers obscuring identity

These help but face limits. Technical solutions can’t solve political-economic problems. Surveillance capitalists adapt, finding new extraction methods. And most users lack the expertise or motivation to implement privacy technologies—structural problems demand structural change.
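
To make one of these concrete, the sketch below shows differential privacy in its textbook form, the Laplace mechanism (the dataset, query, and epsilon value are illustrative): a count query is answered with noise calibrated to the query’s sensitivity, so the output barely changes whether or not any one person is in the data.

    import math
    import random

    def laplace_noise(scale):
        """Sample Laplace(0, scale) noise via inverse-CDF."""
        u = random.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    def private_count(records, predicate, epsilon):
        """Answer a count query with epsilon-differential privacy.
        A count has sensitivity 1 (one person changes it by at most 1),
        so Laplace noise with scale 1/epsilon suffices."""
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace_noise(1.0 / epsilon)

    # Illustrative query: how many users opened the app after midnight?
    records = [{"hour": h} for h in (1, 2, 14, 23, 0, 3, 9)]
    print(round(private_count(records, lambda r: r["hour"] < 5, epsilon=0.5), 2))

Smaller epsilon means more noise and stronger deniability; the analyst still gets usable aggregates without ever accessing any individual’s raw record.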

Abolition of Surveillance Capitalism

The most radical proposal is to outlaw surveillance business models entirely. Just as certain practices are banned regardless of consent (selling organs, slavery), perhaps behavioral surveillance should be prohibited.

This would require:

  • Prohibiting collection of behavioral surplus beyond service provision
  • Banning sale of behavioral data and prediction products
  • Strict data minimization requirements
  • Heavy penalties for violations
  • Public alternatives to surveillance-funded services

Critics argue this is unrealistic given surveillance capitalism’s power. Proponents respond that recognizing something is difficult doesn’t make it less necessary. The alternative—consolidation of unprecedented power threatening democracy itself—is unacceptable.

Further Reading

Foundational Texts

  • Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.
  • Zuboff, Shoshana. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30.1 (2015): 75-89.

Platform Capitalism and Digital Economy

  • Srnicek, Nick. Platform Capitalism. Polity, 2016.
  • Cohen, Julie E. Between Truth and Power: The Legal Constructions of Informational Capitalism. Oxford University Press, 2019.
  • Sadowski, Jathan. Too Smart: How Digital Capitalism is Extracting Data, Controlling Our Lives, and Taking Over the World. MIT Press, 2020.

Surveillance Studies

  • Lyon, David. The Culture of Surveillance: Watching as a Way of Life. Polity, 2018.
  • Browne, Simone. Dark Matters: On the Surveillance of Blackness. Duke University Press, 2015.
  • Gilliom, John, and Torin Monahan. SuperVision: An Introduction to the Surveillance Society. University of Chicago Press, 2012.

Algorithmic Power

  • Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2015.
  • Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press, 2018.
  • Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, 2018.
  • Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Polity, 2019.

Privacy and Data Rights

  • Solove, Daniel J. Understanding Privacy. Harvard University Press, 2008.
  • Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press, 2009.
  • Véliz, Carissa. Privacy Is Power: Why and How You Should Take Back Control of Your Data. Bantam Press, 2020.

Alternatives and Resistance

  • Morozov, Evgeny. To Save Everything, Click Here: The Folly of Technological Solutionism. PublicAffairs, 2013.
  • Lanier, Jaron. Who Owns the Future? Simon & Schuster, 2013.
  • Scholz, Trebor, and Nathan Schneider, eds. Ours to Hack and to Own: The Rise of Platform Cooperativism. OR Books, 2017.
  • Posner, Eric A., and E. Glen Weyl. Radical Markets: Uprooting Capitalism and Democracy for a Just Society. Princeton University Press, 2018.

Critical Perspectives

  • Morozov, Evgeny. “Capitalism’s New Clothes.” The Baffler (2019).
  • Cohen, Nicole S. “The Valorization of Surveillance: Towards a Political Economy of Facebook.” Democratic Communiqué 22.1 (2008): 5-22.
  • Fuchs, Christian. “The Political Economy of Privacy on Facebook.” Television & New Media 13.2 (2012): 139-159.

See Also

  • Platform Capitalism
  • Data Extraction
  • Algorithmic Management
  • Behavioral Economics
  • Privacy Rights
  • Digital Labor
  • Attention Economy
  • Predictive Analytics
  • Biopolitics
  • Algorithmic Governmentality
  • Techno-Feudalism

How to Cite

MLA Format

Critical Theory Wiki Contributors. "Surveillance Capitalism." *Critical Theory Wiki*, 2025, https://criticaltheory.wiki//articles/surveillance-capitalism/.

APA Format

Critical Theory Wiki Contributors. (2025). Surveillance Capitalism. Critical Theory Wiki. https://criticaltheory.wiki/articles/surveillance-capitalism/

Chicago Format

Critical Theory Wiki Contributors. "Surveillance Capitalism." Critical Theory Wiki. 2025. https://criticaltheory.wiki//articles/surveillance-capitalism/.

Persistent URL: https://criticaltheory.wiki/articles/surveillance-capitalism/

This URL will remain stable and can be used for permanent citations.