Prediction Products

Shoshana Zuboff's term for the manufactured outputs of surveillance capitalism: behavioural surplus processed through machine intelligence to produce actuarial predictions of what individuals will do now, soon, and later. These predictions are the actual commodity sold in behavioural futures markets — not the data itself, but the distilled forecast of future behaviour. Advertisers do not buy your browsing history; they buy the calculated probability that you will purchase, click, vote, or feel a particular way at a particular moment. The more accurately platforms can predict behaviour, the more valuable their inventory becomes — which is why nudging behaviour toward predictability is economically rational for platforms, independent of any intent to harm.

Prediction products are the commercial output of surveillance capitalism as Zuboff defines it: not raw data, but the behavioural forecasts derived from processing vast quantities of behavioural surplus through machine learning systems. What is sold in the behavioural futures markets that constitute the core of the surveillance capitalist business model is not information about who you are but actuarial predictions about what you will do — "now, soon, and later," in Zuboff's formulation.

The distinction matters because it clarifies the actual value chain. Advertisers do not purchase browsing histories; they purchase probabilities. They are buying access to a person calculated to have a 73% likelihood of purchasing running shoes in the next two weeks, or a 61% probability of being persuadable on a particular political question. The prediction is the product; the data is only the input to its production.
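The value chain described above can be made concrete with a toy sketch. Everything here — the class, its field names, and the example figures — is illustrative, not any real platform's data model; it only shows that the sellable unit is a probability attached to an action and a time window, with the underlying behavioural data absent from the product entirely.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PredictionProduct:
    """Toy model of the commodity traded in a behavioural futures market:
    not the raw behavioural data, but a forecast derived from it."""
    subject_segment: str   # who the forecast is about (an audience, not a dossier)
    predicted_action: str  # what they are forecast to do
    probability: float     # the actuarial estimate itself: this is the product
    window_days: int       # "now, soon, and later": the forecast horizon

# What the advertiser buys is the forecast, not the browsing history behind it.
shoe_intent = PredictionProduct(
    subject_segment="runners_25_34",
    predicted_action="purchase_running_shoes",
    probability=0.73,
    window_days=14,
)
print(shoe_intent.probability)  # 0.73
```

Note what the record does not contain: no browsing history, no raw data points. The data stays on the platform side as an input; only the distilled probability crosses the market boundary.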

This framing exposes a logic that is often obscured by discussions of privacy. The economic incentive for platforms is not simply to collect data but to produce increasingly accurate predictions — and accuracy improves with both the volume of data and the predictability of behaviour. A person whose behaviour is erratic and autonomous is a less valuable prediction subject than one whose behaviour is stable and responsive to stimuli. This creates a structural incentive, independent of any deliberate intent, for platforms to nudge users toward behavioural consistency and responsiveness — toward being better prediction subjects.
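The link between behavioural consistency and prediction value can be shown with a minimal sketch. The assumption here, stated plainly, is that forecast accuracy proxies inventory value; the function and example histories are hypothetical, and the "predictor" is deliberately the simplest possible one — always guess the subject's most frequent behaviour.

```python
from collections import Counter

def best_guess_accuracy(history):
    """Expected accuracy of the optimal constant predictor: always guess
    the most frequent behaviour. More consistent history -> higher accuracy."""
    counts = Counter(history)
    return max(counts.values()) / len(history)

# An erratic subject is hard to forecast; a consistent one is easy.
erratic    = ["click", "ignore", "buy", "ignore", "click", "buy"]
consistent = ["click", "click", "click", "click", "ignore", "click"]

print(best_guess_accuracy(erratic))     # ~0.33
print(best_guess_accuracy(consistent))  # ~0.83
```

Even with this crude predictor, the consistent subject is forecastable at more than twice the accuracy of the erratic one — which is the structural incentive in miniature: anything that shifts users from the first pattern toward the second raises the value of the prediction inventory.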

The machinery of prediction products includes not just advertising targeting but credit scoring, insurance pricing, hiring algorithms, and political micro-targeting. The same predictive infrastructure that tells an advertiser when to show you a shoe ad also tells a lender how likely you are to default, an insurer how likely you are to make a claim, and a political campaign which messages will move you. These applications share a common input — behavioural surplus — and a common output: a prediction sold to a third party whose interests in your behaviour are not identical to your own.

What this means practically is that the platform relationship is not bilateral — user and service — but trilateral: user, platform, and a largely invisible market of prediction buyers. The user's experience is optimised not for the user's benefit but for the production of better predictions. Features that make you more legible, more consistent, and more responsive to stimuli serve the prediction product even when they do not serve you.

Key Figures

Shoshana Zuboff: originator of the surveillance capitalism framework and the prediction products concept

Michal Kosinski: psychologist whose Cambridge research demonstrated the predictive power of Facebook likes

Cathy O'Neil: author of Weapons of Math Destruction, on the social effects of predictive algorithms
