Illustration by Matt Kenyon

It’s not that we’ve failed to rein in Facebook and Google. We’ve not even tried


The tech giants use our data not only to predict our behaviour but to change it. But we can resist this attack on democracy

In a BBC interview last week, Facebook’s vice-president, Nick Clegg, surprised viewers by calling for new “rules of the road” on privacy, data collection and other company practices that have attracted heavy criticism during the past year. “It’s not for private companies … to come up with those rules,” he insisted. “It is for democratic politicians in the democratic world to do so.”

Facebook’s response, he promised, would be to adopt a “mature role”, not “shunning” the new rules but “advocating” them. For a company that has fiercely resisted new laws, Clegg’s message aimed to persuade us that the page had turned. Yet his remarks sounded like Newspeak, designed to obscure some ugly facts.

A few weeks earlier Facebook’s chiefs, Mark Zuckerberg and Sheryl Sandberg, snubbed a subpoena from the Canadian parliament to appear for questioning. Clegg then showcased Silicon Valley’s standard defence against the rule of law – warning that any restrictions resulting from “tech-lash” risked making it “almost impossible for tech to innovate properly”, and summoning the spectre of Chinese ascendance. “I can predict that … we will have tech domination from a country with wholly different sets of values.”

Both Facebook and Google have long relied on this misguided formula to shield them from law. In 2011, the former Google CEO Eric Schmidt warned that government overreach would foolishly constrain innovation: “We’ll move much faster than any government.” Then, in 2013, Google co-founder Larry Page complained that “old institutions like the law” impede the company’s freedom to “build really great things”. This rhetoric is a hand-me-down from another era, when the “Gilded Age” barons of the late-19th-century United States insisted that there was no need for law when one had the “law of evolution”, the “laws of capital” and the “laws of industrial society”. As the historian David Nasaw put it, the millionaires preached that “democracy had its limits, beyond which voters and their elected representatives dared not trespass lest economic calamity befall the nation”.

The tech companies’ innovation rhetoric effectively blinded users and lawmakers for many years. Facebook and Google were regarded as innovative companies that sometimes made dreadful mistakes at the expense of our privacy. Since then the picture has sharpened. It’s easier to see that what we thought of as mistakes actually were the innovations – Google Glass, Facebook giving private information to developers, and more. Each of these was an expression of a larger breakthrough: the invention of what I call surveillance capitalism.

Surveillance capitalism is not the same as digital technology. It is an economic logic that has hijacked the digital for its own purposes. The logic of surveillance capitalism begins with unilaterally claiming private human experience as free raw material for production and sales. It wants your walk in the park, online browsing and communications, hunt for a parking space, voice at the breakfast table …

These experiences are translated into behavioural data. Some of this data may be applied to product or service improvements, and the rest is valued for its predictive power. These flows of predictive data are fed into computational products that predict human behaviour. A leaked Facebook document in 2018 describes its machine-learning system that “ingests trillions of data points every day” and produces “more than 6m predictions per second”. Finally, these prediction products are sold to business customers in markets that trade in human futures.

This economic logic was first invented at Google in the context of online targeted ads where the “clickthrough rate” was the first globally successful prediction product, and targeted ad markets were the first markets to specialise in human futures. During the first years of discovery and invention from 2000 to 2004, Google’s revenues increased by 3,590%. Right from the start it was understood that the only way to protect these revenues was to hide the operations that produce them, keeping “users” in the dark with practices designed to be undetectable and indecipherable.

Surveillance capitalism migrated to Facebook, Microsoft and Amazon – and became the default option in most of the tech sector. It now advances across the economy, from insurance and retail to finance, health, education and more, including every “smart” product and “personalised” service.

Markets in human futures compete on the quality of predictions. This competition to sell certainty produces the economic imperatives that drive business practices. Ultimately, it has become clear that the most predictive data comes from intervening in our lives to tune and herd our behaviour towards the most profitable outcomes. Data scientists describe this as a shift from monitoring to actuation. The idea is not only to know our behaviour but also to shape it in ways that can turn predictions into guarantees. It is no longer enough to automate information flows about us; the goal now is to automate us. As one data scientist explained to me: “We can engineer the context around a particular behaviour and force change that way … We are learning how to write the music, and then we let the music make them dance.”

These economic imperatives erode democracy from below and from above. At the grassroots, systems are designed to evade individual awareness, undermining human agency, eliminating decision rights, diminishing autonomy and depriving us of the right to combat. The big picture reveals extreme concentrations of knowledge and power. Surveillance capitalists know everything about us, but we know little about them. Their knowledge is used for others’ interests, not our own.

Surveillance capitalism thrives in the absence of law. In a way, this is good news. We have not failed to rein in this rogue capitalism; we’ve not yet tried. More good news: our societies successfully confronted destructive forms of capitalism in the past, asserting new laws that tethered capitalism to the real needs of people. Democracy ended the Gilded Age. We have every reason to believe that we can be successful again.

The next great regulatory vision is likely to be framed by warriors for a democracy under threat: lawmakers, citizens and specialists, allied in the knowledge that only democracy can impose the people’s interests through law and regulation. The question is, what kind of regulation? Are existing approaches to privacy and antitrust law the answer? Both are critical but neither is adequate.

One example is privacy law’s call for “data ownership”. It’s a misleading notion because it legitimates the unilateral taking of human experience – your face, your phone, your refrigerator, your emotions – for translation into data in the first place. Even if we achieve “ownership” of the data we have provided to a company like Facebook, we will not achieve “ownership” of the predictions gleaned from it, or the fate of those products in its prediction markets. Data ownership is an individual solution when collective solutions are required. We will never own those 6m predictions produced each second. Surveillance capitalists know this. Clegg knows this. That is why they can tolerate discussions of “data ownership” and publicly invite privacy regulation.

What should lawmakers do? First, interrupt and outlaw surveillance capitalism’s data supplies and revenue flows. This means, at the front end, outlawing the secret theft of private experience. At the back end, we can disrupt revenues by outlawing markets that trade in human futures knowing that their imperatives are fundamentally anti-democratic. We already outlaw markets that traffic in slavery or human organs.

Second, research over the past decade suggests that when “users” are informed of surveillance capitalism’s backstage operations, they want protection, and they want alternatives. We need laws and regulation designed to advantage companies that want to break with surveillance capitalism. Competitors that align themselves with the actual needs of people and the norms of a market democracy are likely to attract just about every person on Earth as their customer.

Third, lawmakers will need to support new forms of collective action, just as nearly a century ago workers won legal protection for their rights to organise, to bargain collectively and to strike. Lawmakers need citizen support, and citizens need the leadership of their elected officials.

Surveillance capitalists are rich and powerful, but they are not invulnerable. They fear law. They fear lawmakers. They fear citizens who insist on a different path. Both groups are bound together in the work of rescuing the digital future for democracy. Mr Clegg, be careful what you wish for.
