SHOSHANA ZUBOFF’S “The Age of Surveillance Capitalism” is already drawing comparisons to seminal socioeconomic investigations like Rachel Carson’s “Silent Spring” and Karl Marx’s “Capital.” Zuboff’s book deserves these comparisons and more: Like the former, it’s an alarming exposé about how business interests have poisoned our world, and like the latter, it provides a framework to understand and combat that poison. But “The Age of Surveillance Capitalism,” named for the now-popular term Zuboff herself coined five years ago, is also a masterwork of horror. It’s hard to recall a book that left me as haunted as Zuboff’s, with its descriptions of the gothic algorithmic daemons that follow us at nearly every instant of every hour of every day to suck us dry of metadata. Even those who’ve made an effort to track the technology that tracks us over the last decade or so will be chilled to their core by Zuboff, unable to look at their surroundings the same way.
An unavoidable takeaway of “The Age of Surveillance Capitalism” is, essentially, that everything is even worse than you thought. Even if you’ve followed the news items and historical trends that gird Zuboff’s analysis, her telling takes what look like privacy overreaches and data blunders, and recasts them as the intentional movements of a global system designed to violate you as a revenue stream. “The result is that both the world and our lives are pervasively rendered as information,” Zuboff writes. “Whether you are complaining about your acne or engaging in political debate on Facebook, searching for a recipe or sensitive health information on Google, ordering laundry soap or taking photos of your nine-year-old, smiling or thinking angry thoughts, watching TV or doing wheelies in the parking lot, all of it is raw material for this burgeoning text.”
Tech’s privacy scandals, which seem to appear with increasing frequency both in private industry and in government, aren’t isolated incidents, but rather brief glimpses at an economic and social logic that’s overtaken the planet while we were enjoying Gmail and Instagram. The cliched refrain that if you’re “not paying for a product, you are the product”? Too weak, says Zuboff. You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered. “Digital connection is now a means to others’ commercial ends,” writes Zuboff. “At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human’s experience.”
Zuboff recently took a moment to walk me through the implications of her urgent and crucial book. This interview was condensed and edited for clarity.
I was hoping you could say something about the semantic games Facebook and other data brokers are playing when they say they don’t sell data.
I remember sitting at my desk in my study early in 2012, and I was listening to a speech that [Google’s then-Executive Chair] Eric Schmidt gave somewhere. He was bragging about how privacy-conscious Google is, and he said, “We don’t sell your data.” I got on the phone and started calling these various data scientists that I know and saying, “How can Eric Schmidt say we don’t sell your data, in public, knowing that it’s recorded? How does he get away with that?” It’s exactly the question I was trying to answer at the beginning of all this.
Let’s say you’re browsing, or you’re on Facebook putting stuff in a post. They’re not taking your words and going into some marketplace and selling your words. Those words, or if they’ve got you walking across the park or whatever, that’s the raw material. They’re just secretly scraping your private experience as raw material, and they’re stockpiling that raw material, constantly flowing through the pipes. They sell prediction products into a new marketplace. What are those guys really buying? They’re buying predictions of what you’re gonna do. There are a lot of businesses that want to know what you’re going to do, and they’re willing to pay for those predictions. That’s how they get away with saying, “We’re not selling your personal information.” That’s how they get away also with saying, as in the case of [recently implemented European privacy law] GDPR, “Yeah, you can have access to your data.” Because the data they’re going to give you access to is the data you already gave them. They’re not giving you access to everything that happens when the raw material goes into the sausage machine, to the prediction products.
Do you see that as substantively different than selling the raw material?
Why would they sell the raw material? Without the raw material, they’ve got nothing. They don’t want to sell raw material, they want to collect all of the raw material on earth and have it as proprietary. They sell the value added on the raw material.
It seems like what they’re actually selling is way more problematic and way more valuable.
That’s the whole point. Now we have markets of business customers that are selling and buying predictions of human futures. I believe in the values of human freedom and human autonomy as the necessary elements of a democratic society. As competition over these prediction products heats up, it’s clear that surveillance capitalists have discovered that the most predictive sources of data come from intervening in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers. That’s where they’re making their money. These are bald-faced interventions in the exercise of human autonomy, what I call the “right to the future tense.” The very idea that I can decide what I want my future to be and design the actions that get me from here to there, that’s the very material essence of the idea of free will.
I write about the Senate committee back in the ’70s that reviewed behavioral modification…