How I learned to stop worrying and love surveillance capitalism
Mar 25, 2019
There are two dominant narratives about China: “China is awesome and we need to learn as much as possible from the country” or “China is super-scary and we have to make sure it doesn’t take over the world.”
Each of these narratives has a simplicity and emotional appeal that make it extremely seductive, but ultimately both are inaccurate. That’s why “China hands”—those of us tied to China, its language, people, and culture—feel compelled to explain how the idea of China is so different from the reality on the ground.
Given the volume of exaggeration and hyperbole surrounding the conversation about China, those efforts are laudable, but they usually do not go far enough. We must wake up to the fact that almost all the current issues around technology are global—not regional or national—in nature. Not only is there a need to dial back the xenophobic hawkishness or, alternately, the manic enthusiasm for China’s rise, but we also have to look with clear eyes and an open mind at what is actually happening. And that begins at home.
In her book The Age of Surveillance Capitalism, Shoshana Zuboff draws back the veil on what American companies have done to create a pervasive, and unethical, understanding of human behavior.
Bound by practically zero legal restrictions, operating in near-total opacity, and benefiting from consumer ignorance, companies like Google and Facebook have built business empires on the collection and rendering of what Zuboff calls “behavioral surplus.”
By using behavioral data unrelated to product or service improvement, these companies have created sophisticated behavioral futures markets where our predicted behavior is sold to buyers seeking improved margins.
Invoking utopian ideals such as increased leisure time, decreased social friction, and effortless decision-making, Google, Facebook, and increasingly Amazon have insidiously inserted themselves into our phones, our homes, and our cars—all to secure and protect precious supply lines.
Learning from robust psychological research (primarily behavioral and personality), these companies have discovered how to manipulate consumer behavior to deliver increasingly guaranteed outcomes.
Chinese companies, however, are much less sophisticated in this regard. Having emerged from an extremely different political, economic, and social history, Chinese companies had neither the technical know-how nor the broader environment to build businesses solely with behavioral surplus.
The country’s first-generation tech giants were all born in a very low-tech world, where the most direct way to monetize was through manual—as opposed to automatic—transactions. Alibaba had e-commerce, Tencent had virtual items (including profile decorations, in-game items, and stickers), and Baidu employed a large sales force.
This same logic still applies today, where the most effective way to monetize content, for example, is to sell it directly to users or incorporate it into a larger e-commerce offering.
I haven’t seen enough evidence yet—although I will continue to look and welcome any tips—that suggests companies like Xiaomi or Huawei, with their smart home product lines, even come close to the breadth and depth of Google and Amazon’s ambition to collect and monetize behavioral surplus.
Perhaps the only company approximating the Valley’s engineering-centric approach to consumer products is also the only one making headway outside of China. Bytedance’s elite engineers have created a content platform that requires only behavioral data and no initial social graph, and its internal culture approaches a level of experimentation and data-driven “play” similar to Facebook’s.
Given the amount of justified concern over Facebook’s general lack of regard for user sovereignty, we should also be very concerned about Bytedance’s ability to understand and leverage user preference and behavior.
Unlike in the US, where surveillance technology has been driven by market forces and the government has furtively worked through tech companies to ensure “national security,” the entire surveillance technology ecosystem in China has been explicitly supported by the state.
As with many points of difference and friction, markets and companies in China are subordinate to the state and the Party. Most of the AI applications being developed are exactly in line with government needs; that includes surveillance and security, but also medicine, autonomous driving, and education.
And, as always, there are some very big questions about whether the country can ever fulfill its ambition of a complete social management system.
I am friends with many “China hands,” and definitely count myself among that group. We are inextricably tied to this country, its people, its language, and its culture for myriad reasons. We recognize the significance of China’s rise and choose not to be fearful. However, many do not look far enough beyond the surface reality. I will not be the one to judge them, or even China’s social management project. There is a lot that we can learn from each other, but we need to do so without glasses of any tint.
In China, AI will be used in ways antithetical to Western values—if not now, then definitely in the future. Whether that’s ultimately a good or bad thing, I’m honestly not sure.
What I can say is that the other end of the spectrum—a hawkishness verging on xenophobia, fixated on what China is doing (or could do) with AI—is just a distraction from asking the important questions about how we want AI to be used in our own home countries.
The logic behind surveillance capitalism’s encroachment into our daily lives may differ in each country, but the outcome will be the same—an elite class separate from general society who shape and manage our behavior for goals only they can know.