The real reason why Facebook and Google won’t change

Mark Zuckerberg ushered in the new year pledging to address the many woes that now plague his company by “making sure people have control of their information” and “ensuring our services improve people’s well-being.”

As much as we may want to believe him, Zuckerberg’s sudden turn toward accountability is impossible to take seriously. The problems Zuckerberg cited, including “election interference” and “hate speech and misinformation,” are features of social networks, not bugs: predictable by-products of how these platforms are built to operate. How, then, do we explain Facebook’s years of ignoring these developments? Some headlines have blamed the internet. Others criticize Facebook’s management. A powerful November exposé in The New York Times describes Facebook’s executives as having “stumbled,” first ignoring warning signs of meddling in the run-up to the 2016 U.S. presidential election and then trying to conceal them. Other analysts conclude that the problem is Facebook’s size, arguing that the company should be broken up into smaller ones. Unfortunately, none of these explanations brings us any closer to grasping the real issue.

Facebook is an exemplary company, if you are a fan of “surveillance capitalism,” my term for businesses that create a new kind of marketplace out of our private human experiences. These companies hoover up all the behavioral data they can glean from our every move (literally, in the case of our phones’ locations) and use machine intelligence to transform it into predictions, learning to anticipate and even steer our future behavior. These predictions are traded in novel futures markets aimed at a new class of business customers.

Surveillance capitalism was invented by Google more than a decade ago, when it discovered that the “data exhaust” clogging its servers could be combined with analytics to produce predictions of user behavior. At the time, the key action of interest was whether a user would click on an ad. The young company’s ability to turn its data surplus into click-through predictions became the basis for an unusually lucrative sales process known as ad targeting. In 2008, when Facebook faced a financial crisis, Zuckerberg hired Google executive Sheryl Sandberg to port over this scheme. (Facebook and Google did not respond to a request for comment.)

Google’s and Facebook’s stunning success has inspired companies in insurance, retail, healthcare, finance, entertainment, education, transportation, and beyond to chase the surveillance business’s eye-popping profit margins. Surveillance capitalists depend on the continuous expansion of their raw material (behavioral data) to drive revenue growth. This extraction imperative explains why Google expanded from search to email to mapping to trying to build entire cities. It’s why Amazon invested millions to develop the Echo and Alexa. And it’s why there is a proliferation of products that begin with the word “smart,” virtually all of which are simply interfaces that enable a previously unavailable, unobstructed flow of behavioral data, harvested from your kitchen to your bedroom.

Each of the issues that Zuckerberg now says he wants to fix has been a longtime feature of the Facebook experience. There are no fewer than 300 significant quantitative research studies on the relationship between social media use and mental health (most of them produced since 2013). Researchers now agree that social media introduces an unparalleled intensity and pervasiveness of “social comparison” processes, especially for young users who are almost constantly online. The results: amplified feelings of insecurity, envy, depression, social isolation, and self-objectification. One major study, published in the American Journal of Epidemiology, concluded: “Facebook use does not promote well-being. . . . Individual users might do well to curtail their use of social media and focus instead on real-world relationships.”

Indeed, Facebook has avidly sought to master these social-comparison dynamics as a means of manipulating human behavior. A 2012 article published in the journal Nature, “A 61-Million-Person Experiment in Social Influence and Political Mobilization,” grew out of a collaboration between Facebook data scientist Adam Kramer and academic researchers. It detailed how the company planted voting-related cues in the News Feeds of 61 million Facebook users to leverage social-comparison processes and influence voting behavior in the run-up to the 2010 U.S. midterm elections. The team concluded that its efforts successfully triggered a “social contagion” that influenced real-world behavior, with an estimated 340,000 additional votes cast as a result.
