As Facebook assumed its central role in the information landscape, the company quietly made decisions that boosted profits with little regard to the consequences for privacy, politics, and the news industry. When those controversial calls and other travails associated with abuses of the platform gradually came to light, Facebook, as early investor Roger McNamee put it early this year, often followed a PR playbook of “deny, delay, deflect, dissemble.” A decadelong chronicle of lawsuits and leaks offers a window into the social-media giant and how its leader’s ethos to “move fast and break things” strained the public’s trust.
Facebook shifts user timelines away from simple chronology, introducing an algorithmic News Feed that gives the company influence over what goes viral.
Facebook begins a series of data-sharing partnerships with major companies, including Microsoft and Amazon. They are only revealed eight years later.
The Wall Street Journal reports Facebook is allowing apps like FarmVille to collect data regardless of users’ privacy settings.
The Federal Trade Commission announces a settlement with Facebook subjecting it to two decades of privacy audits, saying there was reason to believe the company violated federal law and “deceived consumers by telling them they could keep their information on Facebook private.”
After a four-year lawsuit, Facebook reaches a $9.5 million class-action settlement for allegedly breaching federal wiretap and video-rental privacy laws by publicly broadcasting users’ purchases and viewing habits onto other users’ News Feeds.
Facebook has had several opportunities to show that it understands its responsibility as the world’s largest social network, a platform that now has 2.23 billion active users worldwide, sees 4.75 billion pieces of content shared daily, and is responsible for one out of every five page views in the United States.
According to allegations in a 2018 court filing, a Facebook engineer realizes, but the company does not disclose, that its paid video advertising metrics vastly overstate views. Advertisers pour resources into videos based on these inflated numbers, and media companies follow suit.
Facebook realizes Cambridge Analytica has data it shouldn’t and asks the firm to delete any trace of it. It takes more than a year to confirm that it does. The public only learns about the breach 27 months later.
Facebook begins paying publishers such as BuzzFeed, the New York Times, and CNN multimillion-dollar sums to produce live videos.
Zuckerberg says it is a “pretty crazy idea” that fake news on Facebook could have influenced the election outcome in 2016.
The Intercept reports Cambridge Analytica, a Trump campaign affiliate, harvested data from more than 30 million Facebook users. Facebook says it “has not [yet] uncovered anything that suggests wrongdoing.”
Facebook publishes a white paper about coordinated misinformation campaigns on its platform but scrubs any direct references to Russia that its security team included, according to the New York Times.
Facebook’s chief security officer, Alex Stamos, discloses for the first time that Russians purchased ads in an effort to sow discord around the 2016 presidential election; a few weeks later, Facebook reveals that as many as 150 million people may have seen posts by the Kremlin-linked Internet Research Agency.
Zuckerberg says he regrets his “pretty crazy” comment, noting that discussion of the platform’s role in 2016 is “too important an issue to be dismissive” about.
Facebook decides to end payments for live videos, hamstringing media companies that invested in their production.
The Guardian and the New York Times publish reports based on documents from former Cambridge Analytica employee Christopher Wylie unveiling how the company harvested data on tens of millions of Facebook users and used it to target political messages, with Facebook aware of the practice but failing to alert users.
After Zuckerberg boasts that Facebook “systems” stemmed messages fomenting sectarian violence in Burma, civil society groups in the country fire back, saying they believe the company only acted after hearing their urgent complaints.
After declining previous requests to speak in front of lawmakers, Zuckerberg finally testifies before Congress. Senators ask basic questions about the company’s services and business model that reveal their unfamiliarity with his platform.
WhatsApp co-founder Jan Koum leaves his company (which Facebook acquired in 2014), reportedly after disagreements over protecting user data.
Testifying before Congress, Wylie claims that before Cambridge Analytica co-founder Steve Bannon became the Trump campaign’s CEO, he oversaw company efforts to suppress black voters.
After scrutiny and the threat of regulation, Facebook adds a “Paid for by” disclosure to political ads, a category in which it includes news publishers’ ads promoting content. It only exempts publishers from the policy six months later.
Slate goes public with the fact that Facebook’s algorithm changes have led to an 87 percent decline in the traffic Facebook refers to its site.
India’s Ministry of Electronics and Information Technology says Facebook “cannot evade accountability and responsibility” after five men are lynched as a result of an epidemic of disinformation on WhatsApp.
Facebook documents leaked to BuzzFeed show employees saw Trump’s victory as proof of its advertising tools’ effectiveness.
The Department of Housing and Urban Development announces a complaint alleging that Facebook allows landlords to keep users of certain races from seeing targeted housing ads.
A United Nations report criticizes Facebook’s response to hate speech fueling the ongoing ethnic cleansing of Rohingya Muslims in Burma as “slow and ineffective,” assailing the platform as a “useful instrument for those seeking to spread hate.”
Facebook’s vice president of global public policy, Joel Kaplan, sits behind Brett Kavanaugh during his congressional testimony on the sexual assault allegations made by Christine Blasey Ford, sparking outrage among employees.
Facebook confirms suspicions aired by privacy researchers that it has been using phone numbers users provided only for security purposes to target them with ads.
What Facebook knows about you
On the eve of the 2018 midterm elections, Facebook announces the removal of 105 Facebook and Instagram accounts possibly linked to Russia’s Internet Research Agency.
The New York Times reports how Facebook responded to scandals by publicly vowing to change its practices, while privately focusing on minimizing the public relations impact. These steps included paying a conservative opposition-research firm, Definers Public Affairs, to go after billionaire George Soros, who is also a top target of anti-Semitic extremists.
Following what you do on Facebook: The company has near-total awareness of every move you make on its website or in its apps.

Following what you say on Facebook Messenger: Facebook does scan your chat messages, but it isn’t exactly reading them; it runs an automated scan for child pornography and other banned content.