Disgusted by the actions of the company’s brass and far-right investors such as Bannon, Wylie left the company in 2014, because, he writes, “otherwise I risked catching the same disease of mind and spirit.” After leaving Cambridge Analytica, and before the 2016 election, Wylie says he tried to warn Facebook and the White House about the manipulation of American voters. But at that point, no one imagined a Trump victory. “They didn’t care,” he says. But in March 2017, two months after Trump’s inauguration, Wylie was contacted by Guardian journalist Carole Cadwalladr — and later the New York Times — and Cambridge Analytica’s work was exposed in stories published a year later.
Facebook has since been hit with a $5 billion penalty by the FTC. Cambridge Analytica dissolved. As for what happened to the scraped data? No one is quite sure.

I spoke to Wylie about how the propaganda spread by Cambridge Analytica and the Trump campaign influenced American voters and why he’s worried about the 2020 election, among other subjects.
Our conversation has been condensed and edited for clarity.

Hope Reese

The political strategy of inciting fear is not new; Nixon used it, for instance. How did Cambridge Analytica take it to the next level? And could you see the propaganda having an impact?

Christopher Wylie
This is really important for people to understand: Data sets are connected to each other. When you subscribed to a magazine two years ago, it feels disconnected from you liking something on Facebook today. But if I acquire both of those data sets, I can put them together, including when you registered to vote, who you voted for in a primary, whether you’ve responded to a poll before. The people who would be targeted are called the “targeting universe.” Imagine it as a list of specific people.

So an algorithm goes through a bunch of data and makes a list of individuals. And those people would be put into a campaign, like, “the immigrants are coming,” or “Obama’s going to take your guns,” or whatever. And the people from that list who keep engaging over and over again would receive an invite. So if you know that 30 percent of this particular invite group went to that event, you’d know there’s almost a one in three chance that any given person on the list actually went. Everything is tracked — when you click on stuff, when you share stuff.

Imagine you are a target. You’re sitting in your living room, and you see ads for a group, and you click on it, and you join that group, and you start having conversations. A couple days later, you get a share from somebody in the group about some kind of weird thing that Obama is doing. And you’re slightly outraged by it. And then you keep clicking on stuff, and a week later you get a phone call, which is a poll to ask your opinion about something.
When you’re talking on the phone to some random polling company, you’re not thinking that that’s connected to, like, the things you saw last week, the chats you had last week. And if you respond in a particular way, you get put into a new target group where they try to push more content. If you engage at a certain rate, somebody might send you a message or an email saying, “Hey, do you want to come to this event?” You don’t suspect that you’re in a target universe, because you don’t even know what that is — you’ve never even heard of it.
What you were doing in your living room two weeks ago, that phone call or email, or a knock on the door from a canvasser — you don’t see how they’re connected, but they are.
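The data-linking process Wylie describes — joining unrelated-looking data sets on a shared identifier, selecting a “targeting universe,” and inferring event attendance from group-level rates — can be sketched in a few lines of Python. Everything here is illustrative: the field names, email keys, and datasets are hypothetical, not anything from Cambridge Analytica.

```python
# Hypothetical sketch of linking disconnected data sets into one profile
# per person. All names and fields below are invented for illustration.

# Two data sets that feel unrelated to the individual, keyed by the same person.
magazine_subs = {
    "alice@example.com": {"magazine": "Outdoor Life", "sub_year": 2014},
    "bob@example.com": {"magazine": "Home & Garden", "sub_year": 2015},
}
voter_file = {
    "alice@example.com": {"registered": True, "primary_vote": "R"},
    "carol@example.com": {"registered": True, "primary_vote": "D"},
}

def link_records(*datasets):
    """Merge data sets on a shared key into one combined profile per person."""
    profiles = {}
    for ds in datasets:
        for key, fields in ds.items():
            profiles.setdefault(key, {}).update(fields)
    return profiles

profiles = link_records(magazine_subs, voter_file)

# The "targeting universe": the concrete list of people an algorithm selects,
# here everyone who is registered to vote AND appears in the magazine data.
universe = [person for person, p in profiles.items()
            if p.get("registered") and "magazine" in p]

# Group-level inference: if 30 percent of an invited group attended an event,
# each individual invitee has roughly a 0.3 probability of having been there.
invited, attended = 200, 60
attendance_prob = attended / invited  # 0.3
```

The point of the sketch is that neither data set is sensitive on its own; the linkage step is what produces a profile detailed enough to target.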
Hope Reese

You were becoming uncomfortable with what was happening at Cambridge Analytica, but it felt abstract for a while — until a video you saw made it feel more real to you.

Christopher Wylie
Some people in the target universe would get invited to focus groups or events, and those would often be filmed or streamed. It becomes a lot more real when you go from looking at a record ID number...to actually seeing a video of somebody filled with rage about something that’s completely made up. They don’t understand that what they’re angry about was specifically crafted and curated to make them feel that way, about something that may or may not be real.
I looked at that and thought: This is not just a game of math. It’s not increased numbers here and decreased numbers there, with database ID numbers. All of a sudden, there’s an actual person who looks like they’re about to break a chair because they’re so angry about something that you know, but they don’t know, was made up. They’re there because they’ve been clicking on stuff and they’ve been manipulated to feel this way.
Hope Reese

Obama’s political campaign also created and spread targeted ads on Facebook. How was what happened in the Trump campaign different?
Christopher Wylie

The Obama campaign didn’t rely on scaled disinformation. Cambridge Analytica was trying to identify people who were prone to conspiratorial thinking or paranoid ideation and exacerbate those latent characteristics in those people. The Obama campaign focused on identifying people who typically didn’t vote or were infrequent voters. So people of color or single women with children — there are structural obstacles to voting, so motivating them to vote was a big focus.
I don’t think there’s an innate problem with targeting in campaigns. If you care about the environment, I should be talking to you about the environment. Where the line gets crossed is where you start to effectively stalk a person, going beyond just an issue and looking at: How does the person make decisions? What are the person’s emotional vulnerabilities, and how can I exploit them? And in terms of transparency, when the Obama campaign did advertising, you were aware that you were seeing an ad.
Hope Reese

At Cambridge Analytica, Russian businessmen visited the office frequently — but at the time, Russia was not on anyone’s radar. How much of Russian involvement in the US election happened via Cambridge Analytica?
Christopher Wylie

At the time, it was weird — but there were a lot of weird things all happening at the same time. Steve Bannon was weird. Everything the company did was weird. But when the Russian involvement started to come into the public consciousness, I thought: Wait a second. This company was advising Donald Trump, and Russian businessmen were coming in left, right, and center. I’m not saying there was a conspiracy. But there was so much frequent contact, where we explained over and over and over again, to people connected to Russian intelligence services, “Hey, we have all this data. We have this AI. This is how we’ve done it.” Literally, our presentation in St. Petersburg was about the efficacy of voter targeting in the US using social media data. There were a lot of opportunities for exploitation.
Hope Reese

How can we begin to regulate the use of private data from Facebook or other social media platforms?
Christopher Wylie

I am not a policy expert — I’m a dude who works with tech. But I have noticed a couple of things that I find concerning and irritating about how policymakers talk about tech. There’s this notion that “the law can’t keep up with technology,” that technology moves so fast that we can never create rules that keep up with it. I’ve heard that so many times from members of Congress.

But I point out that we have all kinds of safety regulations for aerospace, nutrition, power plants, cancer medicine and pharmaceuticals — for the types of fertilizers and pesticides that are allowed or not. These are all products of technology. The difference is that we have technically competent regulators who are empowered by the law to make decisions on the public’s behalf, without a debate in Congress.

So Silicon Valley is like, “Well, you don’t understand the algorithms, so how are you going to debate this in Congress?” But Congress also doesn’t understand how a nuclear power plant works. So the debate in Congress is: Should we empower people who know how this works to make rules about the safety of these nuclear power plants? Yes. Cool. Let’s create the Department of Energy. And they make regulations. So the first thing is that we need to get over this idea that just because it’s software, somehow the law can’t keep up.
Hope Reese

How much of a problem will this be in America’s 2020 election?
Christopher Wylie

Consider that a relatively small company in London, within a couple of years, could build up a sophisticated capacity to target and deliberately manipulate a subset of American voters, enough to push certain candidates over the line. Even though Cambridge Analytica has dissolved, the same people are working on the Trump campaign. And there is no way to confirm that the data sets they amassed are actually gone.
If a company like Cambridge Analytica can do it, what happens when China becomes the next Cambridge Analytica? What happens when North Korea or Iran becomes the next Cambridge Analytica? Cambridge Analytica was first. But these are countries that have more than enough capacity to replicate the work that Cambridge Analytica was doing. And probably go further. This is why I was so upset with Facebook: It’s about the fact that we have unsafe platforms that are causing a huge risk to the integrity of democracy in the United States and around the Western world. Look at Mark Zuckerberg’s speech, where he said, essentially, “Well, disinformation — you’re just going to have to deal with that.” Why is it that he, unilaterally, gets to decide how much or how little disinformation is part of our electoral process?
The most egregious thing is what Cambridge Analytica has exposed: that we have relegated the security and integrity of our democracy to a private company that doesn’t really want to do anything about it.

Hope Reese is a writer based in Louisville, Kentucky, currently living in Istanbul. Her work has appeared in the Atlantic, the Boston Globe, and Vice.