Facebook founder speaks out amid fears over election advertising and rise of far right
The Facebook boss, Mark Zuckerberg, has set out how he believes the social network and the internet should be regulated.
The firm’s founder and chief executive said there was a need for governments and regulators to have “a more active role”.
Zuckerberg said he believed new regulation was needed in four areas – harmful content, election integrity, privacy and data portability.
In an op-ed published in the Washington Post and on his own Facebook page, Zuckerberg said: “Every day we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks.
“These are important for keeping our community safe. But if we were starting from scratch, we wouldn’t ask companies to make these judgments alone.
“I believe we need a more active role for governments and regulators. By updating the rules for the internet, we can preserve what’s best about it – the freedom for people to express themselves and for entrepreneurs to build new things – while also protecting society from broader harms.”
Zuckerberg said legislation was important for “protecting elections” and it should be updated, adding that Facebook had already made “significant changes around political ads”.
With European Union elections due to take place in May, Facebook has already said any advertisers in the EU will undergo tighter checks, which will require documents confirming their identity and location to be submitted, amid fears of foreign interference.
The company said all advertising relating to politics on both Facebook and Instagram in the EU must be clearly labelled, including who funded it.
Any advertising not properly registered will be blocked from mid-April, the social network warned.
“However, deciding whether an ad is political isn’t always straightforward. Our systems would be more effective if regulation created common standards for verifying political actors,” Zuckerberg wrote.
“Online political advertising laws primarily focus on candidates and elections, rather than divisive political issues where we’ve seen more attempted interference.
“Some laws only apply during elections, although information campaigns are nonstop. And there are also important questions about how political campaigns use data and targeting.
“We believe legislation should be updated to reflect the reality of the threats and set standards for the whole industry.”
On the subject of harmful content, he said, Facebook continually reviewed its policies with experts, but added: “At our scale we’ll always make mistakes and decisions that people disagree with.”
He said Facebook was creating an independent body to enable people to challenge its decisions.
Zuckerberg said effective privacy and data protection needed a “globally harmonised framework”, advocating more countries adopting rules such as GDPR – the EU’s general data protection regulation – as a common framework.
Facebook recently said it would no longer allow content supporting white nationalism and white separatism, in the wake of a white supremacist terror attack on mosques in New Zealand that left 50 people dead.
It was one of the social media sites that had come under fire for failing to promptly take down video footage of the attack, which had been livestreamed via a mobile app.
Facebook had also been caught up in a scandal around “improper data-gathering practices” during elections in the US by the defunct political consultancy Cambridge Analytica.