The UK's Digital, Culture, Media and Sport (DCMS) select committee has concluded an 18-month investigation into fake news, data sharing, and disinformation, and its findings are heavily critical of Facebook's business practices and data-sharing arrangements.
The 111-page report (.PDF), published by the committee, accuses Facebook of putting profit "before anything else."
Facebook CEO Mark Zuckerberg was deemed a figure who continually "fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world's biggest companies."
The investigation will likely make uncomfortable reading for the social networking giant: a large section of the report focuses on how Facebook handled the Cambridge Analytica scandal, as well as the company's dubious relationship with app developers over the past decade.
The committee says that Facebook deliberately sought to "frustrate" it during the investigation by "giving incomplete, disingenuous and at times misleading answers to our questions," and the report goes so far as to say that Zuckerberg has shown "contempt" towards the governing body.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world," Damian Collins MP, Chair of the DCMS Committee said. "Evidence uncovered by my committee shows he still has questions to answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information."
Despite Facebook's apparent attempts to scupper the investigation, the report concludes that "Facebook intentionally and knowingly violated both data privacy and anti-competition laws" and also calls for an overhaul of regulation in the industry.
British Parliament on Wednesday published a cache of secret Facebook documents it obtained last month from a company suing the social network. Collins said a recurring theme of the papers was the "idea of linking access to friends data to the financial value of the [app] developers' relationship with Facebook."
The key takeaways of the report are below:
- Facebook is either "unwilling" or "unable" to prevent malicious content, including revenge porn, hate speech, and propaganda spread by sources including Russia.
- Facebook should not be allowed to behave like 'digital gangsters' online, and company representatives should not be able to "consider themselves to be ahead of and beyond the law."
- The company has "a fundamental weakness in managing its responsibilities to the people whose data is used for its own commercial interests," and is only moved to act when "serious breaches become public."
- The Cambridge Analytica scandal was facilitated by Facebook's policies and a business model which made "data abuses easy."
- Six4Three was among the app developers deemed "too successful," who were then "starved" of data by Facebook, while others were forced to pay a high price for access to information.
- Facebook was also willing to "override its users' privacy settings in order to transfer data to some app developers."
- Facebook has taken "aggressive positions" against direct competitors, leading to data access denial -- or acquisitions.
- Facebook gained a "huge financial advantage" by collecting user data from sources including Android handsets and Onavo, and also considered granting Tinder access to user data in return for using one of its trademarks, Moments.
The committee has suggested establishing a "Compulsory Code of Ethics" for technology companies. Overseen by an independent regulator, the code would give law enforcement the power to launch legal cases against organizations that fail to meet standards designed to ensure user trust and data privacy, as well as to tackle misinformation and fake news.
In addition, the report says that social media networks should be "obliged to take down known sources of harmful content, including proven sources of disinformation."
Should companies fail to comply, the committee says they should face heavy fines -- and a tech 'levy' of two percent should be introduced to pay for the extra workload of UK regulators.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalized 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day," Collins says. "Much of this is directed from agencies working in foreign countries, including Russia. The big tech companies are failing in the duty of care they owe to their users to act against harmful content and to respect their data privacy rights."
Facebook said in a statement that the company was "pleased to have made a significant contribution" to the investigation, and added that it was "open to meaningful regulation" and supportive of the committee's recommendation for electoral law reform.
"We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for seven years," Facebook said. "No other channel for political advertising is as transparent and offers the tools that we do."