That damaging accusation was published in a report detailing the findings of an 18-month investigation into online disinformation and “fake news” by the U.K. Parliament’s Digital, Culture, Media and Sport Committee. On Monday, the committee concluded that Facebook should no longer be able to regulate itself because it deliberately broke privacy and competition laws.
Lawmakers claimed the social network failed to prevent Russia from manipulating elections and actively sought to obstruct the committee’s inquiry into its business practices. The report said founder and CEO Mark Zuckerberg refused on three occasions to give evidence, sending junior employees to field questions from the committee instead. The committee concluded that the time has come for an independent regulator to enforce a compulsory code of ethics.
"Companies like Facebook should not be allowed to behave like 'digital gangsters' in the online world, considering themselves to be ahead of and beyond the law," the report said. "Facebook's handling of personal data, and its use for political campaigns, are prime and legitimate areas for inspection by regulators, and it should not be able to evade all editorial responsibility for the content shared by its users across its platforms."
Throughout the report, lawmakers accused Facebook of prioritizing shareholder profits over user privacy rights. The committee even argued that the social network could have avoided the Cambridge Analytica data scandal, had it respected the terms of an agreement struck with U.S. regulators in 2011 to limit how much developers can access user data.
The report also dismissed Zuckerberg’s claim that the social network has never sold user data as “simply untrue.” Citing internal Facebook documents obtained through litigation with software firm Six4Three, the committee concluded that Facebook "intentionally and knowingly" sold private data without asking users for their permission.
Facebook responded by saying it is “open to meaningful regulation,” adding that it “supports the committee’s recommendation for electoral law reform.” The company’s public policy manager Karim Palant said the social network has been investing in people, machine learning and artificial intelligence to tackle the problem and is now better equipped than it was a year ago.
“We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for seven years,” Palant said. “While we still have more to do, we are not the same company we were a year ago.”