‘DIGITAL GANGSTER’

It seems clear that Facebook was, at the very least, in violation of its Federal Trade Commission settlement.

It is the most damning report ever made by a foreign government on Facebook. A 108-page report by the United Kingdom Parliament has likened Facebook to a gangster – a ‘digital gangster’.

“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law,” said the UK House of Commons Digital, Culture, Media and Sport Committee in its report, “Disinformation and ‘Fake News’: Final Report,” released Feb. 18, 2019.

Said the Guardian newspaper: “The scale of the report – it drew from 170 written submissions and evidence from 73 witnesses who were asked more than 4,350 questions – is without precedent. And it’s what contributes to making its conclusions so damning: that the government must now act. That Facebook must be regulated. That Britain’s electoral laws must be re-written from the bottom up; the report is unequivocal, they are not ‘fit for purpose’. And that the government must now open an independent investigation into foreign interference in all British elections since 2014.”

Facebook is one of the world’s largest social media companies, if not the largest, with 1.5 billion daily active users on mobile phones alone, out of 2.32 billion monthly active users. Five new profiles are created every second. About 300 million photos are uploaded daily. The average time spent on Facebook per visit is 20 minutes. Facebook has a market cap of $463.7 billion, the fifth or sixth largest in the world, behind Microsoft, Apple, Amazon, Alphabet, and Berkshire Hathaway.

Excerpts and recommendations from the UK Parliament’s Report:

Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.

The Competition and Markets Authority (CMA) should conduct a comprehensive audit of the operation of the advertising market on social media.

Given the contents of the Six4Three documents that we have published, it should also investigate whether Facebook specifically has been involved in any anti-competitive practices and conduct a review of Facebook’s business practices towards other developers, to decide whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail.

Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites.

A new category of tech company should be formulated, which tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher’.

This approach would see the tech companies assume legal liability for content identified as harmful after it has been posted by users.

By choosing not to appear before the Committee, Mark Zuckerberg has shown contempt towards both the UK Parliament and the ‘International Grand Committee’, involving members from nine legislatures from around the world.

Clear legal liabilities should be established for tech companies to act against harmful or illegal content on their sites. There is now an urgent need to establish independent regulation.

A compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content. The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries.

A Code of Ethics should be similar to the Broadcasting Code issued by Ofcom—which is based on the guidelines established in section 319 of the 2003 Communications Act.

The Code of Ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for tech companies themselves to identify.

The process should establish clear, legal liability for tech companies to act against agreed harmful and illegal content on their platform and such companies should have relevant systems in place to highlight and remove ‘types of harm’ and to ensure that cyber security structures are in place.

If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code.

The Cambridge Analytica scandal was facilitated by Facebook’s policies.

If Facebook had fully complied with that settlement, the scandal would not have happened. The US Federal Trade Commission’s (FTC) 2011 complaint charged Facebook with failing to protect users’ data and with letting app developers gain as much access to user data as they liked, without restraint, and stated that Facebook had built its company in a way that made data abuses easy.

Evidence obtained from the Six4Three court documents indicates that Facebook was willing to override its users’ privacy settings in order to transfer data to some app developers; to charge some developers high prices in advertising in exchange for that data; and to starve some developers – such as Six4Three – of that data, thereby causing them to lose their business. — Tony Lopez