“Online Harms White Paper” Consultation closes 1 July 2019

16 April 2019

Author: Olivia O'Kane
Practice Area: Media and Defamation


1. Mark Zuckerberg’s argument for standardised regulation of the Internet

On 31 March 2019, Mark Zuckerberg published an opinion piece in the Washington Post, "The Internet Needs Rules", which he also posted to his Facebook page. He called for "governments and regulators" around the world to help unify internet regulation.

On 2 April 2019 he arrived in Dublin to meet Irish politicians at Facebook's European headquarters (Facebook Ireland Limited) to discuss the regulation of social media, transparency in political advertising, and the safety of young people and vulnerable adults.

Zuckerberg said:

“Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks.”

But he says companies alone should not be the ones setting the rules on what is acceptable.

“I believe we need a more active role for governments and regulators. By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms.”

He argued that four areas required a standardised and global approach:

1. Harmful content:

He suggested the creation of an independent body to review Facebook’s content moderation decisions. He also wants the formation of a set of standardised rules for harmful content.

2. Election integrity:

He explained that existing laws are inconsistent and inadequate, differing from country to country, which creates complexities for electoral advertising and media.

3. Privacy:

As for privacy, he argued that a single standardised approach to internet governance could be built on the EU General Data Protection Regulation, which in the UK is supplemented by the Data Protection Act 2018.

4. Data portability:

He argues that legislation could establish and protect data portability rights. This would empower users with access to their data, and give them the ability to choose to take that data to other platforms.

Zuckerberg wrote:

“I believe Facebook has a responsibility to help address these issues, and I’m looking forward to discussing them with lawmakers around the world. We’ve built advanced systems for finding harmful content, stopping election interference and making ads more transparent.

But people shouldn’t have to rely on individual companies addressing these issues by themselves. We should have a broader debate about what we want as a society and how regulation can help. These four areas are important, but, of course, there’s more to discuss.”

2. Approaches to Regulating the Internet & Social Media

Social media companies rely on their members or users to create content on their platforms. The legal framework that has regulated social media organisations to date has relied upon those users to identify illegal or harmful content, and it has been for those users to regulate the social media companies, either by notifying them to take down content or by logging complaints to have content blocked.

If social media companies have to date been held liable in law and essentially regulated by their users, then a call by one of the world's largest social media giants to be more heavily regulated, and to shift the burden of regulation from users to governments, can only be a good thing.

With the growth of the internet and social media platforms, the ability to navigate the complexities of platform governance and legal liability has been complicated and at times burdensome.

What Facebook suggests is increased engagement between its users, governments and civil society to help achieve global unification and standardisation of internet governance and the policing of harmful content and privacy violations.

A standardised regulatory approach, as opposed to a jigsaw of worldwide legal and regulatory frameworks, would in his view help tame the beast that is the internet.

Governments around the world have been trying to find solutions following the recent atrocity in New Zealand and the tragic death of Molly Russell, whose father Ian raised awareness of the consequences for young children of harmful content on Instagram, as well as many other horrific and tragic events streamed on social media platforms, the harms flowing from content uploaded by users, and the damage done to democracies by fake news and disinformation.

The International Grand Committee is a worldwide gathering of parliamentarians who meet to discuss the regulation of social media platforms. It held its first meeting in Westminster last November and will meet again in Ottawa, Canada, in May 2019. The International Grand Committee on Disinformation and "Fake News" comprises 24 members from nine international parliaments.

3. Discussion of The White Paper

The UK Government wants to take the lead. A white paper on online harms, published jointly by the Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, proposes that strict new rules be introduced requiring firms to take responsibility for their users and their safety, as well as for the content that appears on their services. The White Paper sets out a programme of action to tackle content or activity that harms individual users, particularly children.

The UK proposes to be the first country to establish a coherent and effective approach to tackling online harm, while also ensuring that such a regime reflects a commitment to a free, open and secure internet.

What is proposed is a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. Compliance with this duty of care will be overseen and enforced by an independent regulator. The regulator will have a suite of powers to take effective enforcement action against companies that have breached their statutory duty of care. This may include the powers to issue substantial fines and to impose liability on individual members of senior management.

They propose that the regulatory framework should apply to companies that allow users to share or discover user-generated content or interact with each other online. These services are offered by a very wide range of companies of all sizes, including social media platforms, file hosting sites, public discussion forums, messaging services and search engines.

An independent regulator will implement, oversee and enforce the new regulatory framework. It will have sufficient resources and the right expertise and capability to perform its role effectively.

To support this, the regulator will work closely with UK Research and Innovation (UKRI) and other partners to improve the evidence base. The regulator will set out expectations for companies to do what is reasonably practicable to counter harmful activity or content, depending on the nature of the harm, the risk of the harm occurring on their services, and the resources and technology available to them. It will also have a range of enforcement powers, including the power to levy substantial fines, to ensure that all companies in scope of the regulatory framework fulfil their duty of care.

The Paper suggests that companies should invest in technology to develop safety tools that reduce the burden on users to stay safe online. Users want to be empowered to keep themselves and their children safe online, but the Government recognises that there is currently insufficient support in place and that many feel vulnerable online.

While companies are supporting a range of positive initiatives, there is insufficient transparency about the level of investment and the effectiveness of different interventions. The regulator will have oversight of this investment.

The government suggests it will develop a new online media literacy strategy. This will be developed in broad consultation with stakeholders, including major digital, broadcast and news media organisations, the education sector, researchers and civil society. This strategy seeks to ensure a coordinated and strategic approach to online media literacy education and awareness for children, young people and adults.

4. Analysis of the White Paper

The UK will be the first country to establish a regulatory framework to tackle some of the issues mentioned above. However, enforcing these regulations may prove to be problematic. The UK is scheduled to leave the European Union on 31 October 2019. As such, the UK will face a jurisdictional problem in trying to enforce these regulations against companies outside the UK, perhaps registered in the EU, Asia or America.

Also, a crucial component of the white paper is the commitment to enhancing both the online and physical safety of children. However, the framework will operate in conjunction with the EU's e-Commerce Directive, which means that companies will still have to be put on notice before they are required to remove any content. This raises the question of whether the White Paper actually proposes any heightened protection for children through the new framework. A further difficulty in this area is enforcement, which the white paper does little to address: effective enforcement would require significant funding, and it is doubtful whether the Government would provide it.

Furthermore, one of the white paper's main focuses is the "protection of users from harm and not judging what is true and what is not". This is a broad and somewhat vague description, with no real definition of the type of harm to be protected against, and it carries an inherent risk of unnecessarily restricting freedom of speech. It will leave the regulator with a challenging balancing act in deciding what constitutes disinformation. The legal framework that currently applies to online content enables courts to adjudicate independently on what is unlawful harm and what is not. It will be interesting to see what powers any regulator is given and how it is constituted.

Ultimately creating a public discussion about the issues raised in the White Paper can only be positive.

5. Consultation

The UK Government's new Online Harms White Paper proposes a new regulatory framework for social media. The 102-page Paper is very detailed. The Government is inviting individuals and organisations to provide their views by responding to the questions set out throughout the White Paper. The consultation will be open for 12 weeks, from 8 April 2019 to 23:59 on 1 July 2019. You can respond online via the following link: https://dcms.eu.qualtrics.com/jfe/form/SV_5nm7sPoxilSoTg9
