Policy background
- As use of the internet has expanded, public concern has grown about the prevalence and spread of illegal online content, as well as the risks to children’s safety arising from exposure to inappropriate content, such as pornography.
- Research and public polling have also highlighted users’ concerns about how platforms apply their own terms and conditions, and how they respond to users’ complaints.
Existing Regulation of Online Services
- Prior to the enactment of this Act, most user-to-user and search services operating in the United Kingdom were not subject to any regulation concerning user safety in relation to user-generated content.
- A limited number of user-to-user services which are used in the United Kingdom are subject to the Video Sharing Platform regime set out in Part 4B of the Communications Act 2003 (the "VSP Regime"). Only services which meet the legal definition of a video sharing platform[1] and have the required connection with the United Kingdom[2] are in scope.
- Services subject to the VSP Regime are required to take measures to:
- Protect the public from videos and adverts likely to incite violence or hatred against a person on specified grounds including sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political opinion, membership of a national minority, disability, age and sexual orientation;
- Protect the public from material in videos or adverts where the inclusion of that material would be a criminal offence under laws relating to terrorism, child sexual abuse material, and racism and xenophobia;
- Protect under 18s from videos and adverts which have been, or would be likely to be, given an R18 certificate,[3] or which have been, or would be likely to be, refused a certificate by the British Board of Film Classification;[4] and
- Protect under 18s from videos and adverts containing material that might impair their physical, mental or moral development.
- The VSP Regime does not set standards for the content of individual videos.
- OFCOM are responsible for enforcing video sharing platform providers’ compliance with their obligations under the VSP Regime. OFCOM have the power to give enforcement notifications (which may set out the steps required to remedy a contravention)[5] and to impose financial penalties of up to £250,000 or 5% of qualifying revenue, whichever is greater.[6] In certain circumstances, OFCOM may also suspend and/or restrict a service.[7]
The Online Harms White Paper
- The Online Harms White Paper, published in April 2019, set out the Government’s intention to introduce a new regulatory framework to improve protections for users online. It was proposed that this objective would be achieved via a new duty of care on companies, and an independent regulator responsible for overseeing the online safety framework. The White Paper proposed that the regulatory framework should follow a proportionate and risk-based approach, and that the duty of care should be designed to ensure that all in-scope companies had appropriate systems and processes in place to address harmful content and improve the safety of their users.
- A public consultation on the White Paper proposals ran from 8 April 2019 to 1 July 2019. It received over 2,400 responses from a wide range of respondents, including companies in the technology industry (from large tech giants to small and medium-sized enterprises), academics, think tanks, children’s charities, rights groups, publishers, governmental organisations, and individuals.
- In February 2020, the Government published an initial response to the consultation, providing an in-depth breakdown of the responses to each of the 18 consultation questions asked in relation to the White Paper proposals. The response also set out the Government's direction of travel in a number of key areas, including:
- How the new regulatory framework would ensure protections for users’ rights by including safeguards in the legislation;
- The differentiated approach to illegal and legal but harmful material;
- How the new requirements would be proportionate and risk-based, including clarifying who would not be captured by the proposed scope;
- A commitment to delivering a higher level of protection for children; and
- That the Government was minded to appoint OFCOM as the new regulator.
- In December 2020, the full Government response to the consultation was published, outlining the final policy position for the online safety regulatory framework, and the Government's intention to enshrine it in law through the Online Safety Act. The response was split into seven parts:
- Part 1 stated that the regulatory framework would apply to companies whose services host user-generated content or facilitate interaction between users, one or more of whom is based in the United Kingdom, as well as to search engines.
- Part 2 outlined that the legislation would set out a general definition of the harmful content and activity covered by the duty of care. It also set out how all companies in scope would be required to understand the risk of harm to individuals on their services, and to put in place appropriate systems and processes to improve user safety and monitor their effectiveness.
- Part 3 confirmed that OFCOM would be appointed as the regulator, and outlined their regulatory functions and funding.
- Part 4 explained the proposed functions of the regulator, including their duty to set out codes of practice, enforcement powers, and user redress mechanisms.
- Part 5 outlined the role of technology, education, and awareness in tackling online harms.
- Part 6 explained how the new regulatory framework would fit into the wider digital landscape, including as part of the Government’s Digital Strategy.
- Part 7 provided the next steps for the regime, including the expected timings for the Online Safety Act.
Interim Codes of Practice
- The Government published two interim codes of practice covering terrorist content and child sexual exploitation and abuse (CSEA) content online alongside the full government response. These interim codes set out the voluntary action the Government expects providers to take to tackle the most serious categories of harmful content online before OFCOM issues codes of practice using the powers conferred by the Act.
Government Report on Transparency Reporting
- The first government report on transparency reporting in relation to online harms was published alongside the full government response. This presented the recommendations of the multi-stakeholder transparency working group, set up in October 2019, about how the transparency framework could work in practice within the new online harms regulatory framework.
Pre-legislative Scrutiny
- In May 2021 the Online Safety Bill was published in draft. A Joint Committee of MPs and Peers, chaired by Damian Collins MP, was established on 23 July 2021 to carry out pre-legislative scrutiny. The Joint Committee took evidence from over 50 witnesses and received over 200 pieces of written evidence. The Committee published its report and recommendations on 10 December 2021.
- The Government responded to the report on 17 March 2022, confirming that a number of substantive changes were made to the Act at introduction, including, but not limited to:
- Including priority offences in primary, rather than secondary legislation;
- Including a new standalone provision for non-user-generated pornography, meaning all providers of online pornography are in scope of the legislation;
- Including the Law Commission’s recommendations for new online communications offences;
- Amending the senior manager liability offence so that it would be commenced three months after Royal Assent;
- Including a new duty on Category 1[8] providers to offer optional user verification and user empowerment tools on their sites;
- Including a new duty on Category 1 and Category 2A providers to protect users from fraudulent advertising online; and
- Simplifying the definition of non-designated harmful content, and requiring Category 1 providers only to address categories of content that are legal but harmful to adults, which are designated in secondary legislation.
The Online Safety Act
- The legislation imposes legal requirements on:
- Providers of internet services which allow users to encounter content generated, uploaded or shared by other users, i.e. user-generated content ("user-to-user services");
- Providers of search engines which enable users to search multiple websites and databases ("search services"); and
- Providers of internet services on which provider pornographic content is published or displayed.
- The legislation requires providers of regulated user-to-user and search services to:
- Assess the risks of harm to those users present on the service;
- Take steps to mitigate and manage the risks of harm to individuals arising from illegal content and activity, and (for services likely to be accessed by children) content and activity that is harmful to children. Providers will also need to assess the risk of their services being used for the commission or facilitation of a priority offence and to design and operate their services to mitigate this risk;
- Put in place systems and processes which allow users and affected persons to report specified types of content and activity to the service provider;
- Establish a transparent and easy to use complaints procedure which allows for complaints of specified types to be made;
- Have particular regard to the importance of protecting users’ legal rights to freedom of expression and protecting users from a breach of a legal right to privacy when implementing safety policies and procedures; and
- Put in place systems and processes designed to ensure that detected but unreported CSEA content is reported to the National Crime Agency (NCA).
- Those user-to-user services which meet the Category 1 threshold conditions, specified by the Secretary of State, are subject to additional legal requirements, including to:
- Improve transparency and accountability and protect free speech, by putting in place systems and processes to ensure that they only remove or restrict access to content, or ban or suspend users, where this is allowed by their terms of service or where they otherwise have a legal obligation to do so;
- Carry out an assessment of the impact that safety policies and procedures will have on users’ legal rights to freedom of expression, including on access to and treatment of news publisher and journalistic content, and users’ privacy, and demonstrate the steps they have taken to mitigate any impact;
- Specify in a public statement the steps taken to protect users’ legal rights to freedom of expression and users’ privacy;
- Put in place systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about how to treat such content;
- Put in place systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about how to treat such content;
- Notify and offer a right of appeal to a recognised news publisher before removing or moderating its content, or taking action against its account;
- Put in place a dedicated and expedited complaints procedure that ensures that the decisions of the service provider to take action against a user because of a particular piece of journalistic content can be challenged;
- Offer optional user verification and user empowerment tools to adults on their sites; and proactively ask their registered adult users at the first possible opportunity how they would like the user empowerment content tools to be applied; and
- Put in place proportionate systems and processes to prevent the risk of users encountering fraudulent adverts.
- Those search services which meet the Category 2A threshold conditions are under a duty to produce annual transparency reports and to put in place proportionate systems and processes to prevent the risk of users encountering fraudulent adverts.
- Category 1, 2A, and 2B services are also under duties to set out their policies on disclosing information to the parents of deceased child users, to provide details about this in their terms of service or a publicly available statement, and to operate a complaints procedure in relation to these duties.
- The Act confers new powers on OFCOM establishing them as the online safety regulator. OFCOM is responsible for enforcing the legal requirements imposed on service providers. The Act gives OFCOM the power to compel in-scope providers to provide information and to require an individual from an in-scope provider to attend an interview; powers of entry and inspection; and the power to require a service provider to undertake, and pay for, a report from a skilled person. OFCOM may also require information, or produce a report, in relation to the death of a child and share this information with a coroner.
- The Act confers new powers on OFCOM to require regulated user-to-user and search services to use accredited technology to deal with CSEA and terrorism content, or make best endeavours to develop or source technology to deal with CSEA content, where necessary and proportionate.
- The new powers conferred on OFCOM also include the power to give enforcement notifications (which may set out the steps required to remedy a contravention) and the power to impose financial penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater. OFCOM can also, in certain circumstances, apply to the Courts for an order imposing business disruption measures on a provider.
- The Act requires OFCOM to produce codes of practice for service providers, setting out recommended steps that providers can take in order to comply with the legal requirements described above. A provider may take measures other than those recommended in the codes of practice; however, a provider which takes the steps recommended in the relevant code of practice for complying with a legal obligation will be treated as having complied with that obligation.
- The Act also requires providers of internet services which make pornographic material available by way of the service (as opposed to enabling users to generate or share such content) to use age verification or age estimation (or both), to ensure that children are not normally able to encounter that pornographic content.
- The Act creates a false communications offence and a threatening communications offence. It also amends the existing communications offences in the Malicious Communications Act 1988, the Malicious Communications (Northern Ireland) Order 1988, and Section 127 of the Communications Act 2003 to reflect this. The Act also creates a new "cyberflashing" offence, an offence of sending or showing flashing images electronically to people with epilepsy, and an offence of encouraging or assisting serious self-harm, and it inserts new intimate image abuse offences into the Sexual Offences Act 2003.
[1] The legal test is set out in Section 368S of the Communications Act 2003. OFCOM have produced guidance on the definition of a video sharing platform, which is available on the OFCOM website.
[2] Sections 368S(3)-(5) of the Communications Act 2003 set out when a video sharing platform will be regarded as having the required connection with the United Kingdom for the purposes of the VSP Regime. OFCOM have produced guidance on when a video sharing platform will be regarded as established in the United Kingdom, which is available on the OFCOM website.
[3] The R18 category is a special and legally-restricted classification, primarily for explicit videos of consenting sex or strong fetish material involving adults, and where the primary purpose of the material is sexual arousal or stimulation.
[4] The BBFC’s current guidelines outline that material likely to be unsuitable for classification could include: material which is in breach of the criminal law (or created through the commission of a criminal offence); material that appears to risk harm to individuals or to society, such as the detailed portrayal of violence, dangerous acts or illegal drug use; and the portrayal of, or invitations to conduct, sadistic violence, rape or other non-consensual violent sexual behaviour, or other harmful violent activities.
[5] Sections 368Z2 and 368Z3 of the Communications Act 2003.
[6] Section 368Z4 of the Communications Act 2003.
[7] Sections 368Z5 and 368Z6 of the Communications Act 2003.
[8] Category 1 services are a subset of user-to-user services that are subject to additional duties.