
Online Safety Act 2023

Status: This is the original version (as it was originally enacted).

CHAPTER 2
Providers of user-to-user services: duties of care

User-to-user services: which duties apply, and scope of duties

7 Providers of user-to-user services: duties of care

(1) Subsections (2) to (6) apply to determine which of the duties set out in this Chapter (and, in the case of combined services, Chapter 3) must be complied with by providers of regulated user-to-user services.

(2) All providers of regulated user-to-user services must comply with the following duties in relation to each such service which they provide—

(a) the duties about illegal content risk assessments set out in section 9,

(b) the duties about illegal content set out in section 10(2) to (8),

(c) the duty about content reporting set out in section 20,

(d) the duties about complaints procedures set out in section 21,

(e) the duties about freedom of expression and privacy set out in section 22(2) and (3), and

(f) the duties about record-keeping and review set out in section 23(2) to (6).

(3) Additional duties must be complied with by providers of particular kinds of regulated user-to-user services, as follows.

(4) All providers of regulated user-to-user services that are likely to be accessed by children must comply with the following duties in relation to each such service which they provide—

(a) the duties about children’s risk assessments set out in section 11, and

(b) the duties to protect children’s online safety set out in section 12(2) to (13).

(5) All providers of Category 1 services must comply with the following duties in relation to each such service which they provide—

(a) the duty about illegal content risk assessments set out in section 10(9),

(b) the duty about children’s risk assessments set out in section 12(14),

(c) the duties about assessments related to adult user empowerment set out in section 14,

(d) the duties to empower adult users set out in section 15,

(e) the duties to protect content of democratic importance set out in section 17,

(f) the duties to protect news publisher content set out in section 18,

(g) the duties to protect journalistic content set out in section 19,

(h) the duties about freedom of expression and privacy set out in section 22(4), (6) and (7), and

(i) the duties about record-keeping set out in section 23(9) and (10).

(6) All providers of combined services must comply with the following duties in relation to the search engine of each such service which they provide—

(a) if the service is not a Category 2A service and is not likely to be accessed by children, the duties set out in Chapter 3 referred to in section 24(2);

(b) if the service is not a Category 2A service and is likely to be accessed by children, the duties set out in Chapter 3 referred to in section 24(2) and (4);

(c) if the service is a Category 2A service not likely to be accessed by children, the duties set out in Chapter 3 referred to in section 24(2) and (5);

(d) if the service is a Category 2A service likely to be accessed by children, the duties set out in Chapter 3 referred to in section 24(2), (4) and (5).

(7) For the meaning of “likely to be accessed by children”, see section 37.

(8) For the meaning of “Category 1 service”, see section 95 (register of categories of services).
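[Editorial note, not part of the Act.] Section 7 is in substance a decision table: the baseline duties in subsection (2) apply to every regulated user-to-user service, and subsections (4) to (6) layer further duties on by service attribute. A minimal sketch of that mapping follows; the Service attributes and duty labels are the editor's shorthand, keyed to the section numbers above, not statutory terms.

    from dataclasses import dataclass

    @dataclass
    class Service:
        likely_accessed_by_children: bool = False   # see section 37
        category_1: bool = False                    # see section 95
        combined: bool = False                      # user-to-user service with a search engine
        category_2a: bool = False                   # see section 95

    def applicable_duties(svc: Service) -> list[str]:
        # Section 7(2): baseline duties for every regulated user-to-user service.
        duties = ["s.9", "s.10(2)-(8)", "s.20", "s.21", "s.22(2)-(3)", "s.23(2)-(6)"]
        # Section 7(4): services likely to be accessed by children.
        if svc.likely_accessed_by_children:
            duties += ["s.11", "s.12(2)-(13)"]
        # Section 7(5): additional duties for Category 1 services.
        if svc.category_1:
            duties += ["s.10(9)", "s.12(14)", "s.14", "s.15", "s.17", "s.18",
                       "s.19", "s.22(4),(6),(7)", "s.23(9)-(10)"]
        # Section 7(6): for combined services, Chapter 3 duties apply to the
        # search engine, selected by section 24(2), (4) and (5).
        if svc.combined:
            search = ["s.24(2)"]
            if svc.likely_accessed_by_children:
                search.append("s.24(4)")
            if svc.category_2a:
                search.append("s.24(5)")
            duties += ["search engine: " + s for s in search]
        return duties

    # Example: a Category 1, Category 2A combined service for adults only.
    print(applicable_duties(Service(category_1=True, combined=True, category_2a=True)))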

8 Scope of duties of care

(1) A duty set out in this Chapter which must be complied with in relation to a user-to-user service that includes regulated provider pornographic content does not extend to—

(a) the regulated provider pornographic content, or

(b) the design, operation or use of the service so far as relating to that content.

See Part 5 for the duties which relate to regulated provider pornographic content, and the meaning of that term.

(2) A duty set out in this Chapter which must be complied with in relation to a combined service does not extend to—

(a) the search content of the service,

(b) any other content that, following a search request, may be encountered as a result of subsequent interactions with internet services, or

(c) anything relating to the design, operation or use of the search engine.

(3) A duty set out in this Chapter which must be complied with in relation to a user-to-user service extends only to—

(a) the design, operation and use of the service in the United Kingdom, and

(b) in the case of a duty that is expressed to apply in relation to users of a service, the design, operation and use of the service as it affects United Kingdom users of the service.

Illegal content duties for user-to-user services

9 Illegal content risk assessment duties

(1) This section sets out the duties about risk assessments which apply in relation to all regulated user-to-user services.

(2) A duty to carry out a suitable and sufficient illegal content risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an illegal content risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient illegal content risk assessment relating to the impacts of that proposed change.

(5) An “illegal content risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of individuals who are users of the service encountering the following by means of the service—

(i) each kind of priority illegal content (with each kind separately assessed), and

(ii) other illegal content,

taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of the service being used for the commission or facilitation of a priority offence;

(d) the level of risk of harm to individuals presented by illegal content of different kinds or by the use of the service for the commission or facilitation of a priority offence;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of illegal content or the use of the service for the commission or facilitation of a priority offence, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by individuals;

(g) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 98 which relate to the risk of harm to individuals presented by illegal content.

(7) See also—

(a) section 23(2) and (10) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).
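[Editorial note, not part of the Act.] Subsection (5) in effect specifies the fields an illegal content risk assessment must cover. A record structure along the following lines, an illustrative assumption rather than any statutory form, with field names of the editor's choosing, would let a provider check that each matter in paragraphs (a) to (h) and each kind of priority illegal content is separately addressed:

    from dataclasses import dataclass

    @dataclass
    class IllegalContentRiskAssessment:
        user_base: str                                  # s.9(5)(a)
        # s.9(5)(b): each kind of priority illegal content assessed
        # separately, plus "other illegal content", e.g. {"terrorism": "high"}.
        encounter_risk_by_kind: dict[str, str]
        priority_offence_facilitation_risk: str         # s.9(5)(c)
        harm_risk_by_kind: dict[str, str]               # s.9(5)(d)
        higher_risk_functionalities: list[str]          # s.9(5)(e)
        use_patterns_and_impact: str                    # s.9(5)(f)
        nature_and_severity_of_harm: str                # s.9(5)(g)
        design_and_operation_factors: str               # s.9(5)(h)

        def unassessed_priority_kinds(self, priority_kinds: list[str]) -> list[str]:
            # s.9(5)(b)(i) requires each kind of priority illegal content to
            # be separately assessed; flag any kind that is missing.
            return [k for k in priority_kinds if k not in self.encounter_risk_by_kind]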

10 Safety duties about illegal content

(1) This section sets out the duties about illegal content which apply in relation to regulated user-to-user services (as indicated by the headings).

All services

(2) A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to—

(a) prevent individuals from encountering priority illegal content by means of the service,

(b) effectively mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence, as identified in the most recent illegal content risk assessment of the service, and

(c) effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service (see section 9(5)(g)).

(3) A duty to operate a service using proportionate systems and processes designed to—

(a) minimise the length of time for which any priority illegal content is present;

(b) where the provider is alerted by a person to the presence of any illegal content, or becomes aware of it in any other way, swiftly take down such content.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used as well as content present on the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features,

(c) policies on terms of use,

(d) policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content,

(e) content moderation, including taking down content,

(f) functionalities allowing users to control the content they encounter,

(g) user support measures, and

(h) staff policies and practices.

(5) A duty to include provisions in the terms of service specifying how individuals are to be protected from illegal content, addressing each paragraph of subsection (3), and (in relation to paragraph (a)) separately addressing terrorism content, CSEA content (see section 59 and Schedule 6) and other priority illegal content.

(6) A duty to apply the provisions of the terms of service referred to in subsection (5) consistently.

(7) A duty to include provisions in the terms of service giving information about any proactive technology used by a service for the purpose of compliance with a duty set out in subsection (2) or (3) (including the kind of technology, when it is used, and how it works).

(8) A duty to ensure that the provisions of the terms of service referred to in subsections (5) and (7) are clear and accessible.

Additional duty for Category 1 services

(9) A duty to summarise in the terms of service the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).

Interpretation

(10) In determining what is proportionate for the purposes of this section, the following factors, in particular, are relevant—

(a) all the findings of the most recent illegal content risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to individuals), and

(b) the size and capacity of the provider of a service.

(11) In this section “illegal content risk assessment” has the meaning given by section 9.

(12) See also, in relation to duties set out in this section, section 22 (duties about freedom of expression and privacy).
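[Editorial note, not part of the Act.] Subsection (3) asks for systems and processes with two properties: priority illegal content spends as little time on the service as possible, and illegal content the provider is alerted to comes down swiftly. One possible shape for such a process, a review queue that surfaces suspected priority illegal content first, is sketched below; the queue design is an assumption, and the is_illegal hook stands in for the provider's own judgements about content (cf. section 192):

    import heapq
    import itertools

    _counter = itertools.count()             # FIFO tie-break within a priority band
    _queue: list[tuple[int, int, str]] = []  # (band, tiebreak, content_id)

    def on_alert(content_id: str, suspected_priority_illegal: bool) -> None:
        # s.10(3)(b): an alert from a person, or awareness gained in any other
        # way, puts the content into review; suspected priority illegal
        # content goes into the front band so that it is present for as
        # little time as possible (s.10(3)(a)).
        band = 0 if suspected_priority_illegal else 1
        heapq.heappush(_queue, (band, next(_counter), content_id))

    def review_next(is_illegal) -> None:
        # is_illegal is a stand-in for the provider's judgement about the
        # status of content (see section 192).
        if not _queue:
            return
        _, _, content_id = heapq.heappop(_queue)
        if is_illegal(content_id):
            take_down(content_id)            # swift takedown, s.10(3)(b)

    def take_down(content_id: str) -> None:
        print("taken down:", content_id)     # stand-in for the real action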

User-to-user services likely to be accessed by children

11 Children’s risk assessment duties

(1) This section sets out the duties about risk assessments which apply in relation to regulated user-to-user services that are likely to be accessed by children (in addition to the duties about risk assessments set out in section 9 and, in the case of services likely to be accessed by children which are Category 1 services, the duties about assessments set out in section 14).

(2) A duty to carry out a suitable and sufficient children’s risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep a children’s risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient children’s risk assessment relating to the impacts of that proposed change.

(5) Where a children’s risk assessment of a service identifies the presence of non-designated content that is harmful to children, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(6) A “children’s risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base, including the number of users who are children in different age groups;

(b) the level of risk of children who are users of the service encountering the following by means of the service—

(i) each kind of primary priority content that is harmful to children (with each kind separately assessed),

(ii) each kind of priority content that is harmful to children (with each kind separately assessed), and

(iii) non-designated content that is harmful to children,

giving separate consideration to children in different age groups, and taking into account (in particular) algorithms used by the service and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to children presented by different kinds of content that is harmful to children, giving separate consideration to children in different age groups;

(d) the level of risk of harm to children presented by content that is harmful to children which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the extent to which the design of the service, in particular its functionalities, affects the level of risk of harm that might be suffered by children, identifying and assessing those functionalities that present higher levels of risk, including functionalities—

(i) enabling adults to search for other users of the service (including children), or

(ii) enabling adults to contact other users (including children) by means of the service;

(f) the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically), and the impact of such use on the level of risk of harm that might be suffered by children;

(g) the nature, and severity, of the harm that might be suffered by children from the matters identified in accordance with paragraphs (b) to (f), giving separate consideration to children in different age groups;

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(7) In this section references to risk profiles are to the risk profiles for the time being published under section 98 which relate to the risk of harm to children presented by content that is harmful to children.

(8) See also—

(a) section 23(2) and (10) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).

12 Safety duties protecting children

(1) This section sets out the duties to protect children’s online safety which apply in relation to regulated user-to-user services that are likely to be accessed by children (as indicated by the headings).

All services

(2) A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively—

(a) mitigate and manage the risks of harm to children in different age groups, as identified in the most recent children’s risk assessment of the service (see section 11(6)(g)), and

(b) mitigate the impact of harm to children in different age groups presented by content that is harmful to children present on the service.

(3) A duty to operate a service using proportionate systems and processes designed to—

(a) prevent children of any age from encountering, by means of the service, primary priority content that is harmful to children;

(b) protect children in age groups judged to be at risk of harm from other content that is harmful to children (or from a particular kind of such content) from encountering it by means of the service.

(4) The duty set out in subsection (3)(a) requires a provider to use age verification or age estimation (or both) to prevent children of any age from encountering primary priority content that is harmful to children which the provider identifies on the service.

(5) That requirement applies to a provider in relation to a particular kind of primary priority content that is harmful to children in every case except where—

(a) a term of service indicates (in whatever words) that the presence of that kind of primary priority content that is harmful to children is prohibited on the service, and

(b) that policy applies in relation to all users of the service.

(6) If a provider is required by subsection (4) to use age verification or age estimation for the purpose of compliance with the duty set out in subsection (3)(a), the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child.

(7) Age verification or age estimation to identify who is or is not a child user or which age group a child user is in are examples of measures which (if not required by subsection (4)) may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).

(8) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way it is designed, operated and used as well as content present on the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features,

(c) policies on terms of use,

(d) policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content,

(e) content moderation, including taking down content,

(f) functionalities allowing for control over content that is encountered, especially by children,

(g) user support measures, and

(h) staff policies and practices.

(9) A duty to include provisions in the terms of service specifying—

(a) how children of any age are to be prevented from encountering primary priority content that is harmful to children (with each kind of primary priority content separately covered);

(b) how children in age groups judged to be at risk of harm from priority content that is harmful to children (or from a particular kind of such content) are to be protected from encountering it, where they are not prevented from doing so (with each kind of priority content separately covered);

(c) how children in age groups judged to be at risk of harm from non-designated content that is harmful to children (or from a particular kind of such content) are to be protected from encountering it, where they are not prevented from doing so.

(10) A duty to apply the provisions of the terms of service referred to in subsection (9) consistently.

(11) If a provider takes or uses a measure designed to prevent access to the whole of the service or a part of the service by children under a certain age, a duty to—

(a) include provisions in the terms of service specifying details about the operation of the measure, and

(b) apply those provisions consistently.

(12) A duty to include provisions in the terms of service giving information about any proactive technology used by a service for the purpose of compliance with a duty set out in subsection (2) or (3) (including the kind of technology, when it is used, and how it works).

(13) A duty to ensure that the provisions of the terms of service referred to in subsections (9), (11) and (12) are clear and accessible.

Additional duty for Category 1 services

(14) A duty to summarise in the terms of service the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).
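[Editorial note, not part of the Act.] Subsections (4) to (6) reduce to a per-kind conditional: highly effective age verification or age estimation is required for each kind of primary priority content, unless a term of service prohibits that kind for all users. A minimal sketch, in which the kind labels and parameter names are the editor's assumptions:

    PPC_KINDS = ["pornographic content", "suicide content",
                 "self-harm content", "eating disorder content"]  # illustrative labels only

    def must_use_age_assurance(kind: str,
                               tos_prohibits_for_all_users: bool,
                               identified_on_service: bool) -> bool:
        """Whether s.12(4) requires age assurance for this kind of primary
        priority content."""
        # s.12(5): the requirement is disapplied for a kind of primary
        # priority content only where a term of service prohibits that kind
        # and the policy applies to all users of the service.
        if tos_prohibits_for_all_users:
            return False
        # s.12(4): otherwise age verification or age estimation (or both)
        # must be used to prevent children of any age encountering that kind
        # of content which the provider identifies on the service; s.12(6)
        # requires it to be highly effective at determining whether a
        # particular user is a child.
        return identified_on_service

    # Example: a service whose terms ban three of the four kinds must still
    # use highly effective age assurance for the fourth if it identifies it.
    print(must_use_age_assurance("pornographic content",
                                 tos_prohibits_for_all_users=False,
                                 identified_on_service=True))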

13 Safety duties protecting children: interpretation

(1) In determining what is proportionate for the purposes of section 12, the following factors, in particular, are relevant—

(a) all the findings of the most recent children’s risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to children), and

(b) the size and capacity of the provider of a service.

(2) So far as a duty set out in section 12 relates to non-designated content that is harmful to children, the duty is to be taken to extend only to addressing risks of harm from the kinds of such content that have been identified in the most recent children’s risk assessment (if any have been identified).

(3) References in section 12(3)(b) and (9)(b) and (c) to children in age groups judged to be at risk of harm from content that is harmful to children are references to children in age groups judged to be at risk of such harm as assessed by the provider of a service in the most recent children’s risk assessment of the service.

(4) The duties set out in section 12(3) and (9) are to be taken to extend only to content that is harmful to children where the risk of harm is presented by the nature of the content (rather than the fact of its dissemination).

(5) The duties set out in section 12 extend only to such parts of a service as it is possible for children to access.

(6) For the purposes of subsection (5), a provider is only entitled to conclude that it is not possible for children to access a service, or a part of it, if age verification or age estimation is used on the service with the result that children are not normally able to access the service or that part of it.

(7) In section 12 and this section “children’s risk assessment” has the meaning given by section 11.

(8) See also, in relation to duties set out in section 12, section 22 (duties about freedom of expression and privacy).

Category 1 services

14 Assessment duties: user empowerment

(1) This section sets out the duties about assessments related to adult user empowerment which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 9 and, in the case of Category 1 services likely to be accessed by children, section 11).

(2) A duty to carry out a suitable and sufficient assessment for the purposes of section 15(2) at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep such an assessment up to date.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient assessment for the purposes of section 15(2) relating to the impacts of that proposed change.

(5) An assessment of a service “for the purposes of section 15(2)” means an assessment of the following matters—

(a) the user base;

(b) the incidence of relevant content on the service;

(c) the likelihood of adult users of the service encountering, by means of the service, each kind of relevant content (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(d) the likelihood of adult users with a certain characteristic or who are members of a certain group encountering relevant content which particularly affects them;

(e) the likelihood of functionalities of the service facilitating the presence or dissemination of relevant content, identifying and assessing those functionalities more likely to do so;

(f) the different ways in which the service is used, and the impact of such use on the likelihood of adult users encountering relevant content;

(g) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to strengthen adult users’ control over their interaction with user-generated content, and other systems and processes) may reduce or increase the likelihood of adult users encountering relevant content.

(6) In this section “relevant content” means content to which section 15(2) applies (content to which user empowerment duties set out in that provision apply).

(7) See also—

(a) section 23(9) and (10) (records of assessments), and

(b) Schedule 3 (timing of providers’ assessments).

15 User empowerment duties

(1) This section sets out the duties to empower adult users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which adult users may use or apply if they wish to increase their control over content to which this subsection applies.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to effectively—

(a) reduce the likelihood of the user encountering content to which subsection (2) applies present on the service, or

(b) alert the user to content present on the service that is a particular kind of content to which subsection (2) applies.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) (“control features”) are made available to all adult users and are easy to access.

(5) A duty to operate a service using a system or process which seeks to ensure that all registered adult users are offered the earliest possible opportunity, in relation to each control feature included in the service, to take a step indicating to the provider that—

(a) the user wishes to retain the default setting for the feature (whether that is that the feature is in use or applied, or is not in use or applied), or

(b) the user wishes to change the default setting for the feature.

(6) The duty set out in subsection (5)—

(a) continues to apply in relation to a user and a control feature for so long as the user has not yet taken a step mentioned in that subsection in relation to the feature;

(b) no longer applies in relation to a user once the user has taken such a step in relation to every control feature included in the service.

(7) A duty to include clear and accessible provisions in the terms of service specifying which control features are offered and how users may take advantage of them.

(8) A duty to summarise in the terms of service the findings of the most recent assessment of a service under section 14 (assessments related to the duty set out in subsection (2)).

(9) A duty to include in a service features which adult users may use or apply if they wish to filter out non-verified users.

(10) The features referred to in subsection (9) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to effectively—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
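[Editorial note, not part of the Act.] Subsections (5) and (6) describe a per-user, per-feature obligation that persists until the user has made a choice for every control feature. One way a provider might model that bookkeeping is sketched below; the feature names and the in-memory storage are assumptions:

    CONTROL_FEATURES = ["reduce_sensitive_content", "alert_on_sensitive_content",
                        "filter_non_verified_users"]  # illustrative names only

    # user id -> feature -> whether the default was retained (s.15(5)(a)) or
    # changed (s.15(5)(b)); presence of a key means a step has been taken.
    _choices: dict[str, dict[str, bool]] = {}

    def record_choice(user_id: str, feature: str, keep_default: bool) -> None:
        # Either step, retaining or changing the default, discharges the
        # duty for that feature (s.15(5)).
        _choices.setdefault(user_id, {})[feature] = keep_default

    def features_still_to_offer(user_id: str) -> list[str]:
        # s.15(6)(a): the duty continues per feature until a step is taken;
        # s.15(6)(b): it lapses once every control feature has a choice.
        made = _choices.get(user_id, {})
        return [f for f in CONTROL_FEATURES if f not in made]

Note that, per section 16(6), this offering obligation covers all registered adult users, not only those who register after the duty begins to apply.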

16 User empowerment duties: interpretation

(1) In determining what is proportionate for the purposes of section 15(2), the following factors, in particular, are relevant—

(a) all the findings of the most recent assessment under section 14, and

(b) the size and capacity of the provider of the service.

(2) Section 15(2) applies to content that—

(a) is regulated user-generated content in relation to the service in question, and

(b) is within subsection (3), (4) or (5).

(3) Content is within this subsection if it encourages, promotes or provides instructions for—

(a) suicide or an act of deliberate self-injury, or

(b) an eating disorder or behaviours associated with an eating disorder.

(4) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—

(a) race,

(b) religion,

(c) sex,

(d) sexual orientation,

(e) disability, or

(f) gender reassignment.

(5) Content is within this subsection if it incites hatred against people—

(a) of a particular race, religion, sex or sexual orientation,

(b) who have a disability, or

(c) who have the characteristic of gender reassignment.

(6) The duty set out in section 15(5) applies in relation to all registered adult users, not just those who begin to use a service after that duty begins to apply.

(7) In section 15 and this section—

  • “disability” means any physical or mental impairment;

  • “injury” includes poisoning;

  • “non-verified user” means a user who—

    (a) is an individual, whether in the United Kingdom or outside it, and

    (b) has not verified their identity to the provider of a service;

  • “race” includes colour, nationality, and ethnic or national origins.

(8) In section 15 and this section—

(a) references to features include references to functionalities and settings, and

(b) references to religion include references to a lack of religion.

(9) For the purposes of section 15 and this section, a person has the characteristic of gender reassignment if the person is proposing to undergo, is undergoing or has undergone a process (or part of a process) for the purpose of reassigning the person’s sex by changing physiological or other attributes of sex, and the reference to gender reassignment in subsection (4) is to be construed accordingly.

(10) See also, in relation to duties set out in section 15, section 22 (duties about freedom of expression and privacy).

17 Duties to protect content of democratic importance

(1) This section sets out the duties to protect content of democratic importance which apply in relation to Category 1 services.

(2) A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about—

(a) how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and

(b) whether to take action against a user generating, uploading or sharing such content.

(3) A duty to ensure that the systems and processes mentioned in subsection (2) apply in the same way to a wide diversity of political opinion.

(4) A duty to include provisions in the terms of service specifying the policies and processes that are designed to take account of the principle mentioned in subsection (2), including, in particular, how that principle is applied to decisions mentioned in that subsection.

(5) A duty to ensure that—

(a) the provisions of the terms of service referred to in subsection (4) are clear and accessible, and

(b) those provisions are applied consistently.

(6) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.

(7) For the purposes of this section content is “content of democratic importance”, in relation to a user-to-user service, if—

(a) the content is—

(i) news publisher content in relation to that service, or

(ii) regulated user-generated content in relation to that service; and

(b) the content is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom.

(8) In this section, the reference to “taking action” against a user is to giving a warning to a user, or suspending or banning a user from using a service, or in any way restricting a user’s ability to use a service.

(9) For the meaning of “news publisher content” and “regulated user-generated content”, see section 55.

18 Duties to protect news publisher content

(1) This section sets out the duties to protect news publisher content which apply in relation to Category 1 services.

(2) Subject to subsections (4), (5) and (8), a duty, in relation to a service, to take the steps set out in subsection (3) before—

(a) taking action in relation to content present on the service that is news publisher content, or

(b) taking action against a user who is a recognised news publisher.

(3) The steps referred to in subsection (2) are—

(a) to give the recognised news publisher in question a notification which—

(i) specifies the action that the provider is considering taking,

(ii) gives reasons for that proposed action by reference to each relevant provision of the terms of service,

(iii) where the proposed action relates to news publisher content that is also journalistic content, explains how the provider took the importance of the free expression of journalistic content into account when deciding on the proposed action, and

(iv) specifies a reasonable period within which the recognised news publisher may make representations,

(b) to consider any representations that are made, and

(c) to notify the recognised news publisher of the decision and the reasons for it (addressing any representations made).

(4) If a provider of a service reasonably considers that the provider would incur criminal or civil liability in relation to news publisher content present on the service if it were not taken down swiftly, the provider may take down that content without having taken the steps set out in subsection (3).

(5) A provider of a service may also take down news publisher content present on the service without having taken the steps set out in subsection (3) if that content amounts to a relevant offence (see section 59 and also subsection (10) of this section).

(6) Subject to subsection (8), if a provider takes action in relation to news publisher content or against a recognised news publisher without having taken the steps set out in subsection (3), a duty to take the steps set out in subsection (7).

(7) The steps referred to in subsection (6) are—

(a) to swiftly notify the recognised news publisher in question of the action taken, giving the provider’s justification for not having first taken the steps set out in subsection (3),

(b) to specify a reasonable period within which the recognised news publisher may request that the action is reversed, and

(c) if a request is made as mentioned in paragraph (b)—

(i) to consider the request and whether the steps set out in subsection (3) should have been taken prior to the action being taken,

(ii) if the provider concludes that those steps should have been taken, to swiftly reverse the action, and

(iii) to notify the recognised news publisher of the decision and the reasons for it (addressing any reasons accompanying the request for reversal of the action).

(8) If a recognised news publisher has been banned from using a service (and the ban is still in force), the provider of the service may take action in relation to news publisher content present on the service which was generated or originally published or broadcast by the recognised news publisher without complying with the duties set out in this section.

(9) For the purposes of this section, a provider is not to be regarded as taking action in relation to news publisher content in the following circumstances—

(a) a provider takes action in relation to content which is not news publisher content, that action affects related news publisher content, the grounds for the action only relate to the content which is not news publisher content, and it is not technically feasible for the action only to relate to the content which is not news publisher content;

(b) a provider takes action against a user, and that action affects news publisher content that has been uploaded to or shared on the service by the user.

(10) Section 192 (providers’ judgements about the status of content) applies in relation to judgements by providers about whether news publisher content amounts to a relevant offence as it applies in relation to judgements about whether content is illegal content.

(11) Any provision of the terms of service has effect subject to this section.

(12) In this section—

(a) references to “news publisher content” are to content that is news publisher content in relation to the service in question;

(b) references to “taking action” against a person are to giving a warning to a person, or suspending or banning a person from using a service, or in any way restricting a person’s ability to use a service.

(13) In this section references to “taking action” in relation to content are to—

(a) taking down content,

(b) restricting users’ access to content, or

(c) adding warning labels to content, except warning labels normally encountered only by child users,

and also include references to taking any other action in relation to content on the grounds that it is content of a kind which is the subject of a relevant term of service (but not otherwise).

(14) A “relevant term of service” means a term of service which indicates to users (in whatever words) that the presence of a particular kind of content, from the time it is generated, uploaded or shared on the service, is not tolerated on the service or is tolerated but liable to result in the provider treating it in a way that makes it less likely that other users will encounter it.

(15) Taking any step set out in subsection (3) or (7) does not count as “taking action” for the purposes of this section.

(16) See—

  • section 19 for the meaning of “journalistic content”;

  • section 55 for the meaning of “news publisher content”;

  • section 56 for the meaning of “recognised news publisher”.
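[Editorial note, not part of the Act.] Section 18 sets out two procedural paths: notice before action in the ordinary case (subsections (2) and (3)), and action first with notice and a reversal route afterwards where subsection (4) or (5) applies (subsections (6) and (7)). A sketch of the branching follows; every function here is a hypothetical stand-in for the provider's own systems, and the decision logic is deliberately simplified:

    def take_down(content_id: str) -> None:
        print("taken down:", content_id)            # stand-in for the real action

    def notify_publisher_before_action(content_id: str) -> None:
        # s.18(3)(a): specify the proposed action, give reasons by reference
        # to the terms of service, and set a reasonable period for
        # representations.
        print("pre-action notice sent for", content_id)

    def collect_representations(content_id: str) -> list[str]:
        return []  # stand-in: representations made within the period, s.18(3)(b)

    def notify_decision(content_id: str, decision: str) -> None:
        print("decision for", content_id, ":", decision)  # s.18(3)(c)

    def notify_publisher_after_action(content_id: str) -> None:
        # s.18(7)(a)-(b): swift notice of the action taken, the provider's
        # justification, and a reasonable period to request reversal.
        print("post-action notice sent for", content_id)

    def act_on_news_publisher_content(content_id: str,
                                      liability_risk: bool,
                                      amounts_to_relevant_offence: bool) -> None:
        if liability_risk or amounts_to_relevant_offence:
            # s.18(4)/(5): takedown may precede the s.18(3) steps, but the
            # s.18(7) steps are then owed (s.18(6)).
            take_down(content_id)
            notify_publisher_after_action(content_id)
        else:
            # s.18(2)-(3): notice, representations, reasoned decision, in
            # that order, before any action is taken.
            notify_publisher_before_action(content_id)
            reps = collect_representations(content_id)
            decision = "proceed" if not reps else "reconsider"  # placeholder logic
            notify_decision(content_id, decision)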

19 Duties to protect journalistic content

(1) This section sets out the duties to protect journalistic content which apply in relation to Category 1 services.

The duties

(2) A duty to operate a service using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about—

(a) how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and

(b) whether to take action against a user generating, uploading or sharing such content.

(3) A duty, in relation to a decision by a provider to take down content or to restrict access to it, to make a dedicated and expedited complaints procedure available to a person who considers the content to be journalistic content and who is—

(a) the user who generated, uploaded or shared the content on the service, or

(b) the creator of the content (see subsections (14) and (15)).

(4) A duty to make a dedicated and expedited complaints procedure available to users of a service in relation to a decision by the provider of the service to take action against a user because of content generated, uploaded or shared by the user which the user considers to be journalistic content.

(5) A duty to ensure that—

(a) if a complaint about a decision mentioned in subsection (3) is upheld, the content is swiftly reinstated on the service;

(b) if a complaint about a decision mentioned in subsection (4) is upheld, the action against the user is swiftly reversed.

(6) Subsections (3) and (4) do not require a provider to make a dedicated and expedited complaints procedure available to a recognised news publisher in relation to a decision if the provider has taken the steps set out in section 18(3) in relation to that decision.

(7) A duty to include provisions in the terms of service specifying—

(a) by what methods content present on the service is to be identified as journalistic content;

(b) how the importance of the free expression of journalistic content is to be taken into account when making decisions mentioned in subsection (2);

(c) the policies and processes for handling complaints in relation to content which is, or is considered to be, journalistic content.

(8) A duty to ensure that—

(a) the provisions of the terms of service referred to in subsection (7) are clear and accessible, and

(b) those provisions are applied consistently.

Interpretation

(9) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.

(10) For the purposes of this Part content is “journalistic content”, in relation to a user-to-user service, if—

(a) the content is—

(i) news publisher content in relation to that service, or

(ii) regulated user-generated content in relation to that service;

(b) the content is generated for the purposes of journalism; and

(c) the content is UK-linked.

(11) For the purposes of this section content is “UK-linked” if—

(a) United Kingdom users of the service form one of the target markets for the content (or the only target market), or

(b) the content is or is likely to be of interest to a significant number of United Kingdom users.

(12) In this section references to “taking action” against a user are to giving a warning to a user, or suspending or banning a user from using a service, or in any way restricting a user’s ability to use a service.

(13) In this section the reference to the “creator” of content is to be read in accordance with subsections (14) and (15).

(14) The creator of news publisher content is the recognised news publisher in question.

(15) The creator of content other than news publisher content is—

(a) an individual who—

(i) created the content, and

(ii) is in the United Kingdom; or

(b) an entity which—

(i) created the content, and

(ii) is incorporated or formed under the law of any part of the United Kingdom.

(16) For the meaning of “news publisher content”, “regulated user-generated content” and “recognised news publisher”, see sections 55 and 56.
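[Editorial note, not part of the Act.] The definition in subsections (10) and (11) is three conjunctive limbs, the last of which is internally disjunctive. As a predicate, with boolean inputs standing in for the underlying determinations (an illustrative assumption, not a statutory test):

    def is_journalistic_content(is_news_publisher_content: bool,
                                is_regulated_ugc: bool,
                                generated_for_journalism: bool,
                                uk_users_a_target_market: bool,
                                of_interest_to_significant_uk_users: bool) -> bool:
        limb_a = is_news_publisher_content or is_regulated_ugc  # s.19(10)(a)
        limb_b = generated_for_journalism                       # s.19(10)(b)
        uk_linked = (uk_users_a_target_market
                     or of_interest_to_significant_uk_users)    # s.19(10)(c), (11)
        return limb_a and limb_b and uk_linked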

Duties about content reporting and complaints procedures

20 Duty about content reporting

(1) This section sets out the duty about content reporting which applies in relation to all regulated user-to-user services.

(2) A duty to operate a service using systems and processes that allow users and affected persons to easily report content which they consider to be content of a kind specified below (with the duty extending to different kinds of content depending on the kind of service, as indicated by the headings).

All services

(3) Illegal content.

Services likely to be accessed by children

(4) Content that is harmful to children, present on a part of a service that it is possible for children to access.

Interpretation

(5) In this section “affected person” means a person, other than a user of the service in question, who is in the United Kingdom and who is—

(a) the subject of the content,

(b) a member of a class or group of people with a certain characteristic targeted by the content,

(c) a parent of, or other adult with responsibility for, a child who is a user of the service or is the subject of the content, or

(d) an adult providing assistance in using the service to another adult who requires such assistance, where that other adult is a user of the service or is the subject of the content.

(6) For the purposes of subsection (4), a provider is only entitled to conclude that it is not possible for children to access a service, or a part of it, if age verification or age estimation is used on the service with the result that children are not normally able to access the service or that part of it.

(7) See also—

(a) section 22 (duties about freedom of expression and privacy), and

(b) section 72(5)(a) (reporting of content that terms of service allow to be taken down or restricted).

21 Duties about complaints procedures

(1) This section sets out the duties about complaints procedures which apply in relation to all regulated user-to-user services.

(2) A duty to operate a complaints procedure in relation to a service that—

(a) allows for relevant kinds of complaint to be made (as set out under the headings below),

(b) provides for appropriate action to be taken by the provider of the service in response to complaints of a relevant kind, and

(c) is easy to access, easy to use (including by children) and transparent.

(3) A duty to include in the terms of service provisions which are easily accessible (including to children) specifying the policies and processes that govern the handling and resolution of complaints of a relevant kind.

All services

(4) The following kinds of complaint are relevant for all services—

(a) complaints by users and affected persons about content present on a service which they consider to be illegal content;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(i) section 10 (illegal content),

(ii) section 20 (content reporting), or

(iii) section 22(2) or (3) (freedom of expression and privacy);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down on the basis that it is illegal content;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be illegal content;

(e) complaints by a user who has generated, uploaded or shared content on a service if—

(i) the use of proactive technology on the service results in that content being taken down or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users, and

(ii) the user considers that the proactive technology has been used in a way not contemplated by, or in breach of, the terms of service (for example, by affecting content not of a kind specified in the terms of service as a kind of content in relation to which the technology would operate).

Services likely to be accessed by children

(5) The following kinds of complaint are relevant for services that are likely to be accessed by children—

(a) complaints by users and affected persons about content, present on a part of a service that it is possible for children to access, which they consider to be content that is harmful to children;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in section 12 (children’s online safety);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to children;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to children;

(e) complaints by a user who is unable to access content because measures used to comply with a duty set out in section 12(2) or (3) have resulted in an incorrect assessment of the user’s age.

Category 1 services

(6) The relevant kind of complaint for Category 1 services is complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(a) section 15 (user empowerment),

(b) section 17 (content of democratic importance),

(c) section 18 (news publisher content),

(d) section 19 (journalistic content), or

(e) section 22(4), (6) or (7) (freedom of expression and privacy).

Interpretation

(7) In this section “affected person” has the meaning given by section 20.

(8) For the purposes of subsection (5)(a), a provider is only entitled to conclude that it is not possible for children to access a service, or a part of it, if age verification or age estimation is used on the service with the result that children are not normally able to access the service or that part of it.

(9) See also—

(a) section 22 (duties about freedom of expression and privacy), and

(b) section 72(6) (complaints procedure relating to content that terms of service allow to be taken down or restricted).
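[Editorial note, not part of the Act.] Like section 7, subsections (4) to (6) key the relevant kinds of complaint to service attributes. A sketch of the resulting routing table; the labels are the editor's paraphrases of the statutory kinds, not the statutory text:

    def relevant_complaint_kinds(likely_accessed_by_children: bool,
                                 category_1: bool) -> list[str]:
        kinds = [  # s.21(4): all services
            "illegal content present (s.21(4)(a))",
            "non-compliance with s.10, s.20 or s.22(2)-(3) (s.21(4)(b))",
            "own content taken down as illegal (s.21(4)(c))",
            "warning/suspension/ban/restriction over illegal content (s.21(4)(d))",
            "proactive technology misapplied to own content (s.21(4)(e))",
        ]
        if likely_accessed_by_children:
            kinds += [  # s.21(5)
                "content harmful to children present (s.21(5)(a))",
                "non-compliance with s.12 (s.21(5)(b))",
                "own content taken down/restricted as harmful to children (s.21(5)(c))",
                "action against user over content harmful to children (s.21(5)(d))",
                "incorrect age assessment blocking access (s.21(5)(e))",
            ]
        if category_1:  # s.21(6)
            kinds += ["non-compliance with s.15, s.17, s.18, s.19 or s.22(4),(6),(7) (s.21(6))"]
        return kinds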

Cross-cutting duties

22 Duties about freedom of expression and privacy

(1) This section sets out the duties about freedom of expression and privacy which apply in relation to regulated user-to-user services (as indicated by the headings).

All services

(2) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users’ right to freedom of expression within the law.

(3) When deciding on, and implementing, safety measures and policies, a duty to have particular regard to the importance of protecting users from a breach of any statutory provision or rule of law concerning privacy that is relevant to the use or operation of a user-to-user service (including, but not limited to, any such provision or rule concerning the processing of personal data).

Additional duties for Category 1 services

(4) A duty—

(a) when deciding on safety measures and policies, to carry out an assessment of the impact that such measures or policies would have on—

(i) users’ right to freedom of expression within the law, and

(ii) the privacy of users; and

(b) to carry out an assessment of the impact of adopted safety measures and policies on the matters mentioned in paragraph (a)(i) and (ii).

(5) An impact assessment relating to a service must include a section which considers the impact of the safety measures and policies on the availability and treatment on the service of content which is news publisher content or journalistic content in relation to the service.

(6) A duty to—

(a) keep an impact assessment up to date, and

(b) publish impact assessments.

(7) A duty to specify in a publicly available statement the positive steps that the provider has taken in response to an impact assessment to—

(a) protect users’ right to freedom of expression within the law, and

(b) protect the privacy of users.

Interpretation

(8) In this section—

  • “impact assessment” means an impact assessment under subsection (4);

  • “safety measures and policies” means measures and policies designed to secure compliance with any of the duties set out in—

    (a) section 10 (illegal content),

    (b) section 12 (children’s online safety),

    (c) section 15 (user empowerment),

    (d) section 20 (content reporting), or

    (e) section 21 (complaints procedures).

(9) Any reference in this section to the privacy of users or steps taken to protect the privacy of users is to be construed in accordance with subsection (3).

(10) See—

  • section 19 for the meaning of “journalistic content”;

  • section 55 for the meaning of “news publisher content”.

23 Record-keeping and review duties

(1) This section sets out the record-keeping and review duties which apply in relation to regulated user-to-user services (as indicated by the headings).

All services

(2) A duty to make and keep a written record, in an easily understandable form, of all aspects of every risk assessment under section 9 or 11, including details about how the assessment was carried out and its findings.

(3) A duty to make and keep a written record of any measures taken or in use to comply with a relevant duty which—

(a) are described in a code of practice and recommended for the purpose of compliance with the duty in question, and

(b) apply in relation to the provider and the service in question.

In this section such measures are referred to as “applicable measures in a code of practice”.

(4) If alternative measures have been taken or are in use to comply with a relevant duty, a duty to make and keep a written record containing the following information—

(a) the applicable measures in a code of practice that have not been taken or are not in use,

(b) the alternative measures that have been taken or are in use,

(c) how those alternative measures amount to compliance with the duty in question, and

(d) how the provider has complied with section 49(5) (freedom of expression and privacy).

(5) If alternative measures have been taken or are in use to comply with a duty set out in section 10(2) or (3) or 12(2) or (3), the record required under subsection (4) of this section must also indicate whether such measures have been taken or are in use in every area listed in section 10(4) or 12(8) (as the case may be) in relation to which there are applicable measures in a code of practice.

(6) A duty to review compliance with the relevant duties in relation to a service—

(a) regularly, and

(b) as soon as reasonably practicable after making any significant change to any aspect of the design or operation of the service.

(7) OFCOM may provide that particular descriptions of providers of user-to-user services are exempt from any or all of the duties set out in this section, and may revoke such an exemption.

(8) OFCOM must publish details of any exemption or revocation under subsection (7), including reasons for the revocation of an exemption.

Additional duties for Category 1 services

(9) A duty to make and keep a written record, in an easily understandable form, of all aspects of every assessment under section 14 (assessments related to the adult user empowerment duty set out in section 15(2)), including details about how the assessment was carried out and its findings.

(10) As soon as reasonably practicable after making a record of an assessment as required by subsection (2) or (9), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).

Interpretation

(11) In this section—

  • “alternative measures” means measures other than measures which are (in relation to the provider and the service in question) applicable measures in a code of practice;

  • “code of practice” means a code of practice published under section 46;

  • “relevant duties” means the duties set out in—

    (a) section 10 (illegal content),

    (b) section 12 (children’s online safety),

    (c) section 15 (user empowerment),

    (d) section 17 (content of democratic importance),

    (e) section 19 (journalistic content),

    (f) section 20 (content reporting), and

    (g) section 21 (complaints procedures),

  and for the purposes of subsection (6), also includes the duties set out in sections 18 (news publisher content), 71 and 72 (duties about terms of service), and 75 (deceased child users).
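[Editorial note, not part of the Act.] Subsections (4) and (5) prescribe the contents of the record a provider must keep when it relies on alternative measures rather than the applicable measures in a code of practice. A sketch of such a record; the field names are the editor's, keyed to the paragraphs of subsection (4):

    from dataclasses import dataclass, field

    @dataclass
    class AlternativeMeasuresRecord:
        code_measures_not_taken: list[str]       # s.23(4)(a)
        alternative_measures_taken: list[str]    # s.23(4)(b)
        how_these_amount_to_compliance: str      # s.23(4)(c)
        section_49_5_compliance: str             # s.23(4)(d)
        # s.23(5): for duties under s.10(2)/(3) or s.12(2)/(3), indicate for
        # every area listed in s.10(4) or s.12(8) with applicable code
        # measures whether alternative measures are taken or in use there.
        per_area_indication: dict[str, bool] = field(default_factory=dict)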
