Online Safety Bill, Remaining Stages, House of Commons, 17 January 2023

The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provision for Ofcom to regulate certain internet services. The regulation proposed in this Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and the associated risks.


Key messages

  • Under the provisions of the Bill, all regulated services will have a duty of care to remove illegal content and, where services are likely to be accessed by children, a duty to protect children from harm.
  • Further to this, the Bill places all companies within its scope that meet certain thresholds into three categories (category 1, 2A and 2B), each subject to different requirements. This categorisation is based primarily on the size and functionality of the platform, and it is critical that the Government gets it right, as category 1 services will be subject to additional duties. Given the current thresholds, we are concerned that many smaller user-to-user platforms will not be subject to category 1 requirements, even if they host significant volumes of harmful content. We therefore continue to urge the Government to take a risk-based rather than size-based approach to categorising companies, so that all platforms with a high-risk profile, including smaller platforms, are subject to the strongest duties to prevent harm. We welcome the Government’s amendment, incorporated into the Bill at the re-convened committee stage, which requires Ofcom to maintain a register of services that are close to meeting category 1 thresholds.
  • The Bill returned to committee stage at the end of 2022 following the Government’s decision to remove provisions relating to content that is legal but harmful to adults. The LGA recognises the delicate balance this legislation must strike between preserving users’ freedom of expression and civil liberties and protecting users from harmful content. Previously, we welcomed the inclusion of the user verification duty and user empowerment duty in the Bill, as they give adult users choice over what content and which users they engage with, while allowing users to remain anonymous should they want or need to. We support the strengthening of the user empowerment duties in the Government’s amendments accepted at committee stage. The LGA has previously called for the Bill to include definitions of the harms it will seek to protect against, and welcomes the Government’s moves to incorporate these, while having concerns that they are too limited in regard to protecting elected members.
  • The removal of the clauses on content that is harmful to adults shifts the onus from the provider, which would have had to regulate such content, to the user, who must disengage from it. The LGA has called for more robust provisions to manage abuse that falls below the new communications-based criminal threshold and to tackle mis- and disinformation. We believe the Bill should be an opportunity to deal with these growing areas of concern and are concerned that the wholesale removal of the clauses on legal but harmful content has weakened its ability to do so.
  • The LGA’s report ‘Debate Not Hate’ highlights that councillors are experiencing increasing levels of online intimidation, abuse and threats, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. The LGA Councillor Census 2022 found that seven in 10 councillors had experienced abuse and/or intimidation over the previous 12 months, the majority of it carried out online or via social media. We believe that some of the Government’s amendments make it harder for the Bill to effectively tackle online abuse of councillors.

Definitions of harm

  • The central premise of the Bill is to protect users from harm and associated risk. The LGA has previously called for the Government to set out the specific harms platforms will need to protect users against. We welcome the inclusion of a list of harms in the Bill following Government amendments at the re-convened committee stage, although for adults these will now apply only to user empowerment tools, and we are concerned at the exclusion of lower-level abuse and harassment.
  • The amended Bill now incorporates into the user empowerment duties for adults a list of the forms of content defined as harmful, to which users should have access to tools to control their exposure. The definition covers content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders, and content which is abusive towards, or incites hatred against, people with a protected characteristic. Given recent developments, such as the removal (subsequently rescinded) of suicide prevention prompts on Twitter in December 2022, the LGA welcomes the specific inclusion of suicide and self-harm on the face of the Bill.

LGA view

  • It is right that the Online Safety Bill protects the freedom of speech of users, including their freedom to criticise issues such as government policy. However, as it stands, the failure of social media companies to address abuse and harassment is driving people away from their platforms, reducing their opportunity to contribute to important debates and limiting their own freedom of speech. Evidence shows that those with certain characteristics, including women, disabled people and those from black and minority ethnic backgrounds, receive more abuse than others, and this can make people less likely to speak out about complex issues or engage in online debate; failure to tackle this compounds the existing challenges we face in relation to the representation and inclusion of minority groups. While the LGA welcomes that these amendments recognise that those with certain protected characteristics are particularly targeted for abuse and harassment online, we do not agree with the onus being placed on individuals to take steps to avoid such abuse or harassment.

  • Councils are concerned that abuse and harassment experienced on social media can damage democracy. This particularly affects those from minority groups: for example, research by Amnesty International analysing tweets that mentioned women MPs in the run-up to the 2017 General Election found that the 20 Black, Asian and Minority Ethnic MPs received 41 per cent of the abuse, despite making up less than 12 per cent of those in the study. This type of treatment can make people less willing to speak out on important topics and, as the National Assembly for Wales found, can even lead to people standing down from political office. It is of vital democratic importance that people from all backgrounds feel safe to stand for election, helping to ensure that our governments reflect our communities. However, with those with certain protected characteristics more likely to be targeted with abuse, the way in which social media currently operates makes this increasingly difficult.

  • One of the challenges of placing responsibility on users to deal with online abuse themselves is that, even if a user ignores or blocks another user, this does not always stop that content being published, potentially stirring up ill feeling online. Evidence has shown that platforms promote content that aligns with users’ existing views, leading to online ‘echo chambers’ where abusive content can be shared and amplified. This material can become increasingly extreme over time, providing a possible gateway into other harmful platforms and a growing interest in violence. Therefore, while tools to support users are important, it is essential to prevent harmful content, in particular violent and threatening content, from being published and shared in the first place by building safety into platform design.

  • The LGA is concerned that the definitions of content to be covered by user empowerment tools in the amended Bill are limited and may not cover substantial elements of the abuse and harassment that councillors and other elected politicians face online, whether as a result of protected characteristics or simply by virtue of being public office holders. Providers of social media platforms should be encouraged to introduce specific safeguards for those holding elected office, including fast-track routes to report abuse, intimidation and harassment.

  • Content that encourages, promotes or instructs users to engage in harmful behaviour should be treated as harmful content within the Bill. We are concerned at the Bill’s limited references to, and provisions to tackle, harms from online mis- and disinformation.

New Clauses 4–7, tabled by Alex Davies-Jones MP

  • These new clauses seek to introduce the concept of ‘harm to adults and society’ onto the face of the Bill as a lighter-touch approach to regulating content aimed at adult users than the now-removed ‘legal but harmful’ provisions. Content covered by these new clauses would be defined by the Secretary of State in secondary legislation, and would need to present a material risk of significant harm to an ‘appreciable number’ of adults in the UK.
  • Service providers would be required to undertake risk assessments in relation to content that is harmful to adults and society, including harm relating to individuals with a protected characteristic, and to set out in their terms of service a summary of this assessment, together with details of how they will treat content identified as posing a risk of such harm.
  • Ofcom would gain powers to set out minimum standards for the measures that service providers would be expected to take in relation to material that is potentially harmful to adults and society, and to review annually the extent to which providers are meeting those standards.

LGA view

  • The LGA supports these amendments. They would require service providers to consider how they will approach misinformation, content that could be harmful to democracy, and the harassment and intimidation of elected members and candidates, none of which would be covered by the Bill as it stands after committee stage.

  • These amendments would recalibrate responsibility for adult online safety so that it is shared between user and service provider. The Bill as amended at committee stage provides substantial and welcome tools to help adult users calibrate their online experience and the content they see, but it leaves the onus almost solely on the user to take steps in relation to potentially harmful content. It should not be the sole responsibility of, for example, a victim of online harassment to take action to avoid such content.

New Clause 8, tabled by Vicky Ford MP

  • This amendment clarifies that all duties in the Bill that relate to content promoting self-harm should also include content promoting eating disorders. 

LGA view

  • The LGA supports this amendment. Eating disorders are a form of self-harm, and their promotion online represents a significant threat to the mental and physical health of individuals, particularly younger and vulnerable people. It would therefore be very welcome to have clarity on the face of the Bill that content promoting eating disorders falls within the scope of the duties imposed on service providers.

Categorisation of internet services

  • All regulated online services that meet the relevant thresholds will be designated as category 1, 2A or 2B services, with the category dependent on the number of users (size) and the functionalities of the service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
  • Crucially, only ‘user-to-user’ services (internet services which allow users to generate, upload or share content) that meet category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those with the highest risk and the largest user bases, such as the main social media platforms.
  • At the re-convened committee stage, government amendments were added to the Bill which would require Ofcom to maintain a list of platforms or other online services that are close to the threshold for classification as category 1 services, enabling faster action to impose greater safety responsibilities when emerging platforms become influential. The LGA strongly welcomes this. However, we are concerned that at committee stage the minister again insisted that both categorisation and this register would be subject to user number thresholds, rather than a risk-based approach.

LGA view

  • The LGA is concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.
  • A recent academic study, ‘Conceptualizing “Dark Platforms”’, looked at how COVID-19-related content, especially conspiracy theories, was communicated on ‘dark platforms’ such as 8Kun. It found that less-regulated digital platforms can be used to host content and content creators that would not be tolerated by more mainstream platforms. As currently drafted, the Bill risks inadvertently pushing harmful content onto these sites.
  • We therefore urge the Government to set out as soon as possible what companies will fall into which category and reconsider their approach to categorising services. Instead, the Government should take a ‘risk-based’ approach to categorising services to ensure that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1.
  • The fast-moving nature of the internet means that content, trends and new sites, including harmful content and the associated risks, can gain traction amongst users quickly. In undertaking its duties under this Bill, it is important that Ofcom is able to react rapidly and nimbly to emerging platforms and risks, including by re-categorising services to a higher category at speed. We therefore strongly support the new register of ‘emerging category 1 services’, but it is vital that Ofcom has adequate resources to act swiftly and comprehensively in a complex and rapidly changing online environment.

Protecting children online

  • Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming is a serious and growing crime. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.
  • According to freedom of information data obtained by the National Society for the Prevention of Cruelty to Children (NSPCC), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70 per cent in three years.
  • The LGA strongly welcomes the Government’s ambition to ensure children are safe online, and we further welcome that the provisions relating to child protection are unaffected by the removal of the provisions on content that is harmful to adults. However, we are concerned that, given the porous nature of the internet, the move away from a regulatory approach to content that is harmful to adults increases the risk of children and young people accessing misinformation, content relating to self-harm and suicide, or extremist content. We urge the Government to engage with child protection experts to identify what extra steps can be taken to address this.
  • At re-convened committee stage, government amendments were added to the Bill requiring Ofcom to provide guidance, including examples, to service providers on content they should consider to be harmful to children, as well as what content should be covered by user empowerment tools for adults. The amendments set out that the guidance should specifically include encouragement to suicide, self-harm, eating disorders, abuse or incitement to hatred based on protected characteristics as content that should be covered. The LGA welcomes the creation of this new duty. Ofcom should engage with child protection experts and children themselves in developing this guidance.

New Clause 3, tabled by Kirsty Blackman MP

  • This amendment introduces a range of new duties on category 1 services to provide features that would empower all child users to increase their control over harmful or potentially harmful content. These would include options to encounter only content from users they have approved, and the ability to filter out private messages from non-verified users or adult users.

LGA view

  • The LGA supports this amendment. It is important that protecting children from harm does not become the responsibility of the child user and that the onus to provide such protection remains with the service provider. Nevertheless, this amendment would give child users worthwhile additional tools, enabling potentially harmful content to be flagged and interactions with other users to be restricted.
  • The LGA supports the recommendation from the Online Safety Bill Joint Committee that the Bill should tackle design risks through a “responsibility on service providers to have in place systems and processes to identify reasonably foreseeable risks of harm and take proportionate steps to mitigate risks of harm”. We believe this amendment would represent a step towards fulfilling that recommendation in regard to child users.