Online Safety Bill, Committee Stage (Recommittal), House of Commons, 13 & 15 December 2022


Key messages

  • Councils have a number of statutory responsibilities that are affected by social media and online harms, including the protection of children; public health messaging and the impact of mis- and disinformation; the mental health and wellbeing of adults and children; social care for vulnerable adults and for people with learning disabilities and autism; and the holding of local elections. Councillors, local democracy and local elections are also affected by social media and the online environment. Councils are themselves users of social media, relying on it to reach residents and handle their enquiries.
  • The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provision for Ofcom to regulate certain internet services. The regulation proposed in the Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and the risks associated with it.
  • Under the provisions of the Bill, all regulated services will have a duty of care to remove illegal content and, where a service is likely to be accessed by children, a duty to protect children from harm.
  • Further to this, the Bill places all companies within its scope that meet certain thresholds into three categories (category 1, 2A and 2B), which will be subject to different requirements. This categorisation is based primarily on the size and functionality of the platform, and it is critical that the Government gets it right, as category 1 services will be subject to additional duties. Given the current thresholds, we are concerned that many smaller user-to-user platforms will not be subject to category 1 requirements, even if they host significant volumes of harmful content. We therefore continue to urge the Government to take a risk-based rather than size-based approach to categorising companies, so that all platforms with a high-risk profile, including smaller platforms, are subject to the strongest duties to prevent harm. We welcome the Government’s amendment requiring Ofcom to maintain a register of services that are close to meeting category 1 thresholds.
  • The Bill has been returned to committee stage following the Government’s decision to remove the provisions relating to content that is legal but harmful to adults. The LGA recognises the delicate balance this legislation must strike between preserving users’ freedom of expression and civil liberties and protecting users from harmful content. Previously, we welcomed the inclusion of the user verification duty and user empowerment duty in the Bill, as they give adult users choice over what content and users they engage with, while allowing users to remain anonymous should they want or need to. We support the strengthening of the user empowerment duties proposed in the Government’s amendments at Committee Stage. The LGA has previously called for the Bill to include definitions of the harms it seeks to protect against, and welcomes the Government’s moves to incorporate these, while having concerns that they are too limited with regard to protecting elected members.
  • The removal of the clauses on content harmful to adults shifts the onus from providers, who would have had to regulate such content, onto users, who must instead disengage from it. The LGA has called for more robust provisions to manage abuse that falls below the criminal threshold of the new communications offences and to tackle mis- and disinformation. We believe the Bill should be an opportunity to deal with these growing areas of concern and are concerned that the wholesale removal of the clauses on legal but harmful content weakens its ability to do so.
  • The LGA’s report ‘Debate Not Hate’ highlights that councillors are experiencing increasing levels of online intimidation, abuse and threats, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. The LGA Councillor Census 2022 found that seven in ten councillors reported experiencing abuse and/or intimidation over the previous 12 months, the majority of it carried out online or via social media. We believe that some of the Government’s amendments make it harder for the Bill to effectively tackle online abuse of councillors.

Definitions of harm

  • The central premise of the Bill is to protect users from harm and associated risks. The LGA has previously called for the Government to set out the specific harms that platforms will need to protect users against. We welcome the inclusion of a list of harms in the Government’s amendments, although for adults this list will now apply only to user empowerment tools, and we have concerns about the exclusion of lower-level abuse and harassment.

Government Amendments 6, 7 and 15, tabled by Paul Scully MP

  • Amendments 6 and 7 delete clauses 12 and 13, which previously created duties for service providers to undertake adults’ risk assessments and to set out how the service would protect adults’ online safety, in relation to legal but harmful content, on the basis of those assessments. Protection of adults from such content would instead be pursued through the user empowerment duties set out in clause 14, which would allow adult users to opt in to measures protecting them from legal but potentially harmful content.
  • Amendment 15 incorporates into the adult user empowerment duties a list of the forms of content that would be defined as harmful and to which users should have tools to control their exposure. The definition covers content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders, and content that is abusive towards, or incites hatred against, people with a protected characteristic.

LGA View

  • It is right that the Online Safety Bill protects users’ freedom of speech, including the freedom to criticise issues such as government policy. However, as it stands, the failure of social media companies to address abuse and harassment is driving people away from their platforms, reducing their opportunity to contribute to important debates and limiting their own freedom of speech. Evidence shows that those with certain characteristics, including women, disabled people and those from Black and minority ethnic backgrounds, receive more abuse than others, and this can make people less likely to speak out about complex issues or engage in online debate; failure to tackle this compounds the existing challenges we face in the representation and inclusion of minority groups. While the LGA welcomes that these amendments recognise that those with certain protected characteristics are particularly targeted for abuse and harassment online, we regret that they place the onus on individuals to take steps to avoid such abuse or harassment.
  • Councils are concerned that democracy can be undermined by the abuse and harassment experienced on social media, as outlined above. This can particularly affect those from minority groups: for example, Amnesty International research analysing tweets that mentioned women MPs in the run-up to the 2017 General Election found that the 20 Black, Asian and Minority Ethnic MPs received 41 per cent of the abuse, despite making up less than 12 per cent of those in the study. This type of treatment can make people less willing to speak out on important topics and, as the National Assembly for Wales found, can even lead to people standing down from political office. It is of vital democratic importance that people from all backgrounds feel safe to stand for election, helping to ensure that our governments reflect our communities. However, with those with certain protected characteristics more likely to be targeted with abuse, the way social media currently operates makes this increasingly difficult.
  • One of the challenges of placing responsibility on users to deal with online abuse themselves is that, even if a user ignores or blocks another user, this does not always stop the content being published, potentially stirring up ill feeling online. Evidence shows that users are served content that aligns with their existing views, creating online ‘echo chambers’ where abusive content can be shared and amplified. While tools to support users are important, it is essential to prevent harmful content, in particular violent and threatening content, from being published and shared in the first place by building safety into platform design.
  • We are concerned that the definitions of content to be covered by user empowerment tools in amendment 15 are limited and may not capture substantial elements of the abuse and harassment that councillors and other elected politicians face online, whether as a result of protected characteristics or simply by virtue of holding public office. Providers of social media platforms should be encouraged to introduce specific safeguards for those holding elected office, including fast-track routes to report abuse, intimidation and harassment.
  • Content that encourages, promotes or instructs users to engage in harmful behaviour should be considered harmful content within the Bill. We are concerned at the limited references and provisions in the Bill and the amendments for tackling harms from online mis- and disinformation.

Categorisation of internet services

  • All regulated online services will be designated as category 1, 2A or 2B services; their category will depend on the number of users (size) and the functionalities of the service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
  • Crucially, only services that are ‘user-to-user’ services (internet services which allow users to generate, upload or share content) and that meet category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those with the highest risk and the largest user bases, such as the main social media platforms.

LGA View 

  • The LGA is concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.
  • A recent academic study conceptualising ‘dark platforms’ examined how Covid-19-related content, especially conspiracy theories, was communicated on platforms such as 8Kun. It found that less regulated digital platforms can be used to host content and content creators that more mainstream platforms may not tolerate. As currently drafted, there is a risk that the Bill could inadvertently push harmful content onto these sites.
  • We therefore urge the Government to set out as soon as possible which companies will fall into which category and to reconsider its approach to categorising services. Instead, the Government should take a ‘risk-based’ approach to categorisation, ensuring that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1.
  • The fast-moving nature of the internet means that content, trends and new sites, including harmful content and its associated risks, can gain traction amongst users quickly. In undertaking its duties under this Bill, it is important that Ofcom is able to react rapidly and nimbly to emerging platforms and risks, including by being able to re-categorise services to higher categories speedily.

Government Amendment NC7, tabled by Paul Scully MP 

  • This amendment requires Ofcom to maintain a register of ‘emerging category 1 services’: services whose user numbers are at least 75 per cent of the (yet to be determined) category 1 threshold and which meet at least one of the functionality requirements for category 1 classification.
  • Ofcom will be required to keep the register up to date, based on an assessment of all regulated user-to-user services.

LGA View

  • We support this amendment. The LGA welcomed the Minister’s reassurances during the Bill’s Second Reading that companies can move between categories, and that different parts of a large conglomerate can be regulated differently depending on their activities. However, we have previously raised concerns about how long this might take Ofcom. This amendment will help ensure Ofcom proactively identifies emerging high-risk services and is ready to assess them and add them to the category 1 register without undue delay.
  • More broadly, we urge the Government to ensure Ofcom has adequate resources to respond quickly to both emerging harms and new platforms.

Protecting children online

  • Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming are serious and growing crimes. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.
  • According to freedom of information data obtained by the National Society for the Prevention of Cruelty to Children (the “NSPCC”), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences recorded in England and Wales increasing by almost 70 per cent in three years.
  • The LGA strongly welcomes the Government’s ambition to ensure children are safe online, and we further welcome that provisions relating to child protection have been excluded from the amendments concerning content harmful to adults. However, given the porous nature of the internet, we are concerned that the move away from a regulatory approach to content harmful to adults increases the risk of children and young people accessing misinformation, content relating to self-harm and suicide, or extremist content. We urge the Government to engage with child protection experts to identify what extra steps can be taken to address this.

Government Amendment NC1, tabled by Paul Scully MP 

  • This amendment requires Ofcom to provide guidance, including examples, to service providers on content they should consider to be harmful to children, as well as what content should be covered by user empowerment tools for adults.
  • Government amendment 15 provides broad headings for the subject matter that would be covered by NC1, including the encouragement of suicide, self-harm and eating disorders, and abuse of, or incitement to hatred against, people with protected characteristics.

LGA View 

  • The LGA welcomes the creation of a new duty for Ofcom to provide guidance to service providers on what material represents a threat to children and young people. Ofcom should engage with child protection experts and children themselves in developing this guidance.

Amendment 98, tabled by Kirsty Blackman MP

  • The amendment would require service providers to take steps to mitigate the harm caused to children by habit-forming features of their service. 

LGA View 

  • The LGA supports the Online Safety Bill Joint Committee’s recommendation that the Bill should tackle design risks by placing a ‘responsibility on service providers to have in place systems and processes to identify reasonably foreseeable risks of harm and take proportionate steps to mitigate risks of harm’. It was disappointing that this recommendation was not subsequently incorporated into the Bill. Addictive and habit-forming features, especially on social media sites, can have substantial impacts on children’s mental health and, as a foreseeable harm, should be mitigated by service providers.