Online Safety Bill, Report Stage, House of Commons, 12 July 2022

The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for Ofcom to regulate certain internet services.


Key messages

  • The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for Ofcom to regulate certain internet services. The regulation proposed in this Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and the associated risks.
  • Under the provisions of the Bill, all regulated services will have a duty of care to remove illegal content and, if services are deemed accessible by children, a duty to protect children from harm.
  • Further to this, the Bill places all companies within its scope that meet certain thresholds into three different categories (category 1, 2A and 2B), which will be subject to different requirements. This categorisation is based primarily on the size and functionality of the platform, and it is critical that the Government gets it right, as category 1 services will be subject to additional duties. Given the current thresholds, we are concerned that many smaller user-to-user platforms will not be subject to category 1 requirements, even if they host significant volumes of harmful content. We therefore urge the Government to take a risk-based rather than size-based approach to categorising companies, so that all platforms with a high-risk profile, including smaller platforms, are subject to the strongest duties to prevent harm.
  • The central aim of the Bill is to protect users from harm and associated risk. The Bill provides a definition of harm, but the Government will set out in secondary legislation ‘priority categories’ of types of harm that certain services will have to take action against. Therefore, it remains unclear what types of harm platforms will need to protect users against. We urge the Government to set out their proposed categories of harmful content for both children and adults as soon as possible and consult with independent experts to define and prioritise harmful content. Ofcom must also be given adequate resources so that it can be agile and produce guidance at pace in line with emerging types of online harm.
  • The LGA recognises the delicate balance this legislation must maintain between preserving users’ freedom of expression and civil liberties whilst also protecting users from harmful content. We welcomed the inclusion of the user verification duty and user empowerment duty in the Bill, as they provide choice to adult users on what content and users they want to engage with, whilst allowing users to remain anonymous should they want or need to.
  • The Bill introduces four new communication-based criminal offences: a harm-based communication offence, a false communication offence, a threatening communication offence and a cyber-flashing offence. This is a positive step forward; however, progress in holding perpetrators to account will depend on the police and CPS being given adequate resources, training, and comprehensive guidance to ensure these offences are used appropriately. The LGA previously called for cyber-flashing to be made a criminal offence, so we welcome its inclusion in the Bill. To deliver the best outcomes for young people, it is vital the police and CPS are instructed to take a safeguarding approach for children and young people when applying this offence, rather than a criminal approach.
  • The approach set out within the Bill to regulate harmful content, create new offences that recognise online harms and enable adult users to decide what content and users they interact with is a positive step forward. However, the Government and Ofcom should go further, with more robust provisions to manage abuse that falls below the new communication-based criminal threshold and to tackle mis- and disinformation. This should include amending the Bill to ensure the democratic and journalistic protections do not inadvertently protect perpetrators of abuse.
  • Our new report ‘Debate Not Hate’ highlights that councillors are experiencing increasing levels of online intimidation, abuse and threats, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. The LGA Councillor Census 2022 found that seven in 10 councillors reported experiencing abuse and/or intimidation over the last 12 months, the majority of which is carried out online or via social media. We want to work with Government to ensure this Bill makes the progress that is needed to make the internet safer for our communities.

Categorisation of internet services

  • All online services will be designated as category 1, 2A or 2B services; their category will depend on the number of users (size) and the functionalities of that service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
  • Crucially, only services that are ‘user-to-user’ services (an internet service which allows users to generate, upload or share content) and meet category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those that are the highest risk and with the largest user-bases, such as the main social media platforms.

LGA View

  • The LGA are concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.
  • A recent academic study conceptualising “dark platforms” looked at how Covid-19-related content, especially conspiracy theories, was communicated on such platforms, for example 8Kun. It found that less regulated digital platforms can be used to host content and content creators that may not be tolerated by more mainstream platforms. As currently drafted, there is a risk that the Bill could inadvertently push harmful content onto these sites.
  • We therefore urge the Government to set out as soon as possible what companies will fall into which category and reconsider their approach to categorising services. Instead, the Government should take a ‘risk-based’ approach to categorising services to ensure that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1.
  • The Government have committed to undertaking further research to determine whether there is sufficient evidence to expand the duties on small but risky platforms. Whilst this step is welcome, the LGA believes the evidence already exists to regulate smaller, higher-risk platforms more thoroughly.

Amendment NC1, tabled by Sir Jeremy Wright MP

  • This amendment would enable Ofcom, when re-categorising services, to require them to comply immediately with the relevant duties on a provisional basis, pending a full re-assessment.
  • Currently, Ofcom will be given six to 18 months from when the Act is passed to undertake research and recommend to Government which categories individual services will fall under. Any decision by the Secretary of State to categorise services will be conditional on reviewing Ofcom’s research. However, it remains unclear how quickly Ofcom would be able to re-categorise services or categorise new services.

LGA View

  • We support this amendment. The LGA welcomed the Minister’s reassurances during Second Reading of the Bill that companies can move between categories, and different parts of a large conglomerate can be regulated differently depending on their activities. However, we have raised concerns previously about the time this might take Ofcom.
  • The Government have committed to bringing forward amendments within the House of Lords to ensure Ofcom can identify and publish a list of companies that are close to the Category 1 thresholds. This is to ensure Ofcom proactively identifies emerging risky companies and is ready to assess and add these companies to the Category 1 register without delay. This is strongly welcomed by the LGA.
  • The fast-moving nature of the internet means that content, trends, or new sites can gain traction amongst users quickly, including harmful content and associated risks. This amendment would enable Ofcom to respond quickly to emerging harms and impose duties immediately whilst they re-assess the service’s category.
  • More broadly, we urge Government to ensure Ofcom has adequate resources in order to respond quickly to both emerging harms and new platforms.


Protecting children online

  • Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming is a serious and growing crime. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.
  • According to freedom of information data obtained by the National Society for the Prevention of Cruelty to Children (NSPCC), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70 per cent in three years.
  • The LGA strongly welcomes the Government’s ambition to ensure children are safe online; however, there are a number of ways this Bill can be strengthened to ensure it effectively tackles the range of ways in which abusers use social networks.

Amendment 17, tabled by Alex Davies-Jones MP

  • This amendment would incorporate into the relevant child safety duties a requirement to consider cross-platform risk.
  • The LGA welcomed the assurances from Government during Committee Stage that if Ofcom finds that services are not engaging in appropriate collaborative behaviour, which means they are not discharging their duty to protect people and children, it can intervene.

LGA View

  • Online abuse and exploitation often happen across multiple platforms: for example, young people will often play a game together on one platform whilst talking with one another on a separate service, or they may meet on a more ‘child-friendly’ site before moving to another that offers alternative contact options.
  • Given the cross-platform risk posed to children, we would ask that the Government goes further and instructs Ofcom to specifically address cross-platform risks. This should include placing a clear requirement on platforms to co-operate and respond to cross-platform harms when discharging their safety duties.

Amendment 162, tabled by Kirsty Blackman MP – child user condition

  • The amendment removes the word ‘significant’ from the child user condition in clause 31. It has the effect of removing the requirement for an online platform to have a ‘significant’ number of child users for it to be subject to the child safety duty, and replaces it with ‘a number’ of child users.

LGA View

  • Given the risk of harm to children on social media platforms, we are concerned about the proposed ‘child user condition’ within Clause 31 of the Bill. At present, platforms will only have to comply with the child safety duty if they have a significant number of child users or if children form a significant part of their user base. This condition could therefore inadvertently result in some services claiming they are excluded from the child safety duties because children do not account for a ‘significant’ portion of their user base.
  • We urge Parliamentarians to remove this child user condition from the Bill and replace it with a risk-based framework for assessing children’s access to a platform and their likely exposure to harmful content on it. Alternatively, we would support Parliamentarians removing the child user condition in its entirety so that all services have to adhere to the relevant child safety duties.

Definitions of harm

  • The central premise of the Bill is to protect users from harm and associated risk. Whilst the Bill provides a definition of harm and sets out how harmful content will be prioritised for child and adult users, it remains unclear what specific harms platforms will need to protect users against.
  • The LGA believes that content that encourages, promotes, or instructs users in harmful behaviour should be considered harmful content within the Bill. We would urge the Government to set out during the passage of the Bill how they will designate harms into these categories for both children and adult users.

Amendment 20, tabled by Alex Davies-Jones MP

  • This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

LGA View

  • We support this amendment and urge the Government to put on record their intention of making public health-related mis- or disinformation a priority harm.
  • Content that encourages, promotes, or instructs users to engage in harmful behaviour should be considered harmful content within the Bill. We are concerned at the limited references and provisions in the Bill to tackle harms from online mis- and disinformation. Whilst ‘priority content’ harmful to adults might include mis- and disinformation, this is yet to be clarified.

Protection of democratic and journalistic content

  • Clauses 15 and 16 place a duty on category 1 services to protect content of democratic and journalistic importance. This includes decisions about whether to take content down, restrict access to it, or act against a user of the service. For example, category 1 services could adopt processes to identify content of democratic and journalistic importance and ensure users have access to it, even where it might otherwise be removed.
  • As currently drafted, it is unclear if broad interpretations of clauses 15 and 16 could unintentionally protect harmful disinformation and abuse that could be classified by some as political speech or journalism.

Amendment 10, tabled by Sir Jeremy Wright MP

  • This amendment would refine the definition of journalistic content in clause 16, by removing the protection it gives to ‘regulated user-generated content’. This amendment has the effect of only protecting content of journalistic importance that is published by a regulated news publisher.

LGA View

  • Councillors, particularly those with protected characteristics, are experiencing increasing levels of online intimidation, abuse and threats, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. The LGA Councillor Census 2022 found that seven in 10 councillors reported experiencing abuse and/or intimidation over the last 12 months, and 73 per cent of respondents with multiple experiences of abuse stated that they received abuse via social media.
  • Our research also found that people perpetrating abuse against councillors sometimes defined themselves as ‘citizen journalists’, and there is evidence that police apply a very high threshold for taking action even when abuse could constitute a hate crime. The LGA regularly hears reports of councillors being subjected to abuse and campaigns of disinformation, often intended to smear the councillor’s character, which the perpetrator then defends as journalism.
  • Democratic debate and high-quality journalism (from large organisations through to citizen journalists providing important local information) are essential democratic principles which must be protected. However, there must also be safeguards to ensure that these protections are not abused and used to shield actions which amount to harassment, which are intended to intimidate a councillor into changing their position, or which stoke division through misinformation.
  • We therefore welcome this amendment which refines the definition of journalistic content by removing the ability of citizen journalists to claim protection under this clause.
  • More broadly, we would like to ensure that the definition of a regulated news publisher is clear and robust. It is unclear whether far-right figures, for example Tommy Robinson, could be categorised as a regulated news publisher if they have an official website. It is imperative that journalistic content is still not permitted to be abusive or to spread misinformation.