Online Safety Bill, Third Reading, House of Lords - 6 September 2023

The LGA recognises the delicate balance this legislation must maintain between preserving users’ freedom of expression and civil liberties whilst also protecting users from harmful content.

Key messages

  • The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for the regulation by Ofcom of certain internet services. The regulation proposed in this Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and their associated risk, particularly to children and young people. The Bill primarily does this by introducing duties of care to some user-to-user services (e.g. social media sites) and search engines. The Bill also imposes duties on such providers in relation to the protection of users’ rights to freedom of expression and privacy.
  • Under the provisions of the Bill, all regulated services will have a duty of care in relation to illegal content and if services are deemed accessible by children, a duty to protect children from harm. Further to this, the Bill sets out that regulated services will be categorised into three categories within which different duties will apply. For example, providers of user-to-user services which meet a specified threshold (“Category 1 services”) are subject to additional duties. Further clarity is needed as to how services will be categorised.
  • The LGA recognises the delicate balance this legislation must maintain between preserving users’ freedom of expression and civil liberties whilst also protecting users from harmful content. We therefore welcome the user verification and user empowerment duties within this Bill that apply to category 1 services. The LGA welcomes these duties as they give adult users choice over what content and users they want to engage with, whilst also allowing users to remain anonymous should they want or need to.
  • The OSB introduces a number of new criminal offences: a false communication offence; a threatening communication offence; an offence of sending flashing images with the intent of causing epileptic seizures; and a cyber-flashing offence. The LGA previously called for cyber-flashing to be made a criminal offence, so we welcome its inclusion within the Bill. Overall, these offences are a useful provision to ensure that individuals posting harmful content are held to account. However, their effectiveness will depend on the police and CPS being given adequate resources, training, and comprehensive guidance to ensure these offences are used appropriately.
  • The LGA broadly welcomes the new threatening and false communication offences, as well as the user empowerment and verification duties that will enable users to control what content and users they interact with. However, we encourage the Government and Ofcom to go further and adopt clearer and more robust provisions to manage ‘low-level’ abuse experienced by councillors that falls below the criminal threshold. As part of this, the LGA would like assurances from the Government that the democratic and journalistic protections set out in this Bill will not inadvertently protect perpetrators of abuse.
  • Councillors are experiencing increasing levels of online intimidation, abuse and threats made against them, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. We hope this Bill will go some way towards addressing the concerns we have heard from our membership. However, we regret the removal of the harm-based communications offence by the Government at Committee stage in the Commons, which could have been an important tool in tackling this intimidation, harassment and abuse.
  • To ensure online service providers adhere to their new responsibilities, the Bill introduces new regulatory powers and responsibilities for Ofcom. Ofcom will be responsible for drafting codes of practice for all duties and ensuring online platforms have the systems in place to adhere to these responsibilities; it also has powers to hold services to account should it need to. We ask that Ofcom engages fully with relevant groups such as political parties and the LGA when developing its codes of practice, to ensure that unintended consequences are considered. Ofcom must also be given adequate resources so that it can be agile and produce guidance at pace in line with emerging ‘harmful’ issues, as well as being able to take effective enforcement action and deal adequately and speedily with complaints.
  • The LGA supported the Draft Online Safety Bill Joint Committee’s recommendation calling for Ofcom to publish a ‘safety by design’ code of practice. It is disappointing that this has not been adopted; the LGA encourages the Government to produce an overarching ‘safety by design’ code of practice that can be referenced and adopted within the individual codes of practice.
  • Limited changes have been made to the Bill at Lords Committee and Report Stages, and while many of these are of a technical nature, the LGA welcomes the introduction of a new offence of encouraging or assisting another person to seriously self-harm; the creation of a new offence of failing to comply with an Ofcom confirmation decision in relation to the duties to protect children’s online safety; and the strengthening of requirements on the effectiveness of the technology used to allow adult users of social media services to engage only with other users who have verified identities.
  • The LGA particularly welcomes the adoption of the amendment moved by Baroness Morgan at Report Stage, which represents a move towards a risk-based approach to the classification of digital platforms – and therefore the level of duty applied to them. The Bill requires Ofcom to categorise digital platforms in order to determine the duties they will be required to adhere to, including on user safeguarding and the provision of user empowerment tools. As originally introduced, the Online Safety Bill would only allow Ofcom to place platforms with the largest user bases in category 1, which carries the most additional duties. Along with other stakeholders, the LGA has lobbied for a risk-based approach that would permit smaller platforms to be placed in category 1, as such platforms can have significant local reach, can contribute to issues around misinformation, host extremist material or represent a risk to children and young people. Baroness Morgan’s amendment permits Ofcom to place a platform in category 1 based on an assessment of the risk of harm posed by its functionality, even if it has a smaller current user base.

Further information

The Online Safety Bill was published on 17 March 2022. The explanatory notes to the Bill set out the case for a new regulatory framework for internet services. During the Bill’s passage through the Commons, significant changes were made through recommittal at Committee stage, including the removal of most measures relating to ‘legal but harmful’ content in relation to adults.

Many of these proposals were set out in the Draft Online Safety Bill and had already been subject to consultation, following the Government’s Online Harms White Paper and scrutiny from the Draft Online Safety Bill Joint Committee.

This Bill is of significant interest to councils, covering a wide range of issues, from child protection and public health to abuse, intimidation and free speech.

This briefing covers the LGA’s views on selected amendments that are relevant to local communities and councils.

Definitions of harm

  • The central premise of the Bill is to protect users from harm and associated risk. The LGA has previously called for the Government to set out what specific harms platforms will need to protect users against. We welcome the inclusion of a list of harms in the Bill following government amendments at the re-convened Committee stage in the Commons, although for adults this list will now apply only to user empowerment tools, and we have concerns about the exclusion of lower-level abuse and harassment, and about the wider impact of online misinformation and hate on democracy.

  • The Bill now incorporates into the adult user empowerment duties a list of the forms of content that are defined as harmful and to which users should have access to tools to control their exposure. The definition covers content which encourages, promotes or provides instructions for suicide, self-harm or eating disorders, and content which is abusive towards, or incites hatred against, people with a protected characteristic. Given recent developments, such as the removal, subsequently rescinded, of suicide prevention prompts on Twitter in December 2022, the LGA welcomes the specific inclusion of suicide and self-harm on the face of the Bill.

LGA view

  • It is right that the Online Safety Bill protects the freedom of speech of users, including the freedom of users to criticise issues such as government policy. However, as it stands, the failure of social media companies to address abuse and harassment is driving people away from their platforms, reducing their opportunity to contribute to important debates and limiting their own freedom of speech. Evidence shows that those with certain characteristics, including women, disabled people and those from black and minority ethnic backgrounds, receive more abuse than others, and this can result in people being less likely to speak out about complex issues or engage in online debate; failure to tackle this contributes to the existing challenges we face in relation to the representation and inclusion of minority groups. While the LGA welcomes that these amendments recognise that those with certain protected characteristics are particularly targeted for abuse and harassment online, we regret that they place the onus on individuals to take steps to avoid such abuse or harassment.
  • One of the challenges of placing responsibility on users to deal with online abuse themselves is that, even if a user ignores or blocks a particular user, this does not always stop that content being published, potentially stirring up ill feeling online. Evidence has shown that people are recommended content that aligns with their existing views, leading to ‘echo chambers’ online where abusive content can be shared and amplified. This material can become increasingly extreme over time, providing a possible gateway into other harmful platforms and an increasing interest in violence. Therefore, while tools to support users are important, it is essential to prevent harmful content, in particular violent and threatening content, from being published and shared in the first place by building safety into platform design.

  • The LGA is concerned that the definitions of content to be covered by user empowerment tools in the Bill are limited and may not incorporate substantial elements of the abuse and harassment that councillors and other elected politicians face online, whether as a result of protected characteristics or simply by virtue of their being public office holders. Providers of social media platforms should be encouraged to introduce specific safeguards for those holding elected office, including fast-track routes to report abuse, intimidation and harassment.

  • The LGA believes that, as the Bill stands, it does not do enough to recalibrate responsibility for adult online safety so that it is shared between user and service provider. The Bill as amended at the re-convened Commons Committee stage provides substantial and welcome tools to empower adult users to calibrate their online experience and the content they see, but it means that the onus lies almost solely on the user to take steps in relation to potentially harmful content. It should not be the sole responsibility of, for example, a victim of online harassment to take action to avoid such content. Introducing a requirement that the Secretary of State and Ofcom (and therefore, by extension, the providers of regulated internet services) undertake their functions with reference to the overarching principle that regulated internet services should be safe by design would rebalance that responsibility.

  • The LGA also welcomes the specific inclusion of suicide, self-harm and eating disorders on the face of the Bill as areas to be regulated through user empowerment tools. However, given the very high level of potential risk to individuals from exposure to such material, it does not appear appropriate – from a public health perspective – for the default position of a platform to be that users are liable to be exposed to such material. The LGA would support a requirement that the default position on all platforms should be that users are not exposed to such material, and that users must pro-actively employ user empowerment tools if they wish to change that.

  • It is vital that the Bill recognises that disproportionate harm is experienced by people on the basis of their protected characteristic(s) and that statutory duties should be exercised with reference to this. Surveys have found that female, BAME or LGBTQ+ candidates and elected members experience significantly higher levels of online abuse and harassment.

Impact on councillors, candidates and democracy

  • Councils are concerned that democracy can be impacted through abuse and harassment experienced on social media. This can particularly impact those from minority groups: for example, research by Amnesty International analysing tweets that mentioned women MPs in the run-up to the 2017 General Election found that the 20 Black, Asian and Minority Ethnic MPs received 41 per cent of the abuse, despite making up less than 12 per cent of those in the study. This type of treatment can make people less willing to speak out on important topics and can even lead to people standing down from political office. In its Debate Not Hate report last year, the LGA found that online abuse and smear campaigns can deter people from standing for election or for leadership positions, and that this particularly applied to some people with protected characteristics. It is of vital democratic importance that people from all backgrounds feel safe to stand for election, helping to ensure that our governments reflect our communities. However, with those with certain protected characteristics more likely to be targeted with abuse, the way in which social media currently operates makes this increasingly difficult.

  • Councillors and candidates at local government elections, like other elected officials, are often subject to substantial levels of abuse and harassment online, which goes well beyond legitimate scrutiny and criticism. The default position should be that the onus rests upon digital platforms to ensure that elected officials and candidates for office – like any other individual – are not subject to extremes of abuse and harassment, rather than for the targeted individuals to be required to take action to avoid it.

Disinformation and misinformation

  • The spread of misinformation and disinformation online is of increasing concern to many councils, in areas including (but not limited to) public health, asylum and refugee accommodation, and planning and transport policy. This represents a broad risk to public wellbeing and good governance, but also a specific one for councillors and council officers, who can become targets for abuse, harassment and intimidation on the basis of such misinformation.

  • The LGA welcomes the commitment in the Bill to establish an advisory committee on disinformation and misinformation, in recognition of the growing scale of this issue. It is appropriate, given the level of concern, that it should be established speedily, and should consider whether a dedicated code of practice would be of value to both service providers and regulators.

  • During the Covid pandemic, misinformation, conspiracy theories and anti-vaccine content on social media proved a significant challenge globally. The World Health Organisation suggested the spread of misinformation, “amplified on social media and other digital platforms, is proving to be as much a threat to global public health as the virus itself.”

  • Many councils found that misinformation spread via social media proved a challenge in ensuring compliance with Covid regulations and in ensuring high levels of vaccine take-up. Historically, similar patterns have been seen in relation to, for example, take-up of the MMR jab and subsequent increases in cases of mumps.
  • Where content can be proven to be false, this should fall within the definition of “content that is harmful to adults”, and there should be a clear duty on regulated services to remove, clearly label or otherwise meaningfully restrict access to such content.
  • The LGA supports the creation of a false communications offence. A significant issue for many of our members is that of smear campaigns, in which falsehoods are shared about councillors, prospective councillors or officers to prevent them from holding office or to cast doubt on their professional competence. Such deliberate communication of false information can amount to harassment or lead to significant psychological harm – or even physical harm – alongside professional and personal damage to councillors and candidates. Much of this is shared by those claiming to be acting in the name of journalism or political debate. While such content could in theory be removed as misinformation or challenged via expensive libel claims, in practice this rarely happens.
  • Following the removal from the Bill of the clauses on ‘legal but harmful’ content, which would in some cases have required service providers to regulate, by removal or otherwise, deliberately false and harmful communications, it is of increased importance to retain the criminal offence as an option to dissuade individuals from deliberately communicating harmful falsehoods. This is a complex area where freedom of speech must be carefully protected. The LGA believes that the narrowly drawn offence in Clause 60, and the exemptions provided in Clause 61, do ensure that the offence would only impact on deliberately harmful instances of false communications, and would not impact on legitimate political debate and disagreement, on journalism, or on genuine misunderstandings or differences of perspective around a given set of facts.

Categorisation of internet services

  • All regulated online services will be designated as category 1, 2A or 2B services; their category will depend on the number of users (size) and the functionalities of the service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.

  • Crucially, only ‘user-to-user’ services (internet services which allow users to generate, upload or share content) that meet the category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those that are the highest risk and have the largest user bases, such as the main social media platforms.

  • At the re-convened Commons Committee stage, government amendments were added to the Bill which would require Ofcom to maintain a list of platforms or other online services that are close to the threshold for classification as category 1 services, enabling faster action to impose greater safety responsibilities when emerging platforms become influential. The LGA strongly welcomes this. We also strongly welcome Baroness Morgan’s amendment, accepted at Lords Report stage, which enables Ofcom to classify smaller platforms as category 1 services if it assesses that their functionality represents a risk. This would ensure that small but high-harm platforms, including dedicated hatred and harassment sites, can be subject to stricter regulation and to appropriate risk mitigation under the triple shield. There has been cross-party support for this amendment, and similar amendments, at all stages of the Bill’s passage through the Lords.

LGA view

  • The LGA supports a move towards a risk-based approach to categorisation, rather than one based solely on audience size. We are concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.  
  • A recent academic study conceptualising “dark platforms” looked at how Covid-19-related content, especially conspiracy theories, was communicated on such platforms, including 8Kun. It found that digital platforms that are less regulated can be used to host content and content creators that may not be tolerated by more mainstream platforms. As currently drafted, there is a risk that the Bill could inadvertently push harmful content onto these sites.

  • We therefore urge the Government to set out as soon as possible which companies will fall into which category, and to reconsider its approach to categorising services. The Bill should ensure that Ofcom can take a ‘risk-based’ approach to categorising services, so that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1. We urge that Baroness Morgan’s amendment be retained as part of the Bill in the remaining Lords and Commons stages.

  • The fast-moving nature of the internet means that content, trends, or new sites can gain traction amongst users quickly, including harmful content and associated risks. In undertaking its duties under this Bill, it is important that Ofcom is able to react rapidly and nimbly to emerging platforms and risks, including being able to re-categorise to higher levels speedily. We therefore strongly support the new register of ‘emerging category 1 services’, but it is vital that Ofcom has adequate resources to act swiftly and comprehensively in a complex and rapidly changing online environment.

Protecting children online 

  • Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming are serious and growing crimes. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.

  • According to freedom of information data obtained by the National Society for the Prevention of Cruelty to Children (the “NSPCC”), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70 per cent in three years.

  • The LGA strongly welcomes the Government’s ambition to ensure children are safe online, and the fact that child protection provisions were excluded from the amendments moved by the Government in the Commons relating to harmful content for adults. However, we are concerned that the move away from a regulatory approach to harmful content for adults increases the risk of children and young people accessing misinformation, content relating to self-harm and suicide, or extremist content, given the porous nature of the internet. We would urge the Government to engage with child protection experts in identifying what extra steps can be taken to address this.

  • At the re-convened Commons Committee stage, government amendments were added to the Bill requiring Ofcom to provide guidance, including examples, to service providers on content they should consider to be harmful to children, as well as on what content should be covered by user empowerment tools for adults. The amendments set out that the guidance should specifically cover the encouragement of suicide, self-harm and eating disorders, and abuse or incitement to hatred based on protected characteristics. The LGA welcomes the creation of this new duty. Ofcom should engage with child protection experts, and with children themselves, in developing this guidance.

LGA view

  • The LGA supports moves to ensure that app stores are covered by the same level of regulation as social media sites in relation to child safety. App stores are a primary route through which online users initially access a range of user-to-user services, platforms and other routes to online content. There is therefore a role for providers of app stores to play in ensuring that children do not have access to harmful content or platforms where they could be subject to exploitation or grooming.

  • Many app store providers already undertake risk assessment or age restriction measures in relation to the apps they choose to host. This should be standard good practice, and extending regulation to app stores would provide a double check against children accessing harmful content in many cases, by requiring both the app store and the service provider to undertake risk assessments and pro-active protective measures.
  • The LGA supports proposals for the creation of an advocacy body for child users of regulated internet services. The NSPCC has highlighted the importance of ensuring appropriate user advocacy mechanisms are in place when Ofcom develops its overarching risk assessment and risk profiles, to counterbalance industry influence. The LGA agrees that the voices of all internet users – including children and young people, vulnerable adults, and parents and carers – must continue to be heard as different elements of the Bill are put into practice. Only by considering the ‘real world’ impact of online activity – both positive and negative – can we hope to effectively ensure online spaces that allow us to safely harness all the benefits offered by social media and search platforms.
  • The LGA supports proposals to strengthen requirements around the level of confidence that age verification tools employed by service providers must achieve in order to protect children. Ofsted’s 2021 review of sexual abuse in schools and colleges found that leaders were concerned about problems created by children and young people’s easy access to pornography. The review cited evidence that viewing pornography can shape unhealthy attitudes, including sexual aggression towards women, with more frequent consumption associated with victim-blaming attitudes.
  • The Online Safety Bill must introduce robust age verification controls for all commercial providers of online pornography. We regret that the Government is not progressing with Part 3 of the Digital Economy Act 2017, which would provide the option for payment providers to withdraw their services from infringing sites.