Online Safety Bill, Committee Stage, House of Lords, 19 April 2023

The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for the regulation by Ofcom of certain internet services.


Key messages

  • The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for the regulation by Ofcom of certain internet services. The regulation proposed in this Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and the associated risks, particularly to children and young people. The Bill primarily does this by introducing duties of care on providers of some user-to-user services (e.g. social media sites) and search engines. The Bill also imposes duties on such providers in relation to the protection of users’ rights to freedom of expression and privacy.
  • Under the provisions of the Bill, all regulated services will have a duty of care in relation to illegal content and, where a service is likely to be accessed by children, a duty to protect children from harm. Further to this, the Bill sets out that regulated services will be categorised into three categories within which different duties will apply. For example, providers of user-to-user services which meet a specified threshold (“Category 1 services”) are subject to additional duties. Further clarity is needed as to how services will be categorised.
  • The LGA recognises the delicate balance this legislation must maintain between preserving users’ freedom of expression and civil liberties whilst also protecting users from harmful content. We therefore welcome the user verification and user empowerment duties within this Bill that apply to category 1 services. The LGA welcomes these duties as they provide choice to adult users on what content and users they want to engage with whilst also allowing users to remain anonymous should they want or need to.
  • The LGA broadly welcomes the new threatening and false communication offences, as well as the user empowerment and verification duties that will enable users to control what content and users they interact with. However, we encourage the Government and Ofcom to go further and adopt clearer and more robust provisions to manage ‘low-level’ abuse experienced by councillors that falls below the criminal threshold. As part of this, the LGA would like assurances from the Government that the democratic and journalistic protections set out in this Bill will not inadvertently protect perpetrators of abuse.
  • Councillors are experiencing increasing levels of online intimidation, abuse and threats made against them, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. We hope this Bill will go some way in addressing the concerns we have heard from our membership. However, we regret the removal of the harm-based communications offence by the government at committee stage in the Commons, which could have been an important tool in tackling this intimidation, harassment and abuse.
  • To ensure online service providers adhere to their new responsibilities, the Bill introduces new regulatory powers and responsibilities for Ofcom. Ofcom will be responsible for drafting codes of practice for all duties and ensuring online platforms have the systems in place to adhere to these responsibilities; they also have powers to hold services to account should they need to. We ask that Ofcom engages fully with relevant groups such as political parties and the LGA when developing its codes of practice to ensure there is consideration of unintended consequences. Ofcom must also be given adequate resources so that they can be agile and produce guidance at pace in line with emerging ‘harmful’ issues, as well as being able to take effective enforcement action and deal adequately and speedily with complaints.
  • The LGA supported the Draft Online Safety Bill Joint Committee’s recommendation calling for Ofcom to publish a ‘safety by design’ code of practice. It is disappointing that this has not been adopted; the LGA encourages the Government to produce an overarching ‘safety by design’ code of practice that can be referenced and adopted within the individual codes of practice.

Further information

The Online Safety Bill was published on 17 March 2022. The explanatory notes to the Bill set out the case for a new regulatory framework for internet services. During the Bill’s passage through the Commons, significant changes were made through recommittal at Committee stage, including the removal of most measures relating to ‘legal but harmful’ content in relation to adults.

Many of these proposals were set out in the Draft Online Safety Bill and were already subject to consultation, following the Government’s Online Harms White Paper and scrutiny from the Draft Online Safety Bill Joint Committee.

This Bill is of significant interest to councils, covering a wide range of issues, from child protection and public health to abuse, intimidation and free speech.

This briefing covers the LGA’s views on selected amendments that are relevant to local communities and councils.

Definitions of harm

  • The central premise of the Bill is to protect users from harm and associated risk. The LGA has previously called for the government to set out what specific harms platforms will need to protect users against. We welcome the inclusion of a list of harms in the Bill following government amendments at re-convened Committee stage in the Commons, although for adults these will now only apply to user empowerment tools, and we have concerns at the exclusion of lower-level abuse or harassment and the wider impact of online misinformation and hate on democracy.
  • The Bill now incorporates into the adult user empowerment duties a list of the forms of content that would be defined as harmful and to which users should have access to tools to control their exposure. The definition incorporates encouragement, promotion or instructions for suicide, self-harm and eating disorders; or content which is abusive towards or incites hatred towards people with a protected characteristic. Given recent developments, such as the removal, subsequently rescinded, of suicide prevention prompts on Twitter in December 2022, the LGA welcomes the specific inclusion of suicide and self-harm on the face of the Bill.

LGA View 

  • It is right that the Online Safety Bill protects the freedom of speech of users, including the freedom of users to criticise issues such as government policy. However, as it stands, the failure of social media companies to address abuse and harassment is driving people away from their platforms, reducing their opportunity to contribute to important debates and limiting their own freedom of speech. Evidence shows that those with certain characteristics, including women, disabled people and those from black and minority ethnic backgrounds, receive more abuse than others. This can result in people being less likely to speak out about complex issues or engage in online debate; failure to tackle this contributes to the existing challenges we face in relation to the representation and inclusion of minority groups. While the LGA welcomes that these amendments recognise that those with certain protected characteristics are particularly targeted for abuse and harassment online, we regret that the Bill places the onus on individuals to take steps to avoid such abuse or harassment.
  • Councils are concerned that democracy can be impacted through abuse and harassment experienced on social media. This can particularly impact on those from minority groups: for example, research by Amnesty International analysing tweets that mentioned women MPs in the run-up to the 2017 General Election found that the 20 Black, Asian and Minority Ethnic MPs received 41 per cent of the abuse, despite making up less than 12 per cent of those in the study. This type of treatment can make people less willing to speak out on important topics and, as the National Assembly for Wales found, can even lead to people standing down from political office. In its Debate Not Hate report last year, the LGA found that online abuse and smear campaigns can deter people from standing for election or for leadership positions and that this particularly applied to some people with protected characteristics. It is of vital democratic importance that people from all backgrounds feel safe to stand for election, helping to ensure that our governments reflect our communities. However, with those with certain protected characteristics more likely to be targeted with abuse, the ways in which social media currently operates make this increasingly difficult.
  • One of the challenges of placing responsibility on users to deal with online abuse themselves is that, even if a user ignores or blocks a particular user, this does not always stop that content being published, potentially stirring up ill feeling online. Evidence has shown that platforms promote content that aligns with users’ existing views, leading to ‘echo chambers’ online where abusive content can be shared and amplified. This material can become increasingly extreme over time, providing a possible gateway into other harmful platforms and an increasing interest in violence. Therefore, while tools to support users are important, preventing harmful content, in particular violent and threatening content, from being published and shared in the first place by building safety into platform design is essential.
  • The LGA is concerned that the definitions of content to be covered by user empowerment tools in the amended Bill are limited and may not incorporate substantial elements of the abuse and harassment that councillors and other elected politicians face online, whether as a result of protected characteristics, or simply by virtue of their being public office holders. Providers of social media platforms should be encouraged to introduce specific safeguards for those holding elected office, including fast-track routes to report abuse, intimidation and harassment.
  • Content that encourages, promotes, or instructs users to engage in harmful behaviour should be considered harmful content within the Bill. We are concerned at the limited reference and provisions to tackle personal and democratic harms from online mis- and disinformation in the Bill.

 

New clause to be inserted after Clause 1, tabled by Lord Stevenson of Balmacara and Lord Clement-Jones

  • This new clause seeks to clarify the purpose of the Bill in line with the recommendations of the Joint Committee which carried out pre-legislative scrutiny of the Bill and would require the Secretary of State and Ofcom to act with regard to those purposes in exercising their functions under the Bill.
  • The purposes set out by the amendment would reiterate the need to protect freedom of speech and support the government’s position that a higher level of protection online is appropriate for children than adults.
  • However, the amendment would also clarify that the Secretary of State and Ofcom should exercise their statutory duties with regard to:
    • the disproportionate level of harm experienced by people online on the basis of protected characteristics they hold; and
    • the principle that regulated internet services should be safe by design.

LGA View

  • The LGA supports this amendment, and we consider it appropriate that the Joint Committee’s recommendations are reinforced on the face of the Bill.
  • This amendment has the potential to recalibrate responsibility for adult online safety so that it is shared between user and service provider. The amended Bill after the re-convened Commons committee stage provides substantial and welcome tools to empower adult users to calibrate their online experience and the content they see but means that the onus lies almost solely on the user to take steps in relation to potentially harmful content. It should not be the sole responsibility of, for example, a victim of online harassment, to take action to avoid such content. Introducing a requirement that the Secretary of State and Ofcom (and therefore, by extension, the providers of regulated internet services) undertake their functions with reference to the overarching principle that regulated internet services should be safe by design can rebalance that responsibility.
  • We strongly welcome an overarching recognition that disproportionate harm is experienced by people on the basis of their protected characteristic(s) and that statutory duties should be exercised with reference to this. Surveys have found that female, BAME or LGBTQ+ candidates and elected members experience significantly higher levels of online abuse and harassment.

Amendment to Clause 12, lines 9 and 17, tabled by Baroness Morgan of Cotes, Baroness Parminter and the Lord Bishop of Gloucester; and Amendments to Clause 12, lines 19 and 23, tabled by Lord Clement-Jones

  • These amendments relate to the user empowerment tools that the Bill requires regulated digital platforms to introduce to enable adult users to decide what level of control they want over ‘legal but harmful’ content they could be exposed to. As the Bill stands, platforms would be able to default to the lowest level of control, with users needing to ‘opt-in’ to user empowerment tools that would limit such exposure.
  • At the re-convened committee stage in the Commons, the Government introduced definitions of the type of content that would be expected to be covered by user empowerment tools for adults. This includes encouragement, promotion or instructions for suicide, self-harm and eating disorders; or content which is abusive towards or incites hatred towards people with a protected characteristic.
  • These amendments would require services to, by default, have their user-empowerment tools set to the safest available level, with users then having the opportunity to pro-actively decide to reduce the levels of protection.

LGA view

  • The LGA supports these amendments. The Government’s previous decision to remove clauses around harmful content for adults moved the onus from the provider to regulate such content on to the user to disengage. The position advocated in these amendments represents a middle ground whereby concerns over freedom of speech in relation to content that could be considered ‘legal but harmful’ are addressed, but protection from such content is the default.
  • These amendments would recalibrate responsibility for adult online safety so that it is shared between user and service provider. The amended Bill after committee stage provides substantial and welcome tools to empower adult users to calibrate their online experience and the content they see, but means that the onus lies almost solely on the user to take steps in relation to potentially harmful content. It should not be the sole responsibility of, for example, a victim of online harassment, to take action to avoid such content.
  • Councillors and candidates at local government elections, like other elected officials, are often subject to substantial levels of abuse and harassment online, which goes beyond legitimate scrutiny and criticism. The default position should be that the onus rests upon digital platforms to ensure that elected officials and candidates for office – like any other individual – are not subject to extremes of abuse and harassment, rather than for the targeted individuals to be required to take action to avoid it.
  • Given recent developments, such as the removal, subsequently rescinded, of suicide prevention prompts on Twitter in December 2022, the LGA welcomes the specific inclusion of suicide, self-harm and eating disorders on the face of the Bill as areas to be regulated through user empowerment tools. However, given the very high level of potential risk to individuals from exposure to such material, it does not appear appropriate – from a public health perspective – for the default position of a platform to be that users would be liable to be exposed to such material. The LGA supports these amendments in requiring that the default position on all platforms should be that users should not be exposed to such material, and that users must pro-actively employ user empowerment tools if they wish to change that.

New clause to be inserted after Clause 12, tabled by Baroness Stowell of Beeston, Baroness Bull and Baroness Featherstone

  • This amendment would require category 1 services (i.e. those services considered to represent the highest risk) to undertake a suitable and sufficient adults’ risk assessment; to keep the risk assessment up to date; and to undertake a new risk assessment when a significant change is made to the service’s design or operation.
  • The risk assessment would be required to include an assessment of the level of risk of adult users of the service encountering material set out in the Bill’s definition of ‘legal but harmful’ material (i.e. suicide, self-harm, eating disorder promotion, abuse or incitement to hatred based on protected characteristics), and how easily such material could be spread e.g. via the algorithms employed by the service. An appraisal of how the design and operation of the service reduces or increases the level of risks would be required to be part of the assessment.
  • The assessment would also be required to consider the extent to which user empowerment tools might result in interference with users’ rights to freedom of speech.

LGA view

  • The LGA supports this amendment. As it stands, the Bill sets out a range of material that could be considered harmful to adults. The LGA welcomes the scope of this definition, both for reasons of promoting the safety of councillors and officers and of civility in public life; and to reduce exposure to material promoting suicide, self-harm or eating disorders. It is therefore appropriate that higher-risk platforms be required to undertake a risk assessment of potential exposure to such material and to examine how the operation of the platform impacts on those risks. This is a sensible extension of the children’s risk assessment work that platforms will be required to undertake.

New clause to be inserted after Clause 35, tabled by Baroness Finlay of Llandaff, Lord Knight of Weymouth, Baroness Morgan of Cotes and Baroness Tyler of Enfield

  • This amendment would make harmful material related to suicide or self-harm subject to stricter restriction than other ‘legal but harmful’ content in relation to its access by adults.
  • User-to-user platforms and search engines would be required to establish policies on how they would treat such content: either taking it down, restricting users’ access, or limiting its recommendation or promotion.

LGA view

  • The LGA supports this amendment. Concerns have also been raised about the availability of “pro-suicide” content online, both via social media and search engines. In 2019, the Children’s Commissioner for England published an open letter to social media platforms, arguing that “The recent tragic cases of young people who had accessed and drawn from sites that post deeply troubling content around suicide and self-harm, and who in the end took their own lives, should be a moment of reflection.”
  • Content that encourages, promotes or instructs users in harmful behaviour should be considered harmful content within the Bill and, given the evidence of the impact of such content, it is appropriate that it be subject to strict duties on digital platforms.

New clause to be inserted after Clause 15, tabled by Baroness Merron

  • This amendment seeks to address concerns over health disinformation and misinformation on digital platforms. It introduces a range of statutory duties on category 1 (i.e. the highest risk) services, including maintaining an up-to-date risk assessment of harmful disinformation and misinformation that is present on the service; and to develop and maintain a policy in relation to the treatment of such material.

LGA view

  • The LGA supports this amendment. During the Covid pandemic, misinformation, conspiracy theories and anti-vaccine content on social media proved a significant challenge globally. The World Health Organisation suggested the spread of misinformation, “amplified on social media and other digital platforms, is proving to be as much a threat to global public health as the virus itself.”
  • Many councils found that misinformation spread via social media proved a challenge in ensuring compliance with Covid regulations and in ensuring high levels of vaccine take-up. Historically, similar patterns have been experienced in relation to, for example, take-up of the MMR jab and subsequent increases in instances of mumps.
  • Where content can be proven to be false, this should fall within the definition of “content that is harmful to adults”, and there should be a clear duty on regulated services to remove, clearly label or otherwise meaningfully restrict access to such content.

Amendment to Clause 139, page 124, line 42, tabled by Lord Knight of Weymouth and Lord Clement-Jones; and at page 124, line 42, tabled by Lord Knight of Weymouth

  • These amendments seek to ensure that the advisory committee on misinformation and disinformation that Ofcom is required to establish is set up speedily – within six months of the Act being passed. The second amendment requires that the advisory committee’s first priority is to consider and report upon whether a dedicated Ofcom code of practice on misinformation and disinformation, as required by the Bill in other areas, would be effective.

LGA view

  • The LGA supports these amendments. Misinformation and disinformation spread online is of increasing concern to many councils, including (but not limited to) in areas around public health, asylum and refugee accommodation, planning and transport policy. This represents a broad risk to public wellbeing and good governance, but also a specific one for councillors and council officers, who can become targets for abuse, harassment and intimidation on the basis of such misinformation.
  • The LGA welcomes the commitment in the Bill to establish an advisory committee on disinformation and misinformation, in recognition of the growing scale of this issue. It is appropriate, given the level of concern, that it should be established speedily, and should consider whether a dedicated code of practice would be of value to both service providers and regulators. Baroness Fox of Buckley has given notice that she intends to oppose Clause 139 – which establishes the committee – standing part of the Bill. The LGA believes that the creation of this committee is of genuine importance and that Clause 139 should remain in place.

Clauses 160 and 161 stand part – Lord Moylan has given notice of his intention to oppose the question

  • These clauses establish a false communications offence. To be guilty of an offence, a person must send a message knowing it to be false, intend to cause non-trivial psychological or physical harm to the likely audience, and have no reasonable excuse for sending the message.
  • An offender can be liable for a fine, or a prison sentence of up to one year.
  • Clause 161 exempts a number of categories of organisation from being able to commit an offence under Clause 160, including recognised news publishers and broadcasters.

LGA View

  • The LGA supports the creation of a false communications offence and therefore opposes deleting these clauses. A significant issue for many of our members is that of smear campaigns, in which falsehoods are shared about councillors, prospective councillors or officers to prevent them from holding office or to cast doubt on their professional competence. Such deliberate communication of false information can amount to harassment or lead to significant psychological harm – or even physical harm, alongside professional and personal damage to councillors and candidates. Much of this is shared by those claiming to be doing so in the name of journalism or political debate. While this could in theory be removed as misinformation or via expensive libel claims, in practice this is rarely the case.
  • Following the removal from the Bill of clauses on ‘legal but harmful’ content, which would have in some cases required service providers to regulate, by removal or otherwise, deliberately false and harmful communications, it is of increased importance to retain the criminal offence option to seek to dissuade individuals from the deliberate communication of harmful false communications. This is a complex area where freedom of speech must be carefully protected. The LGA believes that the narrowly drawn offence in Clause 160, and the exemptions provided in Clause 161, do ensure that the offence would only impact on deliberately harmful instances of false communications, and would not impact on legitimate political debate and disagreement, on journalism, or on genuine misunderstandings or differences of perspective around a given set of facts.
 

Categorisation of internet services

  • All regulated services will be designated as category 1, 2A or 2B services; their category will depend on the number of users (size) and the functionalities of that service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
  • Crucially, only services that are ‘user-to-user’ services (an internet service which allows users to generate, upload or share content) and meet category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those that are the highest risk and with the largest user-bases, such as the main social media platforms. 
  • At the re-convened Commons Committee stage, government amendments were added to the Bill which would require Ofcom to maintain a list of platforms or other online services that are close to the threshold for classification as category 1 services, enabling faster action to impose greater safety responsibilities when emerging platforms become influential. The LGA strongly welcomes this. However, we are concerned that at committee stage the minister again insisted that both categorisation and this register would be subject to user number thresholds, rather than a risk-based approach.

Amendment to Schedule 11, page 216, line 30, tabled by Baroness Morgan of Cotes, Baroness Parminter and Lord Mann; new clause to be inserted after Clause 86, tabled by Lord Stevenson of Balmacara; and amendment to Clause 26, page 28, line 6, tabled by Lord Russell of Liverpool, Baroness Harding of Winscombe and Lord Knight of Weymouth

  • These amendments would permit Ofcom to adopt a ‘risk-based’ approach to classifying and regulating user-to-user services. Baroness Morgan’s amendment permits platforms which attract smaller user numbers, but are hubs for extreme hate or other harmful content, to be classified as category 1 platforms and therefore be regulated in the same way as larger user-to-user services.
  • Lord Stevenson’s amendment has a broadly similar effect, permitting Ofcom to urgently re-classify and regulate a service as a category 1 platform, where they consider this necessary to avoid or mitigate serious harm, regardless of the size of the user base.
  • Lord Russell’s amendment would have a similar impact to the other two amendments, but particularly focused on ensuring that the size of a provider’s user base is not allowed to disproportionately impact decisions in relation to child protection duties.

LGA view

  • The LGA supports these amendments. We are concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.  
  • A recent academic study conceptualising ‘dark platforms’ examined how Covid-19-related content, especially conspiracy theories, was communicated on sites such as 8Kun. It found that digital platforms that are less regulated can be used to host content and content creators that may not be tolerated by more mainstream platforms. As currently drafted, there is a risk that the Bill could inadvertently push harmful content onto these sites.
  • We therefore urge the Government to set out as soon as possible what companies will fall into which category and reconsider their approach to categorising services. Instead, the Government should take a ‘risk-based’ approach to categorising services to ensure that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1.  
  • The fast-moving nature of the internet means that content, trends, or new sites can gain traction amongst users quickly, including harmful content and associated risks. In undertaking its duties under this Bill, it is important that Ofcom is able to react rapidly and nimbly to emerging platforms and risks, including being able to re-categorise to higher levels speedily. We therefore strongly support the new register of ‘emerging category 1 services’, but it is vital that Ofcom has adequate resources to act swiftly and comprehensively in a complex and rapidly changing online environment.

Protecting children online

  • Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming is a serious and growing crime. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.  
  • According to data obtained through freedom of information requests by the National Society for the Prevention of Cruelty to Children (the “NSPCC”), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70 per cent in three years.
  • The LGA strongly welcomes the Government’s ambition to ensure children are safe online, and that provisions relating to child protection were excluded from the amendments moved by the Government in the Commons relating to harmful content for adults. However, we are concerned that the move away from a regulatory approach to harmful content for adults increases risks around children and young people accessing misinformation, content relating to self-harm and suicide, or extremist content, given the porous nature of the internet. We would urge the government to engage with child protection experts in identifying what extra steps can be taken to address this.
  • At the Commons re-convened committee stage, government amendments were added to the Bill requiring Ofcom to provide guidance, including examples, to service providers on content they should consider to be harmful to children, as well as what content should be covered by user empowerment tools for adults. The amendments set out that the guidance should specifically include encouragement to suicide, self-harm, eating disorders, abuse or incitement to hatred based on protected characteristics as content that should be covered. The LGA welcomes the creation of this new duty. Ofcom should engage with child protection experts and children themselves in developing this guidance.

Amendments to clauses 10 and 11, tabled by Baroness Harding of Winscombe, Baroness Stowell of Beeston, Lord Knight of Weymouth and Lord Clement-Jones

  • These amendments clarify that application (‘app’) stores are required to undertake the same levels of risk assessment and proactive measures to protect children from encountering harmful material (such as age assurance measures) as user-to-user services, such as social media platforms.

LGA view

  • The LGA supports these amendments. App stores are a primary route through which online users initially access a range of user-to-user services, platforms and other routes to online content. There is therefore a role for providers of app stores to play in ensuring that children do not have access to harmful content or platforms where they could be subject to exploitation or grooming.
  • Many app store providers already undertake risk assessment or age restriction measures in relation to which apps they choose to host. This should be standard good practice and this amendment would provide a double check against children accessing harmful content in many cases, by requiring both app store and service provider to undertake risk assessments and pro-active protective measures.

Amendment tabled by Lord Moylan to leave out subsection (3) at page 10, line 14

  • This amendment would remove the duty on regulated user-to-user services (e.g. social media platforms) that are likely to be accessed by children to introduce proportionate systems and processes (e.g. age verification systems) to prevent children from encountering harmful content on their platform.
  • The amendment would not impact on the requirements in subsection (2) for such platforms to mitigate and manage the risk of harms to children on regulated platforms.

LGA view

  • The LGA opposes this amendment. The Bill represents an important step forward in improving child protection online, and measures such as age verification, where appropriate, are an important tool in that respect. Removing subsection (3) and relying solely on subsection (2) would reduce the requirement on regulated services from one of ‘preventing’ children from accessing harmful content to one of ‘mitigating and managing’. This would substantially weaken the duties in the Bill. We do not believe that the use, where appropriate, of age verification or similar measures to prevent children from accessing harmful material represents an unjustifiable limit on freedom of speech or expression; it is a proportionate approach to the harms children can be subject to online.

New clause to be inserted after Clause 11, tabled by Lord Stevenson of Balmacara

  • This clause would introduce personal liability on senior managers of user-to-user services (e.g. social media sites) in the event of the service failing to fulfil its duties under Clause 11 of the Bill. This covers a wide range of duties, but primarily these relate to having in place proportionate measures to mitigate and manage the risk of harms to children; to have in place proportionate systems and processes (e.g. age verification systems) to prevent children from encountering harmful content; to set out these measures in the terms and conditions of the site, and to apply the measures consistently.
  • The amendment would create an offence of failing to comply with a relevant duty under Clause 11, which can be applied to a senior manager of the entity controlling the user-to-user service. They would be liable to a prison term of up to two years, a fine, or both. Tariffs for these offences would be set separately by the Secretary of State.

LGA view

  • The LGA supports this amendment. The duties set out in Clause 11 are amongst the most important within the Bill in terms of strengthening child protection online, and it is vital that they ‘have teeth’ in their application.
  • Given the size and turnover of many of the entities engaged in the provision of social media and similar platforms, fines levied at a corporate level may not prove a substantial enough deterrent to non-compliance, especially where the new duties under Clause 11 would require significant investment in systems or major amendments to the functioning of a platform. Given the importance of ensuring these duties are consistently fulfilled in order to improve child protection online, it is appropriate in this instance for the potential of personal sanctions against senior officers to be incorporated in the Bill.

New clause to be inserted after Clause 142, tabled by Lord Knight of Weymouth, Baroness Kidron, Baroness Newlove and Baroness Tyler of Enfield

  • This amendment would require Ofcom to establish a new advocacy body for child users of regulated internet services, in order to represent, protect and promote their interests.
  • The advocacy body would have a specific role in reflecting the interests of child users with one or more protected characteristics; to assess emerging threats to child users of digital services; and to publish an annual report on threats to child users.
  • The body would be independent of Ofcom, beyond budget setting responsibilities, and would be formed from a UK-wide organisation or organisations. The body would be required to include representation from young people under the age of 25.

LGA view

  • The LGA supports this amendment. The NSPCC has highlighted the importance of ensuring appropriate user advocacy mechanisms are in place when Ofcom develops its overarching risk assessment and risk profiles, to counterbalance industry influence. The LGA agrees that the voices of all internet users – including children and young people, vulnerable adults and parents and carers – must continue to be heard as different elements of the Bill are put into practice. Only by considering the ‘real world’ impact of online activity – both positive and negative – can we hope to effectively ensure online spaces that allow us to safely harness all the benefits offered by social media and search platforms.

New schedule to be inserted before schedule 8, tabled by Baroness Kidron, Lord Stevenson of Balmacara, the Lord Bishop of Oxford and Lord Bethell

  • This schedule would strengthen the age verification duties on regulated providers of online services.
  • Under this schedule, providers would need to be able to demonstrate that the age verification tools and systems they are employing provide them with a high level of confidence in the age of those accessing the service.
  • The level of confidence required would be proportionate to the level of risk arising from content hosted on the service.

LGA view

  • The LGA supports this new schedule, particularly in relation to sites hosting pornographic content. Ofsted’s 2021 review of sexual abuse in schools and colleges found that leaders were concerned about problems created by children and young people’s easy access to pornography. The review cited evidence that viewing pornography can shape unhealthy attitudes, including sexual aggression towards women, with more frequent consumption associated with victim-blaming attitudes.
  • The Online Safety Bill must introduce robust age verification controls for all commercial providers of online pornography. We regret that the Government is not progressing with Part 3 of the Digital Economy Act 2017, which would provide the option for payment providers to withdraw their services from infringing sites.

Contact

Rollo Maschietto

Public Affairs Support Officer

[email protected]