- The Local Government Association (LGA) supports the overall aims of the Online Safety Bill (OSB), which makes provisions for the regulation by Ofcom of certain internet services. The regulation proposed in this Bill is aimed at ensuring platforms have systems and processes in place to deal with illegal and harmful content and the associated risks. The Bill primarily does this by imposing duties of care on certain user-to-user services and search engines. The Bill also imposes duties on such providers in relation to the protection of users’ rights to freedom of expression and privacy.
- Under the provisions of the Bill, all regulated services will have a duty of care in relation to illegal content and if services are deemed accessible by children, a duty to protect children from harm. Further to this, the Bill sets out that regulated services will be categorised into three categories within which different duties will apply. For example, providers of user-to-user services which meet a specified threshold (“Category 1 services”) are subject to additional duties. Further clarity is needed as to how services will be categorised.
- The central premise of the Bill is to protect users from harm and associated risk. Whilst the Bill provides a definition of harm and sets out how harmful content will be prioritised for child and adult users, it remains unclear what types of harm platforms will need to protect users against. We urge the Government to set out their proposed categories of harmful content to children and adults as soon as possible and consult with independent experts to define and prioritise harmful content. Ofcom must also be given adequate resources so that it can be agile and produce guidance at pace in line with emerging ‘harmful’ issues.
- The LGA recognises the delicate balance this legislation must maintain between preserving users’ freedom of expression and civil liberties whilst also protecting users from harmful content. We therefore welcome the user verification and user empowerment duties within this Bill that apply to category 1 services. The LGA welcomes these duties as they give adult users choice over what content and users they engage with, whilst also allowing users to remain anonymous should they want or need to.
- The OSB introduces four new criminal offences: a harm-based communication offence, a false communication offence, a threatening communication offence and a cyber-flashing offence. The LGA previously called for cyber-flashing to be made a criminal offence, so we welcome its inclusion within the Bill. Overall, these offences are a useful provision to ensure that individuals posting harmful content are held to account. However, their effectiveness will depend on the police and CPS being given adequate resources, training, and comprehensive guidance to ensure these offences are used appropriately.
- To ensure services adhere to their new responsibilities, the Bill introduces new regulatory powers and responsibilities for Ofcom. Ofcom will be responsible for drafting codes of practice for all duties and ensuring services have the systems in place to adhere to these responsibilities; it will also have powers to hold services to account should it need to. We ask Ofcom to engage fully with relevant groups such as political parties and the LGA when developing its codes of practice to ensure there is consideration of unintended consequences.
- The LGA supported the Draft Online Safety Bill Joint Committee’s recommendation calling for Ofcom to publish a ‘safety by design’ code of practice. It is disappointing that this has not been adopted; the LGA encourages the Government to produce an overarching ‘safety by design’ code of practice that can be referenced and adopted within the individual codes of practice.
- The LGA broadly welcomes the new harm-based, threatening, and false communication offences, as well as the user empowerment and verification duties that will enable users to control what content and users they interact with. However, we encourage the Government and Ofcom to go further and adopt clearer and more robust provisions to manage ‘low-level’ abuse experienced by councillors that falls below the criminal threshold. As part of this, the LGA would like assurances from the Government that the democratic and journalistic protections set out in this Bill will not inadvertently protect perpetrators of abuse.
- Councillors are experiencing increasing levels of online intimidation, abuse and threats made against them, which can prevent elected members from representing the communities they serve and undermine public trust in democratic processes. We hope this Bill will go some way in addressing the concerns we have heard from our membership.
The Online Safety Bill was published on 17 March 2022. The explanatory notes to the Bill set out the case for a new regulatory framework for internet services.
Many of these proposals were set out in the Draft Online Safety Bill and had already been subject to consultation, following the Government’s Online Harms White Paper and scrutiny from the Draft Online Safety Bill Joint Committee.
This Bill is of significant interest to councils, covering a wide range of issues from child protection and public health issues to abuse and intimidation and free speech. The wide-ranging nature of the Bill, and the significant role of the internet in the lives of most residents, means there are likely to be additional issues of importance for councils beyond the scope of this initial briefing.
This briefing will cover the LGA’s views on the Bill on issues relevant to local communities and councils.
Categorisation of internet services
- All online services will be designated as category 1, 2A or 2B services; their category will depend on the number of users (size) and the functionalities of that service. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
- Crucially, only services that are ‘user-to-user’ services (an internet service which allows users to generate, upload or share content) and meet category 1 thresholds will be subject to additional duties. The Government has suggested that category 1 platforms will be those that are the highest risk and with the largest user-bases, such as the main social media platforms.
LGA view on categorisation of internet services
- The LGA is concerned that many smaller user-to-user platforms could be left out of scope of category 1 requirements, even if they host large volumes of harmful material.
- A recent academic study conceptualising “dark platforms” looked at how Covid-19-related content, especially conspiracy theories, was communicated on less-regulated services such as 8Kun. It found that digital platforms subject to less regulation can be used to host content, and content creators, that would not be tolerated by more mainstream platforms. As currently drafted, there is a risk that the Bill could inadvertently push harmful content onto these sites.
- We therefore urge the Government to set out as soon as possible what companies will fall into which category and reconsider their approach to categorising services. Instead, the Government should take a ‘risk-based’ approach to categorising services to ensure that all platforms with a high-risk profile, including smaller platforms, fall within the scope of category 1.
- The Government have committed to undertaking further research to determine whether there is sufficient evidence to expand the duties on small but risky platforms. Whilst this step is welcome, the LGA believes the evidence already exists to regulate smaller, higher-risk platforms more thoroughly.
- This amendment would enable Ofcom to categorise services as category 1 based on their assessment that they pose a very high risk of harm, regardless of their number of users.
LGA view on Amendment 159, tabled by Kirsty Blackman MP
- We support this amendment. The LGA welcomed the Minister’s reassurances during Second Reading of the Bill that companies can move between categories, and that different parts of a large conglomerate can be regulated differently depending on their activities. However, we remain concerned that smaller platforms can still host significant quantities of high-risk material and be a source of harassment and intimidation for councillors, and that a risk-based approach to categorisation is more appropriate than one based in large part on user-base size.
- The Government has published draft amendments for consideration at a re-convened public bill committee stage to ensure Ofcom can identify and publish a list of companies that are close to the Category 1 thresholds. This is to ensure Ofcom proactively identifies emerging risky companies and is ready to assess and add these companies to the Category 1 register without delay. The LGA strongly welcomes the publication of these draft amendments.
- More broadly, we urge Government to ensure Ofcom has adequate resources in order to respond quickly to both emerging harms and new platforms.
Protecting children online
- Councils have a duty under the Children Act 2004 to work with local police and health partners to safeguard and promote the welfare of children in their area. Child exploitation and grooming is a serious and growing crime. While the exploitation of children by criminals has, sadly, been happening for a long time, the risks to children and young people continue to increase, including as a result of criminals using online spaces to meet, groom and exploit their victims.
- According to a freedom of information request from the National Society for the Prevention of Cruelty to Children (the “NSPCC”), online grooming offences reached record levels in 2020/21, with the number of sexual communication with a child offences in England and Wales increasing by almost 70 per cent in three years.
- The LGA strongly welcomes the Government’s ambition to ensure children are safe online; however, there are a number of ways this Bill can be strengthened to ensure it effectively tackles the range of ways in which abusers use social networks.
- This amendment would establish an advocacy body to represent, protect and promote the interests of child users of regulated digital services.
- The body would be a statutory consultee for Ofcom decisions under the powers granted in the Online Safety Act which impact on the interests of children.
- The body could be either an existing organisation or a newly created body, at the discretion of the Secretary of State.
LGA view on Amendment NC28, tabled by Kirsty Blackman MP
- We support this amendment. The voices of all internet users – including children and young people, vulnerable adults, and parents and carers – must continue to be heard as different elements of the Bill are put into practice. Only by considering the ‘real world’ impact of online activity – both positive and negative – can we hope to effectively ensure online spaces that allow us to safely harness all the benefits offered by social media and search platforms.
Definitions of harm
- The central premise of the Bill is to protect users from harm and associated risk. Whilst the Bill provides a definition of harm and sets out how harmful content will be prioritised for child and adult users, it remains unclear what specific harms platforms will need to protect users against.
- The LGA believes that content that encourages, promotes, or instructs users in harmful behaviour should be considered harmful content within the Bill. We urge the Government to set out during the passage of the Bill how harms will be designated into these categories for both child and adult users.
- The amendment creates a new offence under the Suicide Act 1961 of encouraging or assisting self-harm.
- An offence would be committed if a communication is sent encouraging another person to inflict serious physical harm on themselves, whether or not that person goes on to attempt such self-harm.
- Someone convicted of such an offence is liable to imprisonment for up to five years.
LGA view on Amendment NC16, tabled by David Davis MP
- We support this amendment. The impact of certain types of content on the mental health of some users is of concern to councils. This includes the mental health of elected officials, vulnerable adults and children, as well as all users who can be affected by harmful content online. In extreme cases this can include encouragement to self-harm, and it is appropriate that such encouragement be made a specific offence.