Clause 31 requires all regulated services to conduct an assessment to determine whether their service is likely to be accessed by children. The 'child user condition' is met if a 'significant number' of children use, or are likely to use, that service. A provider is only entitled to conclude that it is not possible for children to access a service, or a part of it, if there are systems or processes in place (for example, age verification or another means of age assurance). If a service is deemed accessible by children, the provider will need to adhere to a duty to protect children and carry out a harm risk assessment.
Clause 10 requires user-to-user services to conduct children's risk assessments and states that these assessments need to consider factors such as the risk of encountering different kinds of harm, the nature and severity of potential harm encountered, the impact on children in different age groups and children with certain characteristics, the impact of algorithms, the level of functionality of platforms, the design and operation of the service, and the different ways in which the service is used. Clause 25 places a similar duty to conduct a children's risk assessment on search services likely to be accessed by children.
Clause 12 requires category 1 user-to-user services to conduct an adults' risk assessment and states that these assessments need to consider factors such as the user base, the risk of encountering different types of harm and the severity and nature of that harm, the impact on adults with certain characteristics, the impact of algorithms, the level of functionality of platforms, the design and operation of the service, and the different ways in which the service is used.
LGA view
Whilst the LGA broadly welcomes Clause 31, we are concerned about the ‘child user condition’. We ask for further clarity on what is meant by a ‘significant’ number of children so as to ensure children are protected across as many services as possible.
Exposure to risk should reflect stages of child development, so the proposals in the Bill for safety duties and risk assessments to reflect the different needs and vulnerabilities of different age groups are welcome. Learning to manage and respond to risk is an important part of growing up. It is important to ensure that schools, parents, youth workers and others are supported to understand the risks and are able to help children and young people to also understand them and learn to navigate them in all areas of their lives. The LGA therefore also welcomes that media literacy is a distinct consideration within both the children's and adults' risk assessments.
We had previously stated the need for the Bill to take into account cumulative harm. Evidence from both Facebook and the National Society for the Prevention of Cruelty to Children (the "NSPCC") supports this ask in relation to harms experienced by children and young people. Facebook researchers noted that teenagers struggling with the psychological effects of Instagram were struggling to log off the app, feeling 'addicted' but unable to stop themselves from consuming more content. Cumulative misinformation, extremist material or cumulative abuse also has an impact on adult users.
The Bill provides that risk assessments for both children and adults will explicitly cover the nature and severity of harm that children and adults might suffer. The LGA would welcome explicit mention of the cumulative impact of harm in both risk assessments. This is part of a broader call for Ofcom to be clear on services' responsibility to manage 'low-level' abuse and harm directed at children and adults.
Further to the issue of cumulative impact, the OSB does not discuss cross-platform approaches, despite much online harm occurring in this way – for example, young people playing games together on one platform but talking to each other via a separate service, or meeting on a more 'child-friendly' site before moving to another that offers alternative contact options. We therefore welcome that the Government has stated that Ofcom's overall risk assessment will cover risks associated with harms moving across different services subject to the safety duties. Services' own risk assessments will also have to consider this. However, given the cross-platform risk posed to children, we would ask that the Government go further and instruct Ofcom to specifically address cross-platform risks, and place a clear requirement on platforms to co-operate and respond to cross-platform harms when discharging their safety duties.