The bill introduces regulatory changes that should make it easier to use ADM. It seeks to reduce the scope of application of Article 22 of the UK GDPR, allowing wider use of ADM in all cases except those involving special category data. The bill also puts in place associated safeguards to reduce the risks to individuals.
Under clause 80 of the bill, a new Article 22A is introduced. ADM that results in either a legal or similarly significant effect will no longer be restricted; instead, it will be permitted under the lawful grounds outlined in Article 22 of the UK GDPR. The article defines ADM as a decision “based solely on automated processing” where there is “no meaningful human involvement in the taking of the decision”. A decision would qualify as a “significant decision” if it produced “a legal effect for the data subject” or had a “similarly significant effect for the data subject”. New Article 22D would give the Secretary of State the power to make regulations describing cases that are or are not to be taken to have meaningful human involvement, or what is or is not to be taken as a significant decision.

The principles of fairness and transparency will continue to apply to the use of ADM as outlined in data protection requirements, and where organisations rely on the legitimate interests lawful basis for processing, they will still need to demonstrate the impact on the rights and freedoms of individuals whose data is being processed.
LGA view
Clause 80 of the bill is expected to facilitate the greater use of ADM in councils, alleviating some of the data restrictions that prevent wider AI use and the resource-intensive processes associated with current compliance. The changes present opportunities for transformative AI and digital solutions to be embedded in public services, allowing councils across the country to support the government’s AI Opportunities Action Plan. However, as outlined in our response to the government’s AI Opportunities Action Plan, local government needs support, and to be valued as a key partner, to innovate further and be part of shaping the AI future.
To adopt ADM and AI safely, councils need clarity on what constitutes compliance with data protection law and the public sector equality duty. Councils are independent political bodies that set the level of scrutiny and accountability needed for their local context. Many councils are establishing governance frameworks to embed ethics and define the level of human involvement expected. Clearer direction should include allowance for individual council discretion, so that councils can adapt the level of human involvement to match their risk appetite, depending on the use case and the council’s assessment of local need.
There are concerns about the Secretary of State for Science, Innovation and Technology being able to legally change what constitutes ‘human involvement’. If any changes are made, the LGA would need to consider each individually and assess its potential impact on local authorities. Where necessary, the LGA will work with the Department for Science, Innovation and Technology (DSIT) to raise concerns.
As outlined in the AI Opportunities Action Plan, the government aims to enable safe and trusted AI development and adoption through regulation, safety and assurance. Regulation, safety and assurance need to be coherent and coordinated to ensure that councils can confidently deploy AI and retain the trust of their communities; political discretion may not provide that clarity. The LGA supports a cross-societal Responsible AI Advisory panel, as outlined in the Digital Blueprint, and looks forward to working with DSIT to ensure that local government is appropriately represented.
A number of significant data rights and advocacy groups have raised concerns about the implications of clause 80. There is a risk that a loss of trust in AI and ADM technologies, driven by concerns about human rights, equity and justice, would significantly damage trust in councils, trust in decision-making, and councils’ ability to deliver public services that respond to local needs.
The bill’s widening of the permitted use of personal data in automated decision-making technologies could, if it is passed in its current form, make it easier for organisations to leverage AI systems, and could therefore result in AI products coming to market more quickly in the UK than in the EU. However, councils are already struggling to assess the safety and trustworthiness of private sector tools, and need significant support to be able to responsibly procure and deploy the AI tools coming onto the market. It is vital that there are more opportunities for market shaping and pre-procurement engagement between the public sector and innovators.
We would welcome clear direction, and for the regulator or government to undertake coordinated assurance of key suppliers to ensure that developers are adhering to their legal duties in the development and training of models and are integrating meaningful human oversight at all stages of deployment. There is a significant compliance burden on councils in assuring meaningful oversight, made more challenging by the significant power imbalance and the information and capability asymmetries that often exist between councils and technology companies. Undertaking this centrally, in a coordinated way across the public sector, would address the compliance burden and provide greater assurance for councils.