A pro-innovation approach to AI regulation: government response – policy brief

This brief allows you to get a quick understanding of the Department for Science, Innovation & Technology's response to the AI White Paper consultation and what it means for local government.


In March 2023, the UK Government published its White Paper consultation outlining its approach to regulating Artificial Intelligence (AI). Following a period of consultation, the government recently published its response (February 2024), outlining its “pro-innovation” approach to AI regulation.

The LGA, alongside the Society of Local Authority Chief Executives and Senior Managers (Solace) and the Society for Innovation, Technology and Modernisation (Socitm), submitted a joint response to the white paper, setting out key messages from local government based on continued sector engagement. The response highlighted the sector's cautious optimism and emphasised the importance of mitigating the range of risks identified at a local level. It also underlined the need to support local government in adopting AI technologies, so that councils are not left behind in the ongoing AI revolution.

The LGA’s response was amongst 466 individual responses which have informed the latest government update on the pro-innovation approach to AI regulation.
Summary of a pro-innovation approach

The AI market is evolving at a rapid pace, with the market expected to grow by more than $1 trillion by 2035. The government's pro-innovation approach positions the UK as a leader in the adoption and advancement of AI technologies to revolutionise public services, and seeks to balance innovation with safety so that progress is made efficiently and securely.

The response, which is largely unchanged from the initial white paper, outlines a non-statutory approach to regulation based on a cross-sector set of principles. It follows the world's first AI Safety Summit, which brought together 28 of the leading nations in the development and adoption of AI. The five core principles are:

  • Safety, security, and robustness.
  • Appropriate transparency and explainability.
  • Fairness.
  • Accountability and governance.
  • Contestability and redress.

The principles form the main basis of the government's approach, acting as a guide to be adapted and adopted across all sectors of the economy, and promote the responsible development and use of AI. As part of the pro-innovation approach, the government has asked several key regulators to publish plans, by April 2024, setting out how they intend to manage AI risks and opportunities in line with the principles-based framework and existing laws.

Alongside encouraging regulators to adopt the framework, the government has announced a support plan to increase AI capabilities and manage risk, helping key sectors understand how to operationalise the principles. It is doing this in several ways:

  • Risk assessment: A new multi-disciplinary team within DSIT to monitor AI risks.
  • Regulator capability: £10 million of funding for regulators to develop skills and enhance capabilities.
  • Regulator powers: A review of regulatory powers to ensure regulators can appropriately address the risks and opportunities of AI.
  • Coordination: Government-led steering committee to support the development of AI governance.
  • Research and innovation: Working closely with UK Research and Innovation (UKRI) to support in depth research into AI, specifically around AI safety. 
  • Ease of compliance: Supporting businesses to get AI products to market through a specific advisory board. 
  • Public trust: Further work on AI assurance and standards. 
  • Monitoring and evaluation: Developing a monitoring and evaluation plan to assess the regulatory framework, proposed for spring 2024.

The strategy also focuses on aiding businesses by utilising the potential of the Digital Markets, Competition, and Consumers Bill, which is currently making progress through Parliament. This proposed law will equip the Competition and Markets Authority (CMA) with extra tools to pinpoint and resolve any competition concerns relating to AI, thus enabling SMEs to gain a foothold in the market.

Additionally, through the work of the AI Safety Institute, the government is engaging with some of the largest AI developers, such as OpenAI and Google DeepMind, and has introduced voluntary measures aimed at enhancing transparency. These efforts seek to build public trust and facilitate the sustainable development of "highly capable general-purpose AI systems."

What does this mean for Local Government?

The primary goal of the pro-innovation approach is to effectively address the inherent risks associated with AI. The outlined principles delineate clear, outcome-focused objectives aimed at ensuring the safe deployment of AI technologies. 

The interpretation and implementation of these principles by various regulators will influence multiple service areas within local government. The government has set a tight deadline of April 2024 for regulators to publish how they intend to apply the principles in their areas, highlighting the urgency of the matter. These publications will provide valuable clarity on the AI-related risk landscape identified by regulators.

The full extent of the impact on local government remains to be seen; however, it is evident that the principles are intended to encompass "all sectors," including local government. There is significant potential for tailoring these principles to specific services, especially in areas of heightened concern such as procurement.

This marks the initial phase of AI regulation, necessitating a flexible and adaptive policy approach due to the rapid pace of AI technological advancement. The approach states the government "will not rush to regulate". In the meantime, it will utilise bills already in motion, such as the Data Protection and Digital Information Bill and the Digital Markets, Competition and Consumers Bill, to support the regulatory environment on key issues.

Nevertheless, this does not preclude the possibility of future legislation and stricter regulation, akin to the EU AI Act, or the establishment of a dedicated AI regulator. The approach already hints at potential measures going beyond voluntary agreements with AI developers in the near future.

The LGA will continue to monitor the policy landscape and analyse the impact on local government.