
State of the sector: Artificial intelligence

In February 2024, the Local Government Association (LGA) launched a survey to explore the landscape of artificial intelligence (AI) deployment within local authorities. The insights gathered will inform the LGA's ongoing engagement, research, and advocacy efforts, ultimately enabling us to better support local authorities in the safe deployment of AI.

Full report


In February 2024 the Local Government Association conducted a survey to explore the use of artificial intelligence (AI) in English councils. Its purpose was to build a picture of where AI is currently deployed in local services and council business units, to map where the greatest opportunities and risks lie, to build an evidence base for the LGA's support to councils in this space, and to ensure that local government is part of the national conversation.

The survey used the Government’s definition of AI (November 2023, Introducing the AI Safety Institute):

The theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. Modern AI is usually built using machine learning algorithms. The algorithms find complex patterns in data which can be used to form rules.

It also defined four types of AI, using information provided by the Alan Turing Institute:

  • Perceptive AI, such as systems that recognise faces and fingerprints, or analyse images, audio or video, for example in the analysis of consultation responses or the identification of car registration plates in the prevention of fly-tipping. This includes sensing AI, such as remote or continuous sensing through smart sensors.
  • Predictive AI, such as systems that try to predict an outcome for an individual, or to assign people to an appropriate service or system, for example predicting an outcome in services or assigning an adult social care treatment pathway.
  • Generative AI, such as systems that generate text or images, for example ChatGPT and DALL·E.
  • Simulation AI, such as digital twins and agent-based modelling.

Key findings

  • Responses were received from almost a quarter (23 per cent) of councils. This may reflect the level of AI usage among councils, with those not using it preferring not to take part in the survey. As such, the results of the survey should not be taken to be more widely representative of the views of all councils. Rather, they are a snapshot of the views of this particular group of respondents.
  • Most respondents (85 per cent) reported that they were using or exploring AI: half (51 per cent) were at the beginning of their AI journey, 16 per cent were developing their AI capacity and capabilities, 14 per cent were making some use of AI, and 4 per cent were innovative and considered leaders among councils in their use of AI.
  • Among respondents who were using or exploring AI, the most commonly adopted type was generative AI (systems that generate text or images), used by 70 per cent. This was followed by perceptive AI (systems that recognise faces or analyse images, audio or video), adopted by 29 per cent, and predictive AI (systems that try to make a prediction about an outcome), used by 22 per cent.
  • The functions where respondents using or exploring AI had most commonly used it were corporate council use: HR, administration (meeting minutes), procurement, finance, cyber security (85 per cent), followed by health and social care (adults) (35 per cent) and health and social care (children’s) (31 per cent).
  • Almost two-thirds (63 per cent) of the respondents using or exploring AI were paying external suppliers for the provision of AI tools or technologies, or were in the process of procuring them.
  • The areas where most respondents had realised benefits from using AI were staff productivity (35 per cent), service efficiencies (32 per cent) and cost savings (22 per cent). The areas where respondents saw the greatest AI opportunities were corporate council use: HR, administration (meeting minutes), procurement, finance, cyber security (85 per cent), followed by health and social care (adults) (43 per cent) and advice and benefits (36 per cent).
  • The five biggest barriers to deploying AI identified by respondents were lack of funding (64 per cent), lack of staff capabilities (53 per cent), lack of staff capacity (50 per cent), and lack of sufficient governance (including AI policy) and lack of clear use cases (41 per cent each).
  • Among respondents using or procuring external suppliers for the provision of AI tools or technologies, three-quarters (75 per cent) identified ‘project scoping: understanding where AI can add value’ as a barrier to a great or moderate extent, followed by ‘evaluation: understanding how to evaluate solutions’ (65 per cent) and ‘market intelligence: understanding who is a trusted partner’ (63 per cent).
  • The issues most commonly considered to represent a great or moderate AI risk were cyber security (81 per cent), organisational reputation and resident trust (75 per cent), and deep fakes and disinformation (69 per cent).
  • Two-thirds (65 per cent) of respondents were using their existing policies to manage AI risk. Existing boards, a senior responsible owner, and staff training and skills development were each being used by 31 per cent of respondents.
