AI: LGA, iNetwork and Socitm response to call for views on proposed code of practice for AI and cyber security

The Local Government Association (LGA), Society for Innovation, Technology and Modernisation (Socitm) and iNetwork provided a joint response to the Department for Science, Innovation and Technology's (DSIT) call for views on a proposed code of practice for AI and cyber security.


About us

The Local Government Association (LGA) is the national voice of local government. We are a politically led, cross-party membership organisation, representing English councils. Our role is to support, promote and improve local government, and raise national awareness of the work of councils. Our ultimate ambition is to support councils to deliver local solutions to national problems.

The Society for Innovation, Technology and Modernisation (Socitm) is a membership organisation of more than 2,500 digital leaders engaged in the innovation and modernisation of public services. Established more than 30 years ago, our network provides a strong voice, challenges convention, and inspires change to achieve better place-based outcomes for people, businesses, and communities.

iNetwork is a membership-led partnership of local public sector organisations. Established 20 years ago, we currently have over 120 members across the North West and Yorkshire and Humber. We have a strong collective voice, empowered to confront the most pressing challenges in the local public sector and to drive the innovation and change that enhance service delivery for our residents, patients, tenants and service users.

Key messages

  • Coherence and clarity are essential across codes, standards, and regulations to ensure buyer confidence, supplier adherence, and to enable council staff to protect vital services.
  • The proposed code of practice is a welcome step, but its voluntary nature could significantly hinder its potential effectiveness.
  • To successfully support market differentiation of secure suppliers, independent assessment and certification mechanisms would be beneficial to verify supplier compliance in the AI supply chain, particularly for suppliers deemed strategic across the public sector.
  • Councils face significant challenges in harnessing the potential of AI in both procurement and development, due to wide disparities in resources and capabilities, particularly in IT and cyber security.
  • The Cyber Security of AI code aims to create a competitive market for AI providers, but the dominance of large companies in the local government market may hinder responsible uptake of the code.

Introduction and context

The Local Government Association (LGA), iNetwork and Socitm welcome the Department for Science, Innovation and Technology's (DSIT) call for views on the proposed code of practice for AI and cyber security. As DSIT’s approach to codes of practice is modular, so are our responses; this response should be considered alongside our response to the call for views on the software code of practice for vendors.

Local government forms a significant part of the public sector, with £121 billion in annual spend, according to the Local authority revenue expenditure and financing: 2023-24 budget, and a workforce of 1.32 million, second only to the NHS. Local government is responsible for a range of vital services for people and businesses throughout the UK, interacting with every household in Britain at different points of the lifecycle. Services include support for the most vulnerable in our society through adult and children’s social care and housing, as well as schools, licensing, business support, registrar services, and planning.

Councils’ continued digital transformation offers unprecedented opportunities to enhance public services for citizens, and there is early-stage exploration of AI across both business functions and services. However, with increasing digitalisation and exploration of AI, councils are exposed to heightened vulnerabilities and more sophisticated threats. As councils seek to deliver services that are ‘secure by design’, there is rapidly growing concern about the security of the supply chain, and that cyber security practices are failing to keep pace with innovation, introducing risk to the sector. The AI supply chain carries particular risks that are reflected in this response, which builds on our corresponding call for views response to the software code of practice.

With councils at the frontline of public services, it is of paramount importance that their challenges in developing and procuring secure AI technologies are addressed in a coordinated way.  

AI use in local government 

The LGA recently surveyed councils in England to understand current AI usage, future plans, perceived risks and opportunities, and support needs.

AI deployment and exploration 

The survey indicated a positive outlook on AI adoption within the sector, with 85 per cent of respondents indicating that they are either using AI or exploring its potential; only 15 per cent had not begun exploring AI capabilities. This strong interest suggests a sector receptive to innovation and eager to leverage AI. However, cyber security was cited as the biggest risk to AI adoption (85 per cent) amongst councils.
 
Generative AI (70 per cent) was the most common type deployed by councils that responded to the survey, followed by prescriptive (29 per cent), predictive (22 per cent), simulation (2 per cent), and "other" (19 per cent). The majority of applications were for internal council functions like HR, administration, procurement, finance, and cyber security. Adult social care and children's social care were the next most popular areas for AI use. 

Application of AI 

Most respondents (63 per cent) procured AI through external suppliers, while 26 per cent developed tools in-house. Many who develop tools are utilising existing options like Microsoft Copilot Studio or ChatGPT Enterprise to customise products for council use. Therefore, as these tools are provided by AI suppliers, this response will focus predominantly on councils as buyers rather than as developers.

Governance of freely available tools on personal devices (such as ChatGPT 3.5, Gemini and DALL-E) varied. A small minority of respondents (4 per cent) banned these tools on council devices, while most permitted their use with or without limitations. Some councils restricted these tools in favour of corporate options like Microsoft 365 Copilot due to data security and information governance concerns. You can read more about specific use cases of AI in local government on the LGA AI Hub.

Benefits and opportunities 

Staff productivity, service efficiency, and cost savings were the most commonly cited areas where respondents were already experiencing benefits from AI use, and where they see the biggest future potential. However, several councils acknowledged it is still too early to quantify the realised benefits. 

Readiness 

Over half (53 per cent) of respondents felt their council was very or fairly ready to adopt, or continue to adopt, AI in terms of technology (infrastructure, software, and cloud). This was followed by institutional culture (leadership and receptivity to change) at 45 per cent, and policies and procedures (governance frameworks and risk management) at 37 per cent. Workforce skills and expertise were identified as the biggest area of unpreparedness (72 per cent not very or not at all ready), followed by data availability, quality, and storage (68 per cent). The three biggest barriers to AI deployment cited were lack of funding, lack of staff skills, and lack of staff capacity.

Governance 

Risk assessments for AI in local government will vary depending on the specific use case. Even within the same use case, there may be variations across councils due to factors like council culture, local needs assessment, leadership direction, and existing risk matrices. This is evident in the differing approaches to staff access to large language models and other freely available generative AI tools. 
 
Governance approaches for identified risks will depend on council size and existing governance structures. Currently, most councils manage AI use through existing policies (including information governance and data protection), existing boards (like data ethics boards), and measures like appointing a senior responsible owner and focusing on staff training and skills development. Some councils prefer issuing guidance over stricter usage policies. More mature councils have established AI boards for strategic oversight of AI products and applications, ensuring compliance with statutory duties like the Public Sector Equality Duty and UK GDPR. 

Supplier governance 

Many councils are currently auditing existing suppliers to assess AI use. Only 18 per cent of survey respondents had a supplier policy or contract clauses requiring suppliers to declare AI use in service delivery to councils or residents. While 52 per cent lacked such a policy or clause, 26 per cent of those had informal discussions with suppliers during contract management.

The code of practice

Council context and compliance burden 

Councils are at the forefront of delivering public services and handle vast amounts of sensitive resident data. The growing demand for more efficient, modern services, and AI’s potential to deliver them, presents a unique opportunity but also significant challenges. On AI readiness, the LGA survey found workforce to be the biggest barrier to councils implementing AI safely and responsibly. Across 317 councils, there is a wide range of capacities and capabilities. Significant disparities exist in council size, financial resources, and workforce, with larger, upper-tier local authorities employing over 5,000 staff while smaller, lower-tier district councils may have fewer than 500 staff.

These differences are often particularly pronounced in IT and procurement functions. Larger, better-resourced councils often have more capacity and capability to design, implement and monitor cyber security controls in supply chains, and/or have dedicated IT procurement teams. Smaller, less well-resourced authorities may lack the specialist expertise and capacity to develop and implement new processes. This variance in capacity and capability must be considered if the code is implemented.

Approaches to cyber security, particularly the specialised requirements for different forms of technology that DSIT is developing, must support councils’ overall cyber resilience efforts and complement existing requirements rather than adding burdens without support. This point is elaborated in our software code of practice response, particularly in relation to the LGA’s advocacy and support for the Local Government Cyber Assessment Framework (LGCAF) to reduce the compliance burden councils currently experience.

Coherence and clarity across different voluntary codes, standards and regulation 

DSIT frames this code of practice and others as ‘stepping stones towards either further tailored guidance, or towards firmer interventions such as international standards or domestic regulation, if necessary’.

It is therefore vital that there is coherence across regulatory plans already underway, and that the code is cognisant of the guidance that already exists. Clarity and coherence are vital to achieve buyer confidence and supplier adherence. 

Cyber security and public trust are inherently linked, and the rationale for the code includes increasing users’ confidence in AI. Currently, councils are navigating a cluttered and confusing landscape of codes and standards aimed at improving security in the software supply chain. This confusion can compromise the trust and assurance that councils require in the suppliers of software to vital services. Councils have been calling for one standard that clearly articulates good practice and expectations of software suppliers, which they can draw on to protect vital council services; this is elaborated on in our software code of practice consultation response.

In a bid to ensure that their approach to AI is as secure, safe and ethical as possible, councils are already navigating a multitude of guidance produced by Government and regulators, as well as by leading research institutions such as The Alan Turing Institute. The AI assurance ecosystem is developing, and that development is welcome given the specific risks that AI introduces. For example, we are aware of an AI Management Essentials scheme currently under development by the Responsible Technology Adoption Unit in DSIT. We are also analysing the plans that regulators have been asked to provide to Government on how they are implementing the pro-innovation approach principles in their regulatory practice; one of those principles is ‘safety, security and robustness’. Given that this code deals exclusively with security, there is a concern that accountability mechanisms for AI will be duplicative or conflicting, creating confusion that will undermine the objectives of the code.

As outlined in an earlier response to a Parliamentary inquiry into the use of AI across government, local government must be formally represented on public sector-wide strategic boards that oversee the development of UK-wide governance and regulation, and councils must be consulted in the development of guidance. Local government needs its priorities and context to be understood and integrated into public sector guidance, reducing the likelihood of duplication that leads to unnecessary public expense.

External verification or auditing 

DSIT must consider how it incorporates independent assessment and certification mechanisms to verify supplier compliance in the AI supply chain. The AI Safety Institute was established to test the safety of advanced AI and measure its impacts on people and society. Whilst important, advanced AI is not what is being used and explored in councils currently, and reliance solely on supplier self-attestation is insufficient.  

In a previous call for views on software resilience, and in our recent code of practice response, we called for an enhanced role for the Crown Commercial Service and other Public Buying Organisations (PBOs) in assessing suppliers on behalf of buyers purchasing through their frameworks. This would reduce the duplication of multiple departments and councils each assessing suppliers separately. It would therefore strengthen the security of councils with less capacity and fewer resources to do so, and would make the best use of the auditing and verification expertise that exists.

We understand that AI auditing capacity in the UK is currently low, and that DSIT is considering how to develop this career pipeline further. This makes it all the more crucial that the auditing expertise that does exist is used economically by the public sector, including for the most widely used forms of AI, and that auditors have a high degree of independence from the suppliers they assess.

Voluntary nature

While the proposed code of practice is a welcome step, its voluntary nature could significantly hinder its potential effectiveness. To ensure meaningful impact, we urge the government to: 

  • strengthen its proposed approach, particularly in high-risk service areas
  • consider different tools at its disposal, including procurement levers, capacity building and training
  • create a mandatory standard
  • consider limiting barriers to entry for SMEs, and recognise councils’ role in fostering SME markets

Local government market dynamics and AI market dominance 

An objective underpinning the AI and cyber security code of practice is to ‘create a market ecosystem where AI supply chain stakeholders are looking to use security as a differentiator between their competitors. A technical global standard will help enhance an organisation’s reputation and drive better practices across the industry.’ However, there is a well-documented risk that AI provision is set to be dominated by a very small number of global technology companies.

AI provision here spans a broad range of technologies: perceptive AI, such as systems that recognise faces and fingerprints or analyse images, audio or video (for example, in the analysis of consultation responses or in identifying car registration plates to prevent fly-tipping); sensing AI, such as remote or continuous sensing through smart sensors; predictive AI, such as systems that predict an outcome for an individual or assign people to an appropriate service or system (for example, predicting an outcome in services or assigning an adult social care treatment pathway); generative AI, such as systems that generate text or images, like ChatGPT and DALL-E; and simulation AI, such as digital twins and agent-based modelling.

Market dominance and a lack of competition can lead to high costs with limited value; significant costs related to data-sharing access, such as APIs; a lack of improvements to the technology and service; and unsatisfactory security practices. It also leaves councils with less buying power to use security as a differentiator. (For published examples of cost savings, see Swindon’s AWS case study on translation services and the development of their easy-read product; on staff productivity, see Microsoft’s case study with Barnsley on Microsoft 365 Copilot.)

There has been significant up-selling and marketing of AI products towards local government over the last year. With insufficient skills across the sector and an immature AI assurance ecosystem, councils have faced challenges in identifying genuinely beneficial products and in assuring the security and trustworthiness of products and vendors. This is mirrored in, and compounded by, the software market challenges that exist in specific local government service areas, as outlined in our software code of practice response.

To address concerns regarding market dominance by a handful of global companies, more must be done to foster competition, particularly for SMEs, and local government is considered a vital vehicle for making this happen. The Procurement Act 2023 provides a unique opportunity to support the work of SMEs and to act preventatively against market dominance.

This code of practice presents an opportunity to mature the security assurance ecosystem, provided it is strengthened, implemented with local government in mind, and coherent with other initiatives across Government. Beyond standards and codes of practice, work is ongoing in the CDDO to review and update its list of strategic suppliers. This aims to ensure that local government needs and priorities shape engagement with suppliers serving both central and local government, and that local government-specific suppliers are included in that engagement. Such strategic supplier engagement will be instrumental in driving adherence to this code through the coordinated buying power of the UK’s public sector. However, local government needs and context must continue to inform this initiative, and councils must derive value from innovation funding achieved through strategic relationships.

Workforce and skills

The effective management of AI risks within local government is heavily reliant on the skills and knowledge of its workforce. A significant challenge lies in upskilling staff to understand and mitigate the unique cyber security implications of AI technologies. This is true for IT and procurement professionals, as well as for the wider workforce. Many councils already undertake regular cyber security training with staff, and consideration must be given to how this can be updated to incorporate AI-specific risks, with specialised and priority focus given to high-risk service areas where AI is being explored. The diverse roles within local government and varying sector capabilities require tailored guidance to ensure consistent interpretation and application of AI-related regulations and best practices.

It is crucial that investment is made to equip public sector workforces with the necessary skills and knowledge to manage and utilise AI effectively. This includes being able to evaluate the ethical and privacy considerations of its use, and to implement and challenge AI effectively. Support could include online courses, workshops, or dedicated AI certifications for staff. The training available to civil servants should be made available to all public sector workers, including council staff.

There needs to be long-term workforce planning and investment in digital and technology practitioners within the public sector; this could save the considerable sums spent on consultants each year. Access to independent technologist support, particularly from academia, to provide impartial advice and guidance to councils would be invaluable. The LGA and its partners are currently building communities of practice and opportunities for knowledge sharing within the sector, which is key to ensuring good practice and lessons are shared among council officers and members seeking to securely exploit the opportunities AI provides across the public sector.

Contact

LGA: Jenny McEneaney 

Senior Improvement Policy Adviser: Cyber, Digital, and Technology 

LGA: Grace Perks 

Programme Support Coordinator: Cyber, Digital, and Technology 

iNetwork: Shelley Heckman 

Partnership Director 

Socitm: Martin Ferguson 

Director of Policy & Research