
Mental health performance, data and insight: Guidance for directors of adult social services

This guidance document has been developed by Partners in Care and Health (PCH) working in collaboration with the Association of Directors of Adult Social Services (ADASS) Mental Health Network. Its purpose is to provide practical guidance for directors of adult social services (DASSs), ensuring insight into, and understanding of, their duties and wider system performance in relation to adult mental health services.

Background and introduction

The purpose of this guidance 

We developed this guidance to provide a practical resource for directors of adult social services and their teams, setting out the key features required of a mental health performance report. This includes providing:

  • sharable good practice
  • guidance on setting the foundations for good mental health performance reporting
  • guidance on implementing a performance reporting cycle.

This document does not include detailed guidance on monitoring performance of approved mental health professional (AMHP) services, as a national AMHP dataset is currently in development through a separate project to support councils with monitoring and reporting.

How this guidance was developed

This guidance document was developed following an extensive period of research and insight gathering including:

  • desktop research to identify good practice
  • semi-structured interviews with mental health leads from 12 councils
  • engagement with key leads from the Department of Health and Social Care (DHSC) and the Association of Directors of Adult Social Services (ADASS)
  • stakeholder workshops with council and PCH leads on mental health.

How this guidance should be used

This guidance has been designed to support councils to work through what good looks like in mental health performance reporting, from the foundations for good reporting through to embedding good reporting into governance and processes.

The key features of this process are summarised in the sections that follow.

Context

Statutory duties for directors of adult social services

General duties which relate to mental health performance

The DASS Guide produced by the Association of Directors of Adult Social Services (ADASS) outlines the statutory requirements of the role of director of adult social services (DASS), including those relating to adult mental health services. It also recognises that good performance information is vital for assessing local need and ensuring the services in place are delivering for residents.

There are some key areas of DASS responsibility that link to mental health performance:

  • Working with people: including responsibilities around early intervention, co-production, social inclusion and wellbeing
  • Providing support: workforce and skills, commissioning and partnership with the NHS
  • Ensuring safety: safeguarding and protection, managing risk
  • Leadership: effective financial management, data and performance.

Service responsibilities relevant to mental health performance

DASSs have general service responsibilities that cover all aspects of social care, all of which are relevant to mental health. These relate to prevention, information and advice, appropriate assessment of need and means, and meeting essential needs for care and support.

Specific mental health duties

DASSs also have specific legislative responsibilities relating to key mental health legislation, including the Mental Health Act 1983 (MHA) and the Mental Capacity Act 2005. This means ensuring arrangements are in place for sufficient AMHPs to be available to undertake Mental Health Act work on a 24/7 basis, as well as for court-appointed deputyship, Court of Protection work, guardianship, detention and after-care.

Mental health performance and assurance on statutory duties

Effectively measuring mental health performance can provide assurance on whether the DASS duties outlined above are being met. Good quality performance reporting can offer assurance across five key areas.


Laying the foundations for good mental health performance reporting

The foundations for good mental health performance reporting

Six key foundations for good mental health performance reporting have been identified.

  • Vision and understanding: a shared vision for what good looks like and understanding of duties across the system.
  • Governance processes to enable data sharing, reporting and escalation of risks and issues.
  • Understanding of ASC statutory duties, to ascertain whether duties are being met, especially when services are delivered via a section 75 (s75) agreement with a health trust.
  • Practice: understanding whether social workers (and health staff, where services are delivered under a s75 agreement) are delivering Care Act and MHA compliant assessments and support, so that it is clear whether data reporting issues stem from practice or from data recording.
  • Information recording: how effectively information is recorded on systems and whether system users understand what needs to be reported from systems.
  • Recording systems that allow visibility or interoperability for partners and enable effective performance reporting.

Assessing whether the foundations are in place

To help identify the extent to which the foundations for good mental health performance reporting are in place across your local system, a diagnostic tool has been developed to sit alongside this guidance document.

Download the diagnostic tool 

Self-assessment of current arrangements against each of these areas can be used to determine where the council sits relative to the ‘what good looks like’ statements and to highlight priority areas for focus, using those statements as a guide to the steps needed to develop or enhance the foundations.

Developing a good mental health performance report: the framework

Why develop a mental health performance framework?

Developing a mental health performance framework will provide a structure to build performance reporting around, ensure performance reporting is in line with key local and national performance indicators and provide assurance that services are working for residents. It should also provide a view of how services are working across the local system – this is particularly important to understand the impact of the complex range of mental health services and support on offer.

Outcomes to build a framework around

A key starting point for a mental health performance framework is the identification of the outcomes that are important for the local mental health system. These could be national outcomes, locally identified outcomes or a combination of both agreed with system partners. Some examples of outcomes to build performance reporting around include:

  • nationally set ASCOF measures
  • CQC outcomes framework measures
  • outcomes measures identified through co-production with people who use services, for example the ‘what does good mental health care look and feel like’ statements from the Centre for Mental Health
  • strategic outcomes from the local adult social care strategy or plan
  • strategic outcomes from the local integrated care strategy
  • local improvement or transformation key performance indicators (KPIs), for example, increased uptake of Direct Payments, rate of uptake of aftercare, improvement in sustained employment.

Case study example: Sheffield City Council has developed a comprehensive performance framework for adult social care services using a combination of different outcome measures.

Mapping performance measures to the CQC framework

The CQC quality statements listed below, grouped by assessment theme, provide a structure against which local performance metrics can be mapped. This will support the development of a performance framework that aligns with the adult social care CQC assessment themes.

Theme 1: Working with people - CQC quality statements

  • Assessing needs
  • Helping people to live healthier lives
  • Equity in experience and outcomes

Theme 2: Providing support - CQC quality statements

  • Care provision, integration and continuity
  • Partnerships and communities

Theme 3: Ensuring safety - CQC quality statements

  • Safe systems, pathways and transitions
  • Safeguarding

Theme 4: Leadership - CQC quality statements

  • Governance, management and sustainability
  • Learning, improvement and innovation

Developing a good mental health performance report: the fundamentals

Data sources

To report on the fundamentals, the following data sources are likely to be available locally and will be most useful:

  • AMHP data
  • NHS Trust data (if delivering through s75)
  • ASC service data
  • Early intervention, early help and/or prevention service data
  • Safeguarding data
  • Complaints and compliments data
  • Feedback from people who have used services
  • Findings from practice reviews

Measures and metrics

The core measures and metrics that should be included in a mental health performance report are as follows.

Numbers of people using services

  • The number of referrals for support including source of referral
  • The number of referrals for Care Act and MHA assessment
  • The number of Care Act and MHA assessments completed
  • The number of people detained under the MHA
  • The number of people under s117 arrangements

Outcomes for people using services

  • The outcomes of referrals for support (for example, Care Act assessment referral, signposting – number and percentage of each outcome)
  • The outcomes of Care Act and MHA assessments (for example, No further action, service put in place, discharged to GP – number and percentage of each outcome)
  • The outcomes of reviews (for example, reduced care package, maintained care package – number and percentage of each outcome)
  • The time taken to complete Care Act and MHA assessments
  • The time taken to complete reviews
  • The number of overdue reviews (for example, reviews not completed by planned review date)
  • Performance to KPIs (for example, ASCOF)

Experiences of people who use services

  • The number of safeguarding concerns raised that relate to mental health, and number that required enquiry
  • The number of complaints received that relate to mental health services and number that were upheld
  • The number of compliments received that relate to mental health services
  • Qualitative measures of the experiences of people who have used services including case studies, journey maps and outputs from feedback surveys. These could be mapped to the outcome statements for ‘what good mental health care looks and feels like’ from the Centre for Mental Health.

Some illustrative examples of these measures and metrics can be found in the section 'Examples of measures and metrics for performance dashboards'.
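To illustrate how some of these core measures could be produced in practice, the sketch below (in Python, assuming pandas is available) derives referral outcome counts and percentages and a count of overdue reviews from a small, hypothetical extract of case records. The column names and values are illustrative only and are not a prescribed schema.

```python
# Minimal sketch: deriving core measures from a hypothetical case record extract.
# Column names and values are illustrative only.
import pandas as pd

records = pd.DataFrame({
    "referral_id": [1, 2, 3, 4, 5],
    "referral_outcome": ["Care Act assessment", "Signposting",
                         "Care Act assessment", "Signposting", "No further action"],
    "review_due_date": pd.to_datetime(
        ["2024-10-01", "2024-11-15", "2024-12-01", "2025-01-10", "2024-09-20"]),
    "review_completed_date": pd.to_datetime(
        ["2024-09-28", None, "2024-12-05", None, None]),
})

reporting_date = pd.Timestamp("2024-12-31")

# Number and percentage of each referral outcome
outcome_counts = records["referral_outcome"].value_counts()
outcome_pct = (outcome_counts / len(records) * 100).round(1)

# Overdue reviews: due on or before the reporting date but not yet completed
overdue = records[
    (records["review_due_date"] <= reporting_date)
    & (records["review_completed_date"].isna())
]

print(pd.DataFrame({"count": outcome_counts, "percent": outcome_pct}))
print(f"Overdue reviews at {reporting_date.date()}: {len(overdue)}")
```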

Measuring changes and trends

A fundamental part of mental health performance reporting is the ability to measure changes or trends over time to identify potential issues or blockages which may need further investigation, or potential capacity and demand challenges.

The measures and metrics identified in the section above should be reported on a monthly or quarterly basis and presented in a way that allows changes over time to be understood, for example in a dashboard or using clear graphs. Operations, commissioning and/or performance colleagues should provide a narrative to explain potential reasons for changes over time; this narrative should be triangulated with information and data from other sources as widely as possible to build a rich and accurate picture of what is happening in the local area. Some examples of how this can be presented, and trends understood, can be found in 'Examples of measures and metrics for performance dashboards'.
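As one illustration of how changes over time can be made visible, the sketch below (Python, assuming pandas and matplotlib are available) aggregates dated referral records into a monthly count and plots it as a simple bar chart. The column names and data are hypothetical; the same approach extends to any of the measures above, with the accompanying narrative explaining notable peaks or dips.

```python
# Minimal sketch: turning dated referral records into a monthly trend chart.
# Column names and data are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

referrals = pd.DataFrame({
    "referral_date": pd.to_datetime(
        ["2024-01-05", "2024-01-20", "2024-02-11", "2024-03-02",
         "2024-03-15", "2024-03-28", "2024-04-09"]),
    "source": ["Self-referral", "GP", "Family", "GP", "Self-referral", "VCSE", "GP"],
})

# Count referrals per calendar month so changes over time are visible at a glance
monthly = (referrals["referral_date"]
           .dt.to_period("M")
           .value_counts()
           .sort_index())

ax = monthly.plot(kind="bar", title="Mental health referrals per month")
ax.set_xlabel("Month")
ax.set_ylabel("Number of referrals")
plt.tight_layout()
plt.show()
```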

Data dashboards

Producing a data dashboard using data visualisation software is an effective way of presenting data in an accessible way. This is a tool used by councils across the UK, although the capability and capacity to develop dashboards varies from area to area. It may be necessary to upskill teams or seek short-term support to develop or enhance local reporting dashboards. Council corporate services should work with mental health leads, where necessary, to develop what is required, especially where those leads are integrated with NHS teams and may receive more arms-length support from council support services.

Case study example: Essex County Council has produced a comprehensive dashboard for monthly mental health performance reporting. Contact [email protected] for the full Essex dashboard.

Qualitative reporting

To bring the experiences of people who use or deliver services to life, the data dashboard can be accompanied by a qualitative report outlining stories or case studies, highlighting key themes to identify what is working well, and any areas for improvement in process or practice.

Case study example: North Yorkshire Council hold regular meetings across the county with residents with lived experience of mental health challenges. This is in addition to several other resident forums and regular “ask the director” sessions. Capturing and reporting on insights from regular engagement with local people can be a powerful way to enhance performance reporting.

Developing a good mental health performance report: moving to advanced reporting

Wider data sources

Data from the following local partners may be useful to monitor the wider contextual data that impacts on mental health performance:

  • Public health teams
  • Integrated care board (ICB)
  • Police services or police and crime commissioners
  • Children’s services
  • VCSE organisations
  • Adults and children’s commissioning teams
  • Housing services (internal and local housing providers)

Advanced measures and metrics

The additional measures and metrics that could be included in a more advanced mental health performance report are outlined below. See the section on 'Examples of advanced reporting metrics' for examples of how these metrics could be reported and presented.

Demographic comparisons

The number of referrals, assessments and outcomes, including demographic breakdown and comparison (for example, by age, ethnicity or gender).
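A minimal sketch of how such a breakdown might be produced is shown below (Python, assuming pandas and an extract of assessment records); the groupings, column names and values are illustrative only.

```python
# Minimal sketch: assessment outcomes broken down and compared by demographic group.
# Groupings, column names and values are illustrative only.
import pandas as pd

assessments = pd.DataFrame({
    "age_band": ["18-25", "26-64", "65+", "18-25", "26-64", "65+", "26-64"],
    "outcome": ["Support provided", "No further action", "Support provided",
                "Support provided", "Support provided", "No further action",
                "Support provided"],
})

# Counts, and row percentages, of each outcome within each demographic group
counts = pd.crosstab(assessments["age_band"], assessments["outcome"])
percentages = pd.crosstab(assessments["age_band"], assessments["outcome"],
                          normalize="index").mul(100).round(1)

print(counts)
print(percentages)
```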

Staff capacity and resilience, and practice quality

Reporting to cover staff in a range of capacities including social workers, AMHPs and those in social supervision roles:

  • staff caseload tracking, including caseload numbers and the percentage of remaining capacity per team or area (a worked sketch follows this list)
  • the number of staff vacancies and associated costs
  • tracking of average duration in post to measure workforce resilience
  • the number of agency staff in post and tracking of changes in numbers of agency staff over time, with narrative to describe the reasons for this
  • tracking of average duration of agency staff contracts
  • qualitative measures to track whether staff feel safe and supported in their roles, able to provide care with compassion and build continuous relationships with those they support (for example, analysis of feedback from staff surveys or (exit) interviews with staff)
  • the number and percentage of assessments from practice reviews that meet quality standards including legislative compliance.
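As a worked example of caseload tracking, the sketch below (Python with pandas) expresses remaining capacity per team as a percentage of an assumed maximum caseload per worker. The team names, figures and maximum caseload are illustrative assumptions only, not a recommended standard.

```python
# Minimal sketch: caseload numbers and remaining capacity per team.
# Team names, caseload figures and the maximum caseload are illustrative assumptions.
import pandas as pd

MAX_CASES_PER_WORKER = 25  # illustrative assumption, to be agreed locally

teams = pd.DataFrame({
    "team": ["North", "South", "East"],
    "workers_in_post": [6, 4, 5],
    "open_cases": [130, 88, 112],
})

teams["capacity"] = teams["workers_in_post"] * MAX_CASES_PER_WORKER
teams["remaining_capacity_pct"] = (
    (teams["capacity"] - teams["open_cases"]) / teams["capacity"] * 100
).round(1)

print(teams[["team", "open_cases", "capacity", "remaining_capacity_pct"]])
```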

Hospital bed and placement capacity and waits

  • Hospital bed availability, bed waits, out of area bed usage and reasons for delays; the number of beds available, waiting times and the number of each reason for delay (a worked sketch follows this list).
  • Community placement availability, waiting times, out of area usage and reasons for delays (including rehab, residential care and supported living); number of placements available, waiting times and number of each reason for delay.
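The sketch below (Python with pandas) shows one way bed wait durations, out of area usage and reasons for delay could be derived from a hypothetical log of admission requests; the field names and delay categories are illustrative only.

```python
# Minimal sketch: bed wait durations and reasons for delay from a hypothetical
# admissions log. Field names and delay categories are illustrative only.
import pandas as pd

admissions = pd.DataFrame({
    "bed_requested": pd.to_datetime(["2024-11-01", "2024-11-03", "2024-11-10"]),
    "admitted":      pd.to_datetime(["2024-11-02", "2024-11-07", "2024-11-18"]),
    "delay_reason":  [None, "No available bed", "No available transport"],
    "out_of_area":   [False, False, True],
})

# Wait in days between the bed being requested and the person being admitted
admissions["wait_days"] = (admissions["admitted"] - admissions["bed_requested"]).dt.days

print("Average bed wait (days):", round(admissions["wait_days"].mean(), 1))
print("Out of area admissions:", int(admissions["out_of_area"].sum()))
print(admissions["delay_reason"].value_counts())
```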

Tracking of pathways and spend

  • Trends in types of assessment, tracking number of Care Act and MHA assessments over time.
  • Tracking of pathways following detention under the MHA.
  • Tracking of onward pathways and outcomes for people under s117 arrangements (number and percentage on each pathway and with each outcome).
  • Tracking of costs related to s117 and how these costs are shared between the council and the NHS, with access to comparable data both regionally and nationally to support benchmarking.
  • Commissioned care packages and costs, including how these costs are shared between the council and NHS (for example, number of care packages and average cost).

Contextual information from system partners

  • Direct Payment use and outcomes; the number of Direct Payment recipients and the number and percentage of each type of outcome (for example, social inclusion).
  • Prevalence of common mental health conditions (Joint Strategic Needs Assessment (JSNA) data).
  • Nationally reported performance (for example, referrals for and effectiveness of Talking Therapies).
  • Transitions and needs of children on transition pathways.
  • The number of child and adolescent mental health services (CAMHS) referrals, the number of assessments and the outcomes of assessments, to understand emerging needs.
  • Performance of commissioned services (for example, VCSE preventative services).
  • Police data (for example, patterns in mental health related callouts, number of times place of safety is used and settings used as a place of safety, Right Care Right Person (RCRP) metrics).

Wider, contextual performance

Completing deep dives

There may be certain elements of performance or practice that cannot be, or are not, reported on as standard (because gathering the data requires too much time and capacity) but that are still important for providing assurance that the council is meeting its statutory duties. To address this, regular (for example, quarterly or annual) ‘deep dives’ into these areas should be considered. The local approaches described below illustrate some areas that may be suitable for a ‘deep dive’.


Local approaches

Bespoke reporting

Not all local areas will have the same areas of focus, and it may be necessary for some systems to focus on specific issues (for example, delayed discharge). This may require dedicated deep dives into these issues for set periods of time before re-integrating them into business-as-usual reporting.

Case study example: Lancashire County Council tracks bed wait duration, delay reasons and out of area bed usage on a monthly basis in its MHA assessment reporting in order to identify issues, and uses a partnership approach with the NHS trust to address them. Contact [email protected] for the full Lancashire dashboard.

Quality spot checks

Other areas have developed local processes for regular deep dives to support their quality improvement processes. The results can be summarised and reported back for system level oversight.

Case study example: North Yorkshire Council have implemented a process in which social work team managers randomly identify a case for review and complete an analysis of the case notes to assess the extent to which person-centred, strengths-based and Care Act or Mental Health Act compliant practice was used (measured on a scale of 1 to 5). Results of case reviews against this ‘practice quality assessment tool’ are reported in regular mental health performance reports. Contact [email protected] for the full North Yorkshire dashboard.
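A minimal sketch of how ratings of this kind could be summarised for a performance report is shown below (Python with pandas); the ‘good or above’ threshold, team names and ratings are illustrative assumptions, not North Yorkshire’s actual method.

```python
# Minimal sketch: summarising 1-5 practice quality ratings into the percentage
# of reviewed cases rated good or above, per team. All values are illustrative.
import pandas as pd

GOOD_THRESHOLD = 4  # illustrative assumption: ratings of 4 or 5 count as good or above

reviews = pd.DataFrame({
    "team":   ["North", "North", "South", "South", "East", "East"],
    "rating": [5, 3, 4, 4, 2, 5],
})

summary = (reviews
           .assign(good_or_above=reviews["rating"] >= GOOD_THRESHOLD)
           .groupby("team")["good_or_above"]
           .mean()
           .mul(100)
           .round(1)
           .rename("percent_good_or_above"))

print(summary)
```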

Analysis of the numbers of referrals, assessments and outcomes against demographics

Many areas will be aware of their hard to reach or left behind communities and may wish to develop bespoke reporting that allows them to track who is using services and surface any trends around communities not accessing support.

Case study example: Dudley Metropolitan Borough Council track the demographics of people who use services in their mental health data dashboard (contact [email protected] to receive the full dashboard) and have included functionality to compare outcomes by demographic group, to understand who is using services and whether there are any differences in outcomes between groups.

Putting mental health performance reporting into practice

Stakeholders 

There are a range of stakeholders in the mental health system who will be important to involve when establishing good mental health performance reporting. Key stakeholders can be split between those considered fundamental and those considered advanced, depending on where you are on your development journey:

  • Fundamental: principal social worker, performance leads, AMHP service lead, commissioning lead, NHS trust performance lead.
  • Advanced: people with lived experience, police, digital leads, public health, elected members, primary care leads, children’s operations and commissioning leads, and ICB commissioning leads.


Establishing governance processes

To fully embed mental health performance reporting and establish rigour around the process, it is important to ensure that it is core to day-to-day monitoring and operations. Mental health performance reporting should be regularly reviewed by operational, commissioning and performance leads, with key findings reported to the DASS and senior leadership team. Reporting lines into statutory forums such as the Health and Wellbeing Board and corporate management team should also be built in.

In areas with integrated arrangements with health, an integrated governance meeting could be established, or performance oversight could be built into existing governance (for example, a joint commissioning board).

Methods of reporting

It is important to work with identified stakeholders to establish reporting methods and frequency and identify a lead who will be responsible for pulling together data and the supporting narrative, and regularly presenting to an agreed governance group.

Examples of measures and metrics for performance dashboards

 

Example 1: Referrals by source

A bar graph showing the number of mental health referrals by source: family, friends or community; self-referral; voluntary sector; commissioned provider; or adult social care front door.

Example 2: Referrals by year

A bar chart showing the average number of mental health referrals per year from 2019 to 2024.

Example 3: Outcome of referral

A bar graph showing the outcome of referrals: information given; referred for assessment; NHS services; VCSE services; or other. Each outcome is shown as a percentage.

Example 4: Timeliness of assessment

A bar graph showing the timeliness of mental health assessments per month in 2024, split into overdue (red), overdue (amber) and on time.

Example 5: Tracking safeguarding enquiries

A bar graph showing the number of safeguarding concerns raised and the number of concerns leading to enquiries per month from November 2023 to November 2024.

Examples of advanced reporting metrics

Example 1: Bed wait duration

A bar chart showing bed wait duration in November 2024.

Example 2: Out of area beds

A bar graph showing the number of out of area beds used per month from November 2023 to November 2024.

Example 3: Reasons for delay in admission

A bar graph showing reasons for delay in admission per month. The reasons are no available transport, no available trust doctor and no available AMHP.

Example 4: Practice quality

A bar graph showing the percentage of quality assessments rated good or above for each adult social care team.

Example 5: Placement costs

A graph plotting the average weekly cost of mental health placements per month as a line, with the number of placements shown as bars, from November 2023 to November 2024.

Example 6: Social worker capacity

A bar graph showing the average social worker capacity from November 2023 to November 2024.