Moderator: Welcome to the latest episode of the Nudges for Social Good podcast, from the Local Government Association. My name is Rhian Gladman, and I manage the behavioural change programme here at the LGA. So, the aim of this podcast series, as we've said before, is to really demystify behavioural insights and behavioural change, and provide real practical learning points and examples for you, in councils, to take away and try out on your own patch and in your own services. So, today, I'm joined by Hazel Wright from the Behavioural Insights Team. Hi there, Hazel. How are you?
Hazel Wright: Hello, very well thank you. Thanks for having me.
Moderator: So, to give a bit of context here, it's raining a lot where I am today, so you might hear some rain in the background. Our listeners might get that. Are you struggling with the weather where you are as well?
Hazel Wright: Yes. It's pretty dreary, but hopefully it won't interfere too much.
Moderator: A conversation about behavioural change and behavioural insights is going to cheer us all up, I'm sure. So, good stuff. Thanks for being with us today, I really appreciate your time. So, just to kick us off really, can you introduce yourself and your role at the Behavioural Insights Team please?
Hazel Wright: Yes, so I lead the local government team at Behavioural Insights, which is based across Manchester and London. My own academic background is really in experimental psychology, and later, as a postgrad, in psychology and economics. So, when I first encountered BIT in 2014, I was actually working at the Cabinet Office in the Cities and Local Growth Unit, and it was a real revelation to me that there were opportunities to apply behavioural science to public policy. I've always been really excited about the work that we do at BIT.
But, in terms of the team, most of what we do in the local government team is focused on vulnerable communities, so that's those who are homeless, those who are in receipt of social care, and those who work as carers, or informal carers as well. So, since a lot of the most, kind of, important policy areas like public health and social care are delivered locally, my team does a lot of work with local authorities across the UK to embed the use of behavioural science at that level.
Moderator: So, lots of experience there, and I think it's interesting that you've got that central government perspective, and now you're working more with local government and applying your academic background to that as well. So, great stuff. So, as I said in the introduction there, we're really looking to demystify behavioural change and behavioural insights. There are a lot of, kind of, academic, jargony words around it as well, and we really want to be very practical and pragmatic with this podcast.
So, we've had other councils who have been on the podcast and have shared their example of their behavioural insights intervention or nudge and what happened. But, what I want to do today is for us to really strip it back even further, and just to really put ourselves in the position of a very busy councillor or local government officer, who really just doesn't quite know where to start with setting up such a behavioural insights project. So, if we can really strip it back. You know, where should a councillor or officer begin when developing a behavioural insights intervention?
Hazel Wright: So, before we reach the stage where we start developing interventions and solutions, a really good place for a local authority to start, I think, is by looking at the challenge or objective in more detail. The question that is useful to ask at this stage is, what is it that we actually want people to do? So, that's normally where my team would kick off a project.
So, we have a methodology called TEST. The first part of this is 'target', and that involves homing in on the target behaviour, so what is the specific behaviour we want to change, and what is it we want people to do? So, as an example, you might start with a problem that sounds pretty clear, so say you want to increase the number of people taking up bowel cancer screening, and maybe we can be really clear immediately about who in the local area we need to increase uptake for. So, let's say it's gentlemen over 50, but the goal itself isn't detailed enough in terms of the behaviour, so what we need them to do. So, increasing the number of people taking up bowel cancer screening, in this case, it's really useful to know that the screening process is done at home, and that it involves the person collecting and sending back three separate stool samples. So, now we're looking at a target behaviour that is actually to increase the number of males in their 50s who collect three stool samples and return them by post. So, I'm not going to dwell on that one, but it's fair to say that when you get down to that level of specificity in terms of the behaviour, it potentially brings you to a very different set of solutions down the line. So, it's always worth breaking down the aim into something specific.
As a local authority example, one of your goals, for example, might be to look at increasing the recruitment of foster carers. So, there are a number of different behaviours involved. It might be that you need people to attend information sessions, or it might be that you need potential foster carers to remember to be in for the home visit, or that you actually need them to complete the formal application.
So, I think, as a starting point for a local authority, the key to developing an achievable target, before starting on solutions, is firstly thinking about the who, so whose behaviour is it we're trying to change? What specifically do we need them to do, and ideally by when? That is, I think, also a really useful conversation for councils to have with their operational teams and their delivery partners, because not only will those teams have insights into what the key behaviours are, it's a really good way of bringing everybody onto the same page in terms of understanding what, as a team, it is you're trying to achieve.
Moderator: Yes, I really like that, I like the, yes, the who, what, when. That's a really nice, you know, little principle to take away, isn't it, and keep in our minds as you're setting out on this journey. I guess because the council will see their part, potentially, of the service, and then maybe other partners that are seeing different parts, so if you can all come together holistically and look at that whole end to end service and ask yourself the, you know, who, what, when questions together, I think that's a really good place to start.
So, we understand the behaviour at the top level. We’ve drilled into it, you know, at different levels, as you set out beautifully in those examples there. We talked to our partners about that, we've done the who, what, when part. So, how can local authorities start to gain insights into what drives the current behaviour at the moment that we're looking to change?
Hazel Wright: Yes, this is the bit that we find really interesting. So, one of the guiding principles of behavioural science is really to understand why people actually make decisions. This stage of the work is really about getting under the bonnet of the problem that you want to solve, and that's really important because councils and council workers might have an intuition for why they think people are doing what they're doing, but often our intuitions are wrong. So, we might anticipate behaviour based on what we think people should do, which is very different, or can be very different, to how they behave in practice.
So, this next stage is really about exploration, so it's about understanding the context in which the behaviour happens. So, we know, for example, that environmental factors have a really big impact on behaviour, so the context and the details really matter. So, I would suggest putting a strong focus on observing systems and processes and behaviour in practice, and understanding behaviours in their given environment, wherever possible.
There are lots of different methods that local authorities can use, and that depends on, you know, the timescales, the resources, and the behaviour they're targeting. But, one very good one is to step into the shoes of the user. So, that involves actually moving through that process yourself, if you can, or observing what happens in practice, where you can, and that gives you a clearer view of the different steps that form a process, or that lead to a behaviour. It gives you a sense of how it feels, and where things in the environment like other people or physical situations might impact that behaviour. So, just as an example, members of my team have done all sorts of things, so they've gone on collection rounds with bin men really early in the morning.
Moderator: That was one of ours, wasn't it, yes, that was LGA, with Westminster.
Hazel Wright: Yes, a couple of members of our team got that one. That was really interesting and got some really useful insights about all sorts of things, including recycling and the behaviours around that. They've signed up for benefits in the past, so they've been through the whole system from the front-end job centre application all the way through. They've shadowed health visitors on home visits, and all of these things really embed you in that context, so you pick up on things that you wouldn't notice otherwise.
Then, I think the second thing is it's also really helpful to think about the data that you have available. So, that can tell you a lot about what's happening, and it might give you a bit of a, kind of, pointer towards a pattern. So, with foster care applications, for example, with one local authority we had a look at their data with them, and we could see from what they collected that there was this quite substantial 20% drop-off in applications after a home visit, which is quite late in the process. So, we knew that even after getting all the way through that process, there was still something happening for potential foster carers around the home visit, between that and the application stage, that was affecting their behaviour.
So, what we would do then, and another thing councils could do, is if you identify that point in the process where there is some friction or there is some drop off, you can then move to something like interviews which are a really useful way of getting at some of the detailed questions as to why there's a challenge there.
So, interviews are real social interactions, so if you can make an interview feel comfortable, you're likely to get really detailed, thoughtful answers that are really honest, that explain why people do what they do, or why they make the choices they make.
Moderator: I just want to pick up on that point about interviews and focus groups with those whose behaviour you would like to change. Obviously, now everything with COVID we're doing it all online, and, you know, we're not convening those groups of people anymore. Can you talk a bit about how, if a council wanted to, you know, do those either individual interviews or run focus groups, they could start to do that in this online world? Have you got any examples of that sort of work?
Hazel Wright: Yes, we have moved, in some of our projects, to doing some of these things remotely, and it is challenging, so I think it does put you in a very different environment for gathering that kind of information, because part of the benefit of doing interviews, and part of the benefit of doing focus groups, is that you can read the room, you can read social signals. If you're asking a question that's a bit difficult, then the interviewee can give you non-verbal cues that maybe you need to move in a different direction. So, COVID has definitely produced some, kind of, challenges in terms of collecting that kind of information.
But, equally, there can be benefits as well. So, I think one of the challenges with interviews is, or it can be, that people feel a bit worried or uncomfortable about talking about things that are difficult, or there might be a social desirability effect, so they want to give you an answer that they feel you want to hear, and you don't really want to influence them in that way. So, there are options for telephone interviews that we've found actually seem to increase the feeling of, not quite anonymity, but distance from the interviewer, that allow people to provide a bit more depth and a bit more insight into how they're feeling and what they've done, because they've got that distance.
So, it's a strange thing where some of those methods have worked perhaps a bit better than we'd expected. But, it is difficult, and I think there are other methods that you can use as well, including moving to things like anonymous surveys. So, if you feel you've got a pretty specific idea of where you need to ask questions around a specific topic, you can try something like that instead as a quick way of gathering lots of in-depth information and perspectives from people, without putting them in a situation where you're trying to do an interview at distance.
Moderator: Okay, so we've talked about, just to recap there, you can observe the behaviour yourself as officers or councillors, you know, walk in the shoes of residents who are interacting with your services. You can do interviews, we can do, sort of, online interviews, telephone interviews, and you were talking there about a formal, anonymised survey as well.
So, there are some options there for councils to take that next step, and again, you would suggest working with partners as much as you can with this stuff?
Hazel Wright: Yes, absolutely, because, especially, we do a lot of work with vulnerable groups, so it's really useful to get the perspective of partners in terms of how to approach conversations with groups that have particular vulnerabilities. I would say, where possible, it is really useful, and this is obviously a difficulty with COVID, but it is really valuable to do observations. We've done a lot of work where some of the things that you pick up in observations are so small and so subtle, but they have such a disproportionate impact on behaviour, that it's really interesting.
So, for example, one of the projects we did was looking at the integration of a couple of teams that were co-located, so it was health and social care workers in the community. In theory, co-location means they should be interacting more, but we spent, you know, the day with one of these teams and we saw that actually, what they do is they go out and they spend most of their time in residents' homes, and probably now they're spending most of their time separately finding other ways to contact residents, perhaps over the phone, so that means they're not actually interacting that much day to day, and they're only really coming together when they're doing things like handover meetings. That's something that we could then confirm through in-depth interviews with the professionals.
But, it's also things like, if you are able to be in the same space, we realised, for example, that when they were coming into the building, these two teams, one team would turn left at the entrance and the other team would turn right. So, they go to different ends of the office, so they were co-located on paper, but, again, it didn't mean they were coming together. Everything in the building was segregated, down to the milk that they had in their fridge. So, it is a really good way to pick up these, kind of, small, subtle things, and if you have the benefit of having time to do interviews or calls, then you can probe some of these things in a bit more detail as well.
Moderator: I'm really glad you made that point about engagement with vulnerable people, and how we really need to, you know, take care and ensure our interventions and how, you know, we're engaging with people is appropriate, and that we've engaged with partners about it. That's a really important point to make, so I thank you for raising that one.
So, if we think back to our busy councillor, our busy local government officer, what should they then consider when designing the actual intervention or the nudge they are going to carry out to try and change this behaviour?
Hazel Wright: So, there's lots of different methods for going about your final design for what it is that you want to use as a solution, but I think, however you approach it, the most important thing is to make a link between the solution you're designing, and what the actual behavioural barriers are, because that's ultimately what your solution is trying to address, and that's why the explore work is so important.
So, as an example, for a local authority who might be interested in preventative measures to ease the demand on social care services, they might be looking at, for example, the uptake of assistive technology. So, that's things like support bars and ramps, lifts and alarms, so it's preventing somebody from having an accident and then having to move into social care.
We did a project like that with one council, where we were looking at designing a solution to get people to apply for this assistive tech and have it installed, and the barriers that surfaced in the explore phase were really interesting. So, one of them was this, kind of, choice overload, so that's people switching off because they were given too many options that were very complicated, and it was too hard to distinguish between the options for things they could ask for. But, really important was that we found there was a, kind of, social stigma around having things like support bars and emergency alarms installed. They're not subtle devices, they're not particularly aesthetically pleasing, and people were also worried that, because they stood out, they would be seen as having a disability, and they felt there was a lot of stigma around that label as well.
So, linking the solution to the barriers, one of the things we decided to do to tackle that stigma was to use social norms in a pamphlet to target people who we felt were in the at-risk group for things like falls. What social norms do is highlight that the use of devices like that, in the local area, was actually really widespread. So, rather than being, kind of, deviant for having them, you're actually part of a larger group of people, locally, who share the need for those things, so you aren't standing out in that way, and we also let them know what the most popular options were. Then, also to, kind of, again, get at that feeling of stigma, we considered who the most effective messenger was. So, we included a quote from somebody like them, a local resident, talking about how his pendant alarm basically made him feel more secure. Both of those things were designed to tackle and speak to the barriers that we picked up in that explore phase.
So, that intervention did increase referrals to the assessments for assistive tech, and it did increase installations as well. So, I think the headline is, it's just really important to ground the solutions in the barriers that you find.
Moderator: So, just to unpack the social norms idea, because this is really powerful. The idea behind this is that we are social creatures, we don't really want to stand out from the herd. This goes back, you know, to prehistoric stuff, you know, we want to fit in, and we want to, you know, not really be the odd one out. So, I think social norms, you know, it's a really powerful tool for councils to use, isn't it, to appeal to that.
Also, the other point around messenger, I think that's really important as well, is to really think about who is going to deliver that message and who you can use to be the messenger, and in this case you were saying it might not have been who the council would have traditionally put as the messenger on the pamphlet, is that fair to say?
Hazel Wright: Yes, exactly. It's leveraging those social norms to their best effect, and one way of making a social norm really powerful is to make the person you're using as a messenger, or the norm you're using as a comparison, as similar to that target group as possible. Similar age range, similar demographic characteristics, and similar needs, which is what we did in this case.
You're right to say it's not often that a council would choose somebody who is more like a resident than a figure of authority, but, actually, for this group, somebody like them with similar needs to them is a more powerful messenger than hearing it from somebody that they might recognise, say in local government, and that is an important insight.
Moderator: I guess, as well, with the assistive technology stuff, there may be a belief that this is for older residents of the population, and that's not always the case, is it? You will have people of working age, different ages, different demographics, that we need to appeal to, to take up these technologies to help them in their lives. So, I think by picking somebody different to give that message, you can appeal to the types of people who traditionally might have thought, 'that's not for me'.
Hazel Wright: Yes, exactly, and part of the, kind of, mechanism of social norms is really to reveal what other people are doing. So, it might be, in this case, for example, that people feel like they're somehow deviant or they're standing out because they might have a mobility challenge or they might have a disability. But, actually, making sure that they understand that there are lots of people that use this kind of technology and they're part of a much bigger community, and those people are not just elderly people. They're people your age and they're people my age, and the needs are really varied, is a really important thing in terms of making people feel much more comfortable and less like there is a stigma attached to it.
Moderator: So, in that intervention design, we've talked about social norms, we've talked about the messenger, are we co-designing that as well with our target audience? Can you give some examples of that please?
Hazel Wright: Yes, so I think the most effective intervention designs are co-designed, where you can, it's not always possible, with the group of people you're hoping to support, but they're also developed in partnership with people who perhaps, from a, kind of, strategic perspective, you wouldn't normally think to include. So, for local authorities, that is often actually their front-line staff.
So, the next, kind of, best thing to actually getting input from your target group, which is often a bit of a challenge, is working with people on the front line that come into contact with your target group. This links to one really important factor with intervention development, which is timing. So, we talk a lot about the best time to introduce an intervention, and if you're a local authority, the most timely point to introduce an intervention is often one that your front-line teams will recognise where you won't. So, it's front-line teams, and people that have actually done this, that will know that, for example, a social landlord will get a notification when one of their tenants applies for universal credit. That gives you an insight into when it might be the best time to reach out and contact that person, because they're moving into a different financial situation.
Equally, if one of the things you are interested in is, as we've said, supporting people to stay in their own homes for longer, it might be really useful to consider that moment when residents submit an application to have their wheelie bin taken out for them, because they can't do it for themselves anymore. It's your front-line teams and your residents who will know that that is one of the first things people might do when they start coming into a situation where they're developing mobility issues. So, I'd really recommend bringing in the expertise of, yes, whenever you can, your target group, and whenever you can, your front-line teams who are exposed to that side of the service, because they'll know where those little moments are where you can reach people at the right time, and you can identify them.
Moderator: I think timeliness, that's such an important part of your intervention design, isn't it? It's understanding where in the customer journey is the right time. I guess another example from our programme is the fizzy drinks trial we did with Liverpool City Council, with the hospital cafes, and actually, you know, the decision, through engagement with front-line staff and service users as well, was taken to put red stop signs actually on the shelves where the fizzy drinks were, so that as the customer was making that choice, at the point of the choice being made, they could then decide, 'oh, that one's very sugary, that one's not'. Rather than maybe picking up a drink, going to the till and then being told, actually, by the way, that's a high-sugar drink. That would have been the wrong point in that customer journey, wouldn't it? It needed to be at the point the decision is made visually by the customer, so I just think, yes, that's a really important point to draw out.
So, okay. We've got our intervention designed. This is definitely the 64-million-dollar question: what tips do you have for measuring the impact of our trial? So, how will we know that it's worked?
Hazel Wright: Yes. So, I think this is really important, and this is important as well because it's something that needs to be considered almost as early as possible. So, even before you reach the stage where you're looking at designing a solution, you want to know that there's a way you can tell whether or not what you've done has worked.
So, one thing, I think, is to ask very early on, okay, how will we know if this has worked? That is, how will we measure the behaviour we are trying to change? Is it measurable? So, we've talked about increasing the uptake of assistive technology. You know, who actually records and holds that information? Is it recorded? You know, can you access that for your area, and is it detailed enough to show you the numbers of applications or the numbers of installations?
So, is the behaviour observable, is it something that's already being recorded, or is there something you need to put in place to monitor that behaviour and record it, so that you can tell if it's changed as a result of what you have done? That is definitely a first-stage conversation to have with whoever it is that you feel owns that data, or might be able to put some of those measurements in place.
Moderator: So, to make that really clear, right at the beginning when we're identifying what behaviour it is we want to change, that's when you need to think, can this behaviour be measured? Who measures that, where's the data, how can we access it? Like you need to really get that clear right at the first stage?
Hazel Wright: Yes, exactly. It's part of a conversation about the feasibility of what you're trying to do. So, one of the things you obviously want to do is that you want to have an impact, and you want to know that your solution is going to drive change in the right direction, but you need what you're doing to be feasible in terms of understanding that impact.
So, yes, it's a first stage thing. Not being able to measure it at the front end doesn't necessarily mean there's not something you could put in place at some point in the process to make sure that data is collected, but that is often more effortful. So, it might be that that informs your decision about the kind of changes that you're trying to drive because it might be that there are other options for things that you could observe, that it would be really easy to collect data and it would be really easy to measure, so you, kind of, pivot your goal towards something like that instead, which can save a lot of time later down the line.
Moderator: Do you have an example of where you've set up that approach to measure the impact?
Hazel Wright: Yes. So, one of the things we have talked about is making sure that there's data available. One of the challenges with that is that there's not always data available that perfectly matches the behaviour that we're trying to change, so it's not always possible to have a direct assessment of that. So, there are other options in terms of understanding other data that are being collected that might be a really useful proxy. In one case my team is working on at the moment, quite a large-scale project focusing on children currently being exploited, it's a very vulnerable group and it's very difficult to collect data on exploitation.
So, some of the contexts in which some of these things happen are very complicated, and the data are very sensitive. So, it's very difficult to get an accurate read on whether or not anything that is happening locally is reducing the likelihood that those children are being exploited. So, one of the things that we are doing is looking at other outcome measures that we treat as proxies. One of them might be, for example, it's really interesting that local authorities and some of the services that support these young people collect data on things like disclosures. Disclosures are when a young person might reveal to the person they're working with, so their trusted practitioner, that something has happened which links to exploitation, or that something has happened that is a, kind of, negative experience they have had with an adult in their area.
So, though that's not a perfect measure of exploitation, it's data that's being collected by partners as part of the service that they're delivering, so they can improve their service, but it's actually also really useful as an indicator for something like a trial, to allow us to measure the impact of something that's a bit more complicated to get at otherwise.
So, yes, I would say, where the data is not available, it's useful to look at other things that are being collected in that area that might serve as a good enough proxy to give you a sense of whether behaviour is changing. Or, as I think we've mentioned previously, things like surveys can be a, kind of, quick and dirty measure for getting at people's self-reported responses, and whether they've done what you've asked them to do, or what you're interested in them doing. But, we'd always suggest looking at what data is available first and trying to navigate whether it's possible to use what you've already got, which is a lot less resource-intensive than trying to collect something from scratch.
Moderator: Yes, I think that's a really key point, isn't it? Where, you know, if you're collecting something that could be used as proxy measure anyway, or just see what the data is that's actually available, rather than trying to start something from scratch, I think that's a key point for the listeners to hear there. So, how important is it to test these interventions? Could we not just run a trial and go from there, and then roll it out more widely?
Hazel Wright: You could do that, you could.
Moderator: Playing devil's advocate there.
Hazel Wright: Yes, I mean, it is important from the perspective that you want to know whether what you're doing has worked, and you want more, hopefully, than an intuitive feel that what you've done has had a positive impact. So, one of the things that's really important is, you know, when you're thinking about putting all this time and effort into developing a behaviourally informed solution, you really want to have a sense of how much of a difference that needs to make for it to be worth rolling something out, or even designing something in the first place.
So, one of the things you want to consider is: what is the size of the change that matters? It's useful to have a bit of information at that point about what's already happening. So, say you want your residents to recycle. If you think about what proportion of your residents already recycle, it might be that only 10% of them do. You might be happy to increase that by a couple of percent, three or four percent, maybe. But if you're shooting for a much bigger effect, if you have a target of seeing 40% of your residents recycling, that's quadruple your baseline, so you're really shooting for something that might not be feasible with what you're designing.
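The baseline-versus-target arithmetic in that recycling example can be sketched in a few lines; the figures (a 10% baseline, a 40% target, a three-point nudge) are the illustrative ones from the conversation, not real programme data:

```python
# Hypothetical sketch of the feasibility check described above,
# using the illustrative figures from the recycling example.
baseline_pct = 10   # share of residents who already recycle
target_pct = 40     # aspirational target

absolute_uplift = target_pct - baseline_pct   # 30 percentage points needed
relative_uplift = target_pct / baseline_pct   # 4.0, i.e. quadruple the baseline

# A realistic nudge moves things by a few percentage points, nowhere near 30:
plausible_nudge_pct = 3
after_nudge_pct = baseline_pct + plausible_nudge_pct   # 13% after the nudge
```

Comparing the 30-point gap to the two-or-three-point change a nudge can realistically deliver is exactly the "is this feasible?" check Hazel describes.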
So, what you want to make sure is that you get the value out of that intervention, the value out of the resource you put into developing it. So, you want to consider: does your intervention have a good chance of getting to the impact that you think is important, and is it realistic to get a change that is meaningful to you? If you think the intervention is good, and it's going to give you a small change, is that enough to be meaningful and worth the time?
So, I think, yes, that's one reason to consider testing because then you've got real proof that you've got a change that you can achieve with your intervention that wouldn't have happened otherwise, and you can really see the scale of that change.
Moderator: I think that's a really important point to bring up. What we're finding with a lot of the behavioural insights trials that we run with councils at the LGA is that the outcome is something like a plus 2% take-up of the service, or a minus 3%. They're quite small, 2-3% changes. But because these are some of our greatest challenges in local government, some of society's greatest challenges with the most vulnerable groups, as you said earlier, behaviour that's been ingrained over many generations, potentially, a nudge of 2% here or 3% there is actually a great result for the council involved. Whereas the instinct would normally be, 'Why can't we be getting 40%, 50%?' That would be a huge thing.
So, I guess that's part of the conversations we're having with councils: that 2-3% is a fantastic achievement in this space, and you can actually evidence, with a robust measure, that it has happened. That's the key, isn't it?
Hazel Wright: Yes, and that gives you the ability to start looking at what else you can do to create a, kind of, additive effect. So, if you can say that there's a causal link between the intervention you've developed and the effect you've seen, even if that effect is really small, scaled across a whole population in one local authority area, that can drive really significant cost savings. But it also gives you another point to start from. You know that you've got one thing that works, and anything else you do in the future to iterate or add to it is hopefully building on that initial finding.
So, in behavioural science, because we're talking about nudges, we are talking about gentle interventions to push behaviour in one direction or another, and there's no reason you can't use those things in concert. So, one intervention you develop might include one nudge that is a social norm, and could also include something that is a reciprocal nudge, where you do something for somebody and they want to do something for you. You can build on the effects of each of those things by including lots of different nudges in one intervention.
So, it's all part of a bigger process of, kind of, iteration and learning, and small gains can really be built on to get you to larger effects over time. But it's about understanding what works and having the evidence for it, so you can build on those initial gains.
Moderator: I guess the importance of knowing that it's worked, even if it is that 2% one way or 3% the other, is that it then gives other councils the confidence to say, 'That's worked in Liverpool, for example. We can pick that up and actually trial it in our own area.' That's what we're really keen to encourage at the LGA, hence this podcast: other councils picking up interventions that have worked elsewhere and trying them out on their own local challenges as well. So, that's another important reason for measuring.
Hazel Wright: Yes, and I mean, it's often not possible in one local authority area to run an experimental trial, let's say a randomised controlled trial, where you randomise people into two separate groups, and one gets the intervention and one doesn't. There are sometimes challenges and reasons why that's just not feasible in some local authority areas. But there are other options for looking at your impact.
So, you can run small-scale pilots that don't give you the same standard of evidence, but do allow you to look at whether there's indicative evidence of your intervention moving something in the right direction. So, if you're a local authority that is able to run a pilot, you're paving the way for the next local authority, who might have the kind of scale, in terms of the number of people they can access, to deliver something more experimental and a bit more robust. You're standing on each other's shoulders, really, and anything you can do to start looking at what the impact might be is useful in terms of gradually building that evidence base for what works locally.
Moderator: Yes. That point about randomised controlled trials is super important, isn't it? With the units of government we're talking about in local government, you've got some very small councils and some bigger-scale councils, and, as you've mentioned, certain target groups, the groups we actually want to engage with, might be quite small, unlike the larger-scale national central government behaviour change trials, for tax for example, where you've got millions of people in a control group. So, I'm really glad you pulled that point out: even if you can't do the gold-standard RCT, there are other things that councils can do to learn from each other and iterate. That's a really important point to pull out.
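The two-arm randomised controlled trial Hazel describes can be sketched in a few lines: randomly split a population into a control group and a treatment group that receives the nudge, then compare take-up between the arms. The resident names, group sizes, and helper function here are illustrative assumptions, not from any real trial:

```python
# Hypothetical sketch of the two-arm RCT design described above.
import random

random.seed(0)  # fixed seed so the illustrative assignment is reproducible

residents = [f"resident_{i}" for i in range(1000)]  # made-up population
random.shuffle(residents)

treatment = set(residents[:500])  # receive the behaviourally informed version
control = set(residents[500:])    # receive the business-as-usual service

def take_up_rate(group, outcomes):
    """Share of a group recorded as doing the target behaviour (1) vs not (0)."""
    return sum(outcomes[r] for r in group) / len(group)
```

Because assignment is random, the difference between `take_up_rate(treatment, outcomes)` and `take_up_rate(control, outcomes)` estimates the intervention's effect, which is exactly the causal evidence discussed earlier. The small-scale pilots Hazel mentions drop the control arm and so give only indicative, not causal, evidence.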
Okay, Hazel. So, as we're, sort of, coming towards the end of our conversation today, what we like to leave our listeners with are some top tips that they can take away and implement on their own behavioural insights projects. We want it to be nice and practical and pragmatic for them, so what are the top three tips you'd like to leave the listeners with today?
Hazel Wright: Okay, so the first one I'd say is, make sure you're specific about the behaviour that you want to change. So, think about, specifically, what it is that you actually need people to do that gets you to your outcome, whether that be filling in a form, whether that be turning up to something, whether that be stopping something they're already doing, really drill down to the specifics.
The second thing I'd say is, make sure that you include, where possible, the perspectives of a really broad range of people who understand the services, or are in receipt of the services, linked to your behaviour. So, wherever your behaviour is happening, make sure you include your front-line workers or practitioners and your delivery teams in the design of the solution. And where possible, it's really useful to get, through exploratory work or anything else you can do, the perspectives of the people who are actually going to receive the intervention, the people whose behaviour you need to change.
Then, the third thing I would say is, make use of free resources. There are lots of places where you can pick up new findings: BIT publish findings on our website, and the What Works Centre for Local Economic Growth have things like evaluation toolkits, publish case studies and have demonstrator projects, and they're really worth a read. I think one of the beauties of behavioural science is that whatever you learn, a nudge that works in one context, which might be council tax, might also work in another context, like recycling. So, there's a huge amount to learn, and I think it's a really fascinating area where there's a lot of value in building this, kind of, capability into teams. I hope that it's as enjoyable for everybody else as it is for us.
Moderator: Hazel, that's fantastic. Thank you so much for your time today on this slightly drizzly day, really appreciate it. It's been a fascinating conversation. Thank you very much.
Hazel Wright: Thank you.
Moderator: So, to pick up on Hazel's final point there, if you'd like to learn more about behavioural insights projects that you can try out in your council, then please do visit our website at www.local.gov.uk and search for 'behavioural insights', because we have a host of other nudges for social good that you can learn from and implement in your local area. Please do share the podcast with your colleagues and friends, and many thanks for listening.