We know that a solid evidence base is essential for policymaking across all areas. The What Works Centre for Local Economic Growth examines which policies are most effective at increasing local economic growth and supports policymakers in producing more high-quality evidence to better guide future decision making and improve policy effectiveness. Professor Henry Overman, director of the Centre, examines the progress so far.
- Reviewing the available evidence can help identify aspects of policy design that improve policy effectiveness
- Some interventions examined simply aren’t good at improving local economic growth, although these policies may deliver on alternative policy objectives
- To drive a step change in policy effectiveness, the Centre encourages policymakers to embed the use of evidence into their decision making processes
For the last three years I’ve directed the What Works Centre for Local Economic Growth, which aims to analyse which policies are most effective in supporting and increasing local economic growth. We do this by reviewing what the existing evidence base reveals about the impact of different policies, and by supporting policymakers to produce more and better evidence to help guide future decisions about where to spend scarce resources. While we await a decision on future funding, we’ve been taking stock of where we have got to and thinking about what’s next.
From the very early stages, one of the biggest challenges we faced was how to systematically review the huge amount of evidence that claims to evaluate local economic growth policies. We decided that our initial focus should be on summarising the available impact evaluations – that is, studies which look at the changes for those targeted by a policy and compare them to similar individuals, firms or areas that haven’t been targeted.
To date, we’ve done 11 reviews of this evidence, covering a wide range of policy areas – employment training and apprenticeships; broadband and transport; access to finance, business and innovation support; as well as culture and sports, estate renewal, and enterprise zones.
The first thing to note is that impact evaluations make up a very small proportion of the studies that show up in our initial literature searches. For Access to Finance we found 27 out of 1,450 studies that met our criteria; for Business Advice, 23 out of around 700. This ratio was highest for Employment Training (71 out of around 1,000) and lowest for Transport (29 out of 2,300). For all that some people worry about too much emphasis being placed on impact evaluations, these ratios highlight a big challenge. We don’t do enough high quality impact evaluation, meaning that decision-makers are not well served in terms of the available evidence when developing policy.
Turning to the findings, the first substantive message to emerge from our reviews is that success rates vary across different policies. For example, active labour market programmes and apprenticeships tend to be pretty effective at raising employment (and at cutting time spent unemployed). In contrast, firm-focused interventions (business advice or access to finance measures) don’t tend to work so well in raising workforce jobs.
Second, the evidence shows that different policies affect different outcomes. For example, firm-focused policies tend to be better at raising firm sales and profits than at raising employment. That might feed through to more money flowing in to the local economy, but if employment is the priority, resources might be better spent elsewhere.
Third, some interventions we looked at simply aren’t good at improving local economic growth. Of course, some of these policies do deliver on other important policy objectives – for example, improved housing and local environments in the case of estate renewal. We make similar points about sports and cultural events and some aspects of broadband provision. These findings have often proved to be among our most controversial.
Partly this is a matter of misunderstanding. I’m convinced that these wider outcomes matter and it would be good if these policies delivered big local economic growth gains in addition to meeting other social objectives. Unfortunately, the evidence suggests we need to be very cautious about making such bold claims. Our reviews highlight the importance of being clear about policy objectives and which interventions are more likely to achieve them.
Our reviews also show that the impact of policies can be heterogeneous. Improving broadband access is a good example: SMEs tend to benefit more than larger firms, as do firms with a lot of skilled workers, and people and firms in urban areas. That gives us a clear steer about where economic development budgets need to be focused.
Finally, our reviews help to identify which aspects of policy design can improve effectiveness. For example, our Employment Training review suggests that involving firms and focusing on shorter courses can improve effectiveness. For Business Advice we find that tailored support may be more effective, although it is also more costly. The reviews themselves provide plenty more examples although we still have a long way to go on understanding differences in cost-effectiveness for different aspects of programme design.
A stronger focus on questions of programme design underpins our toolkits which seek to expand the range of evidence we consider when looking at the effectiveness of different types of approaches. If our funding is renewed, we’d expect this toolkit work to be a big part of our activities over the next couple of years.
So at the end of all of this, can I actually tell you what works? Of course not. We’re not here to tell local authorities and central government the ten things that they have to do to drive local economic growth. Good policymaking doesn’t work like that (nor does bad policymaking, for that matter). Our role is to try to summarise what the evidence can tell us to help inform decision-making, but local conditions, available resources and political realities will all determine how that evidence is translated into policy.
For the vast majority of decision-makers who want information, rather than instruction, our reviews and toolkits provide a way of better understanding the likely impact of policy and of thinking about improved effectiveness. Now we want to continue to work with decision-makers to help them think about how to embed the use of evidence into their policy processes. Summarising and synthesising the available evidence is only the first step to ensuring that evidence can indeed help improve policy effectiveness.