Monday, November 21, 2005

how to do evidence-based policy - some pointers


I was extremely privileged to work for several years in an applied research unit which had been tasked with using evidence to influence decision making. This applied to the decision making of a varied bunch – from policy makers and their executives, to careers advisers and their clients. The idea was to collate and analyse labour market evidence and to tell people what it really meant – i.e. to improve their knowledge of the labour market, inform their practice or their actions, and help them make better informed decisions.


It was extremely challenging to start to make evidence more useful. Traditionally, economists and analysts in the area in which we were working had delivered long-winded reports with innumerable tables and complicated graphs, and had shied away from distilling the key insights and messages from the evidence. It was left to the practitioners to try to become analysts themselves – a flawed approach. Of course, there were a few practitioners with the time and energy who could do the analysis, but 99% couldn't, and it was unreasonable to expect this.

It was clear we would have to take a different approach. We did this by:

* Getting to know the market for analysis, intelligence and information – just what did people want from economic analysis, and in what format or style? They told us they didn't use voluminous reports – they sat on the shelf. We actually segmented the market by sophistication of understanding and use of intelligence. Quite quickly we identified one market we weren't interested in catering for – the experts. The experts could do their own analysis and were comfortable with long-winded reports, but they represented a very small user base for our services and mostly had limited influence on policy or delivery. There was no payoff in catering to this user base, apart from being well thought of by our peers. I think many research units in the public sector make the mistake of trying to emulate experts because they think it's the best way to go about things – but in doing so they ignore the bulk of their consumer base, which is the middle-ranking public sector official or service delivery agent.

* Getting to know how people could use intelligence to better inform their decisions – what do they need to know, and when? This involved identifying a few key constituencies and opening up lines of communication about their needs – for example, we did this with the Further Education College sector and with the Careers Services sector. This eventually led to collaborative products that were market tested with clients before public release.

* People wanted short reports! They were fed up with 100-page reports – they wanted much shorter ones; in the case of careers advisers, they wanted half-page or one-page news stories! We started to provide a hierarchy of reports. We still produced the 100-page reports and tables that the analysis was based on in the first place – we provided these but didn't make a song and dance about them. We then provided a 20-30 page easy-to-read report; then a 5-10 page executive summary; and then a few one-page tabloid articles covering the main points of analysis, findings and messages in an engaging way.

* People wanted key messages and insights directly relevant to them, but also to be assured of the robustness and quality of the research behind those messages and insights. OK, we needed to provide short and accessible reports, but the work behind them had better be robust and comprehensive. We achieved this by setting high quality standards for ourselves and our contractors. Contractors whose work was of inadequate quality were not rehired. We also took presentation training from media professionals – this meant we gave good external presentations and could stand up and be credible as well as informative. We tried to establish and maintain a quality standard for our 'brand'.

* People wanted ad hoc or regular access to key statistics that were up to date. They wanted this for their reports, their boss's presentations, or their funding applications. They also wanted all the background information about the statistics – what they meant, the source, the accuracy. Plus, we the staff had a selfish motive – who wants to be bombarded with requests for the latest unemployment rate or other figures? We had online resources developed to provide a range of key indicators in an easy-to-use internet-based toolkit.

* We wanted people to use evidence – so we had to ensure that it was relevant, that they would read it, and that they would find it beneficial.


People want information, intelligence and economic analysis for a reason or a purpose. To meet this, it must be 'fit for purpose' – and that doesn't just mean technically correct or accurate (we're taking this as a given, folks) but in a format and style which genuinely makes a decision easier to make or a service easier to deliver.

An academic gave our little applied research department a favourable review in a letter to a local newspaper – he said 'evidence based policy needs evidence' and congratulated us on sourcing and providing the evidence. I would take this further and say 'evidence based policy needs accessible, meaningful and relevant evidence'. It is no use providing 200 pages of evidence if it is irrelevant or if no-one is inclined to read it.

Some analysts may feel 'cheapened', or that they are selling themselves short, by providing their analytical insights in a one-page tabloid article. To me, this is unnecessary snobbishness. I would rather do a piece of research and have 1,000 people read it and benefit from it than just 10 fellow analysts who probably knew it all anyway.


The best Careers labour market intelligence resource I have ever seen: Oregon Labour Market Information Service -

Using evidence to inform policy in the labour market in Scotland:

