Evaluation: Facing the Tricky Questions
Mark Dyball, Director, People Science and Policy Ltd
Diane Warburton, Evaluation Manager, Sciencewise-ERC
Laura Grant, Director, Laura Grant Associates Ltd
Evaluation: facing the tricky questions
Assessing Impact on Policy
Mark Dyball, Director
People Science & Policy Ltd
Basic Questions
Can science communication have an impact on policy-making?
What sort of science communication projects might impact on policy-making?
Is it feasible to assess the impact of a science communication project on policy-making?
Is it appropriate to devote resources to finding out whether there has been an impact on policy?
Policy-makers…
are not just politicians and civil servants
include managers in universities, charities and businesses
are people who have the power to make decisions and set policies within their organisations
What influences policy-makers?
Evidence
Dogma/belief
Can science communication have an impact on policy-making?
Yes, but:
science communication is a broad term;
different projects have different goals; and
different projects may influence different policy-makers.
Was there an intention to influence policy-makers?
If so, which ones?
What sort of projects might impact on policy-making?
Research-based dialogue:
understanding stakeholder attitudes and values
bringing stakeholders and policy-makers together
e.g. HSE public dialogue on train protection
Often commissioned by the relevant policy-maker
What sort of projects might impact on policy-making?
Communication of science:
human cloning
stem cells
What sort of projects might impact on policy-making?
Communication and evaluation
Evaluation fosters understanding of:
audiences
activities that “work”
who the effective communicators are
Evaluation can be used to influence the policies of:
programme managers (funders)
project managers (communicators)
Resources?
It is worth investing time and money in evaluating impact on policy-makers if:
it was a key goal of the project, e.g. Beacons; or
there is clear evidence that influence on one or more types of policy-maker was an unintended consequence.
In many cases it will not be worth this investment.
How do you know if you had an influence?
Interview policy-makers
Follow document trails
Observe relevant events
The co-operation of the relevant policy-makers makes life easier for the evaluator
e.g. EPSRC nanotechnology in healthcare
EPSRC nanotechnology in healthcare
Large-scale public dialogue (BMRB)
Ongoing evaluation of:
the quality of the project: observation, questionnaires, interviews
the impact of the project: documentation, interviews
A Final Thought
Which policy-makers would I most like to see influenced?
Those who have commissioned the project/evaluation.
If you have paid for intelligence, then use it.
This includes you!
Evaluation: facing the tricky questions
Assessing value for money
Diane Warburton, Evaluation Manager
Sciencewise-ERC
www.sciencewise-erc.org.uk
diane@sharedpractice.org.uk
Question:
When is a public engagement project good value for money?
Why we have assessed value
Sciencewise has developed ways to measure and demonstrate the value of public dialogue because of:
•  reduced budgets and increased scrutiny
•  the need to maintain support as well as funding
•  the usefulness of reviewing the costs of design decisions
This has become a priority for Sciencewise project evaluations over the last couple of years.
Practical and other problems
•  "Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted" Albert Einstein
•  A cynic is "a man who knows the price of everything and the value of nothing" Oscar Wilde
•  Measuring can be expensive in time as well as money
•  It is very hard to get detailed, accurate cost data
We decided that measuring value needs to be relevant and proportionate. And it is not always appropriate.
Looked at all sorts of models
•  Classic VFM: economy, efficiency, effectiveness
•  SROI: the Cabinet Office approach to social value
•  Classic cost-benefit analysis
We found they were not appropriate because, as models, they:
•  were too complex
•  were too detailed
•  made too many assumptions
•  focused too heavily on ‘monetising value’
We learnt from these but took a different approach.
Assessing impacts
Rather than looking at benefits, we focused our evaluations on assessing four types of impacts:
•  impacts on policy and policy-making
•  impacts on policy-makers
•  impacts on public participants
•  impacts on scientists and others involved
This approach allowed us to identify short-term as well as long-term impacts, on people and policy.
Six questions on costs
1. What was the basic budget? e.g. Nanodialogues £240,000; Drugsfutures £300,000
2. What were public participants' and stakeholders' perceptions of whether it was 'money well spent'? e.g. Drugsfutures public participant: "yes, if our views are listened to"
3. Could costs have been reduced without losing quality? e.g. the geoengineering dialogue: the public access events, and a very detailed evaluation, did not add enough value to match their costs
Six questions on costs, continued
4. Could a small additional investment have achieved significant extra benefits? e.g. the synthetic biology dialogue: commissioning follow-on work to maximise the impacts of project reports; Big Energy Shift: additional public participants at a workshop with policy-makers
5. What costs could be saved later by having had good public engagement? No examples yet
Six questions on costs, continued
6. What are the costs of engagement compared to overall programme budgets? e.g.
•  Geoengineering: cost £155,000; fed into an EPSRC/NERC sandpit which alone allocated £2.5 million
•  Synthetic biology: cost £360,000; the budget for synthetic biology research in the UK in 2005-10 was £18-33 million
•  Nanodialogues: cost £240,000; the value of nano research in 2007 was estimated at $12 billion
•  Stem cell dialogue: cost £300,000; the industry is valued at more than £500 million per year
"If you think dialogue is expensive, try conflict" Andrew Acland
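As a rough illustration of question 6, the figures above can be put on a common footing as percentages. This is a sketch only: the baselines differ in kind (a sandpit allocation, a research budget, an industry valuation), and the Nanodialogues comparison is omitted because it mixes pounds and dollars.

```python
# Express each dialogue's cost as a share of the programme or sector
# budget it is compared against above. Figures are taken from the
# slide; the baselines are not like-for-like, so treat the
# percentages as indicative only.

def cost_share(dialogue_cost: float, baseline_budget: float) -> float:
    """Dialogue cost as a percentage of the comparison budget."""
    return 100 * dialogue_cost / baseline_budget

# Geoengineering dialogue vs the EPSRC/NERC sandpit allocation
geo = cost_share(155_000, 2_500_000)

# Synthetic biology dialogue vs the lower UK research-budget estimate
synbio = cost_share(360_000, 18_000_000)

# Stem cell dialogue vs the industry's estimated annual value
stem = cost_share(300_000, 500_000_000)

print(f"Geoengineering: {geo:.1f}%")        # 6.2%
print(f"Synthetic biology: {synbio:.1f}%")  # 2.0%
print(f"Stem cell: {stem:.2f}%")            # 0.06%
```

Even on the most conservative baseline, each dialogue cost a small fraction of the budget it informed, which is the point the slide is making.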
Assessing the value of engagement
•  Costs are only part of the story, but an important part, and one usually missed out
•  The overall balance of costs and benefits almost always depends on understanding longer-term impacts, especially on policy
•  Different audiences perceive value differently. We have found:
•  policy-makers value robust evidence and advice from the public, raising public awareness, or testing ideas to manage risk
•  public participants value being listened to and having influence, or just the fun of it
•  experts and scientists value taking their work to new audiences, and learning new communication skills
Final thoughts
We have found:
•  Evidence of value is vital. Numbers are always powerful, but hard to pin down
•  Different audiences want different evidence, and so need different messages
•  Collaboration and sharing experience help development, but…
•  can it continue in the current climate?
Many thanks
Diane Warburton, Evaluation Manager
Sciencewise-ERC
www.sciencewise-erc.org.uk
diane@sharedpractice.org.uk
Facing the tricky questions:
EVALUATING LONG TERM IMPACTS
Dr Laura Grant
laura@lauragrantassociates.co.uk
The problem(s)
It’s hard to track people who have been involved in activities
Impacts might diminish over time, making them harder to measure
Other factors have more time to intervene, making attribution (more) difficult
Who will do/fund this work?
Case study: Ingenious long-term tracking study
Ingenious is the Royal Academy of Engineering public engagement grant scheme, now in its 5th round
Evaluation approach:
used Theory of Change to think about what success looked like (including long term)
supported grant holders in self-evaluation of short-term outcomes
a long-term tracking study to explore impacts after two years
Desired impacts (long-term)
Changing practice – engineering
PE is embedded in engineering practice and policy-making
To create a community of engineers who are able to communicate their work to many groups and understand how their work impacts on society
For it to become standard practice to engage the public at an early enough stage to influence the development of technology
For more engineers to be willing and able to work with the media
Changing practice – public engagement
A proportionate amount of engineering in the PE landscape (and for it to be labelled as such)
For the ‘development’ funding model to become established
To promote meaningful longer-term evaluation of programmes, such as the tracking study
The study
Initial e-survey:
Round 1: n=21, 22% response rate
Round 2: n=40, 63% response rate
Potential to follow up with interviews
May combine data from Rounds 1 and 2
Need to take into consideration the way the grant scheme has developed
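The reported response rates can be inverted to estimate how many people were contacted in each round. A small sketch (the rates on the slide are rounded, so the invited counts are only approximate):

```python
# Back out the approximate number of people contacted in each survey
# round from the reported responses (n) and response rate. The slide's
# rates are rounded, so these are estimates, not exact counts.

def invited_estimate(responses: int, response_rate: float) -> int:
    """Estimate the number of people contacted, given n and the rate."""
    return round(responses / response_rate)

round1_invited = invited_estimate(21, 0.22)  # roughly 95 contacted
round2_invited = invited_estimate(40, 0.63)  # roughly 63 contacted
```

On these estimates, Round 2 achieved both more responses and a far higher rate from a smaller contact list.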
What worked well
Thinking through ‘what success looked like’ for the programme alongside the evaluation approach, from the start
An overall approach in which the evaluator and grant holders meet at workshops, where the tracking study can be discussed
People were interested enough to respond!
Challenges
Collecting contact details
Difficulty in benchmarking long-term impacts
Very difficult to know whether respondents could have achieved the impacts another way
Attribution or contribution?
Final thoughts
Lay the foundations for long-term work by defining success and evaluating short-term outcomes.
Long-term evaluation can make a difference to programmes, though obviously not overnight.
Think about who the impacts will be on and whether it’s possible for your evaluation to measure them.
As a community, we have research questions around long-term impact that evaluations of individual projects/programmes are not best placed to answer. Who faces these questions?
Discussion
In groups…
Say hello and introduce yourselves; nominate four roles:
Timekeeper
Reporter

SCC2011 - Evaluation: Facing the tricky questions

Facilitator
Was there an impact on policy?
Was the activity good value for money?
What are the long-term impacts?
Points for discussion:
Have you come across these tricky questions in your own work? Where?
How did you deal with them? What did you learn?
What refined or new tricky question would you like to see future evaluations address?
Note your comments on the sheets provided. Please be clear, as we will be writing these up after the session.
The reporter from each group will be asked to very briefly feed back one ‘tricky question’ at the end of the session.
Next steps
Please hand the notes from your table in at the front of the room.
We will write up the notes and questions and share them on the British Science Association website.

Editor's Notes

  • #28 What long-term outcomes can we meaningfully measure, and for whom?
    How do we link these outcomes to public engagement work?
    What types of programmes are appropriate to evaluate in the long term?
    What other evaluation work is necessary to lay the foundations for a successful long-term study?
    Can the findings of such work usefully influence public engagement practice?
  • #31 Important to note: these long-term outcomes are about the PE and engineering communities, NOT the public, who are covered in the shorter-term outcomes, who are much more problematic to find, and whose involvement is less deep.
    Having been involved in project delivery, we felt that there would likely be more pronounced impacts on engineers/PE people that we could capture through the evaluation.
    Also, these goals probably overlap with the goals of many other programmes and schemes.
    What we did have was anecdotal evidence of collaborations and activities that had developed following other grant schemes, so we had an idea where to start looking, but were very open to unexpected outcomes, or no outcomes at all.
    Include comment from Lesley’s colleague: “What do you want to evaluate long-term impact for? There might not be any.”
  • #32 Will mention findings verbally as they are not yet published…
    Evidence that the PE community were more likely to drive the work following the funded period (but they were usually the ‘delivery’ partners)
    Many had maintained links
    The strongest outcome for engineers was skills; it is unclear whether they would have gained these anyway, but the impact could be around removing/reducing barriers to engineers gaining PE skills and experience
    Some evidence, in a small proportion of cases, that engineering had been considered in greater depth, and of longer-term influence on the PE community
    “Engineering was practically invisible in our programme prior to the Award.”
    Also picked up many people who were doing this work anyway, so Ingenious helped them continue their work rather than change it. Is this success?
  • #33 i.e. we discussed long-term impacts, and who these would be on, in the ToC work; there we identified that the desired long-term changes were for the PE and engineering communities, and we then felt it would be possible to evaluate these in the longer term.
    NOT “we can’t reach the people that we really want the long-term impacts to be on… so let’s just evaluate what we can with who we can get.”
  • #34 We hope to address the questions about attribution in the qualitative work