
How to evaluate public engagement projects and programmes

Guidance on how to evaluate your public engagement programme.


Introduction

Evaluating your work is key to reflecting on what is working well and where improvements can be made, as well as assessing the impact of your work. The key steps in any evaluation plan, whether you are evaluating a public engagement activity, a culture change project, or a long term support programme, are the same. In this guide we share some of the ways people have used evaluation to inform the delivery of excellent public engagement projects and activities.

Logic models

A really helpful planning tool is the logic model. Logic models are used by many funders, managers and evaluators of complex interventions to help them plan and evaluate their success. Using a logic model enables you to map your project, considering what you are hoping to achieve and how you plan to achieve it, and to make your assumptions about change explicit. A typical logic model will include the following features:

  • Current situation – a description of the situation you are trying to change
  • Aims – what you hope to achieve
  • Inputs – what you will contribute
  • Activities – what you are going to do to achieve the aims
  • Outputs – what you create
  • Outcomes – what happens as a result
  • Impacts – the long term effect of your work
  • Assumptions – that you are making in designing your approach
  • External factors – that could influence the outcomes of your project

A logic model can provide a useful framework to map out your project – and understand better the shape of what you are trying to do. Working through the logic model with those who will be involved in the project (e.g. team members, partner organisations) helps you have a useful discussion about your project, and highlights the assumptions you are making. It helps you make explicit how you think the activities you are planning will lead to the desired impacts.

A logic model can also be used to inform your approach to evaluation. What questions do you have about your approach? What do you want to know? For example, your questions might focus on the current situation; on the processes you are using and whether the activities are influencing the outputs and outcomes; or on whether you have actually made a difference. A logic model helps you make those important decisions about where to focus your attention.
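If it helps to keep your logic model in a form you can share and revise with partners, here is a minimal sketch of one way to capture it as structured data. This is purely illustrative – the field names simply mirror the features listed above, and the example entries are drawn loosely from the worked example later in this guide.

```python
# A minimal, purely illustrative way to hold a logic model as structured data.
# The keys mirror the features listed above; the entries are hypothetical.
logic_model = {
    "current_situation": ["Poor oral health among secondary school students"],
    "aims": ["Improve the oral health of secondary school students"],
    "inputs": ["Researcher time", "Funding for workshops and activity days"],
    "activities": ["Co-development workshops", "Activity days in schools"],
    "outputs": ["Activity toolkit", "20 events run across the region"],
    "outcomes": ["Raised awareness of the impacts of poor oral health"],
    "impacts": ["Improved oral health amongst participants"],
    "assumptions": ["Schools will want to engage with the programme"],
    "external_factors": ["Pressures on schools may shift their priorities"],
}

# Walking through the model feature by feature can structure a team discussion
# and makes it obvious where a feature is still blank.
for feature, entries in logic_model.items():
    print(feature.replace("_", " ").capitalize())
    for entry in entries:
        print("  -", entry)
```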

What are outputs, outcomes and impacts?

When planning an evaluation it is helpful to differentiate between outputs, outcomes and impacts, as these terms describe the different ways in which your work can contribute to change over time.

Outputs are usually tangible products, and are relatively easy to capture. Examples of outputs for a public engagement programme might include:

  • Online resources including websites; tweets; blogs
  • Events
  • Exhibitions
  • Publications including leaflets; articles; reports
  • Partnerships
  • Training courses
  • People – e.g. numbers and demographics of participants in the activities

Monitoring outputs is relatively straightforward. You should make sure you have routine ways to collect this data.

Outcomes and impacts

Outcomes are the immediate results of the activity, whereas impacts relate to longer term change. Outcomes are usually easier to capture because they happen quickly; impacts unfold over a longer time frame, often when you are no longer in contact with the project participants.

There is a relationship between the outputs, outcomes and impacts of a programme.

Typical outcomes for a public engagement programme might include:

  • Increased understanding of the topic
  • Enjoyment
  • Skills development
  • Attitudinal change
  • Inspiration and creativity
  • New experiences

The outcomes are the things we think need to happen in order to have longer term impact, i.e. to fulfil the aim of the programme. Remember it is important to consider all the participants in the programme including members of the public, the delivery team, and partners.

Longer term impacts can be categorised into three types:

  • Conceptual impacts: these can be thought of as changes to how people think. Examples include changes in knowledge, understanding, attitude, or awareness.
  • Capacity building impacts: these can be thought of as changes in what people do. Examples include skills development or participation.
  • Instrumental impacts: these can be thought of as changes in how things work. Examples include changes to policies, behaviour or practices.

Gathering evidence

Once you have developed your logic model you need to consider what you want to know about your programme. You may wish to focus on evaluating the results of the activity (‘summative’ evaluation), but don’t forget how useful evaluation can be when used ‘formatively’ to inform the development of your approach, or to provide ongoing reflection on what is working well, and where improvements could be made.

Initially you need to consider the overall questions your evaluation will address. These questions can inform your approach to evaluation. One of the primary audiences for this work will be you and your team, and therefore it is important to think through how evaluation will help you do your work well. It is also important to think through the evidence you may need to justify your business case.

Having an external evaluator can really help, or you may have someone in your central public engagement team who could offer assistance or advice. If you have never evaluated your work before, it is good to find someone who can help ensure your approach is relevant to what you hope to learn.

Once you know your questions, it is important to consider how you will approach gathering relevant data. Here are a few mechanisms commonly used to evaluate public engagement programmes. Remember that you should try to make the evaluation activity part of the event, rather than an ‘add on’; this helps ensure people get involved, and means you gather more data:

  • Graffiti wall – taking many different forms, a graffiti wall offers a great opportunity for participant feedback. Questions could include: What did you learn today? What did you enjoy most? What didn’t you like?
  • Quizzes – if you are doing events then integrating a quiz towards the beginning and the end can be a great way to capture baseline data, as well as learning as a result of the event. Remember to keep it fun, and don’t put people in a place where they feel foolish.
  • Questionnaires – often the default option, questionnaires can provide really useful feedback. Designing a good questionnaire can be a challenge, so it is a good idea to test out your questions before using them to evaluate your event. Think about the type of activity you are running, and if and how participants will be encouraged to fill in a questionnaire. A short questionnaire is more appealing, so ask fewer questions to encourage people to take part. Alternatively, you could recruit people to interview participants using your questionnaire, or have electronic versions available on a tablet.
  • Postcards – why not provide postcards for feedback, which people can then post into a box? The cards could have one or two questions on them, and you could leave one side blank for other comments.

Analysing Data

It is important to consider how you analyse the data you have collected to address your evaluation questions. It can be tempting to capture lots of qualitative data, without considering how you will analyse it, and the time needed to do this well.

There are two broad types of data – quantitative and qualitative. A combination of both can often address evaluation questions well. For example, whilst it helps to know that 30% of your participants thought they learnt something new from the activity, qualitative data can help you understand the texture of what they learnt.
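As a minimal sketch of how the two types of data work together – assuming your survey responses sit in a simple list, with invented field names and comments for illustration – the quantitative headline and the qualitative texture can come from the same dataset:

```python
# Hypothetical survey responses: each dict is one participant's answers.
responses = [
    {"learnt_something_new": True, "comment": "I didn't know flossing mattered so much"},
    {"learnt_something_new": False, "comment": "Fun, but I knew most of it already"},
    {"learnt_something_new": True, "comment": "The sugar demo really surprised me"},
]

# Quantitative: what share of participants reported learning something new?
learners = [r for r in responses if r["learnt_something_new"]]
share = 100 * len(learners) / len(responses)
print(f"{share:.0f}% of participants reported learning something new")

# Qualitative: the free-text comments give the texture of that learning.
for r in learners:
    print("-", r["comment"])
```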

Reporting

The final part of the evaluation work you do is to report on what you have learnt. Just like any engagement, it is important to consider the audience for the report. Is it the funder, who wants to know you delivered what you said you would; is it your team, who want to understand how to develop more effective ways to engage with the public; is it your partners, who want to know the impact of the project on their audiences, or staff? Make sure you share the data and its analysis in an ethical, and transparent way, and don’t be afraid of presenting things that haven’t gone to plan. Evaluation is an effective tool to stimulate learning, especially if you are happy to share when your approach didn’t work.

Worked example

Overarching aim: To improve the oral health of secondary school students

To achieve this aim, you might have several objectives:

  1. Run three workshops to bring together researchers and young people to share research insights into the long term impact of effective oral health and the experiences of young people
  2. Use these workshops to co-develop a programme of face to face activities to encourage young people to improve their oral health practice
  3. Pilot the activities with two schools already involved in the programme, to refine the approach
  4. Train activity leaders – both researchers and young people
  5. Roll out this programme of activities across Yorkshire

Mapping this into a logic model, you would then consider the outputs, outcomes and impacts you hope to achieve.

Potential Outputs

  • 3 events with 20 young people and 20 researchers
  • Report from event
  • Activity toolkit
  • 2 activity day pilots with 40 students
  • Training course with 15 researchers and 15 young people
  • 20 events run across Yorkshire
  • 400 participants including 20 teachers; 40% receiving free school lunches
  • 15 young people who have received training in event delivery
  • 15 researchers who have received training in event delivery

Potential Outcomes

  • Researchers have a better understanding of young people’s needs and concerns regarding oral health
  • Young people involved are more aware of the long term impacts of poor oral health
  • Young people involved in running activity days have increased confidence in running events
  • Project participants inspired to improve their oral health through brushing their teeth regularly and visiting the dentist

Potential Impacts

Short term: Researchers champion engaged approaches to their research; participants act as oral health ambassadors, sharing their knowledge and understanding with others

Long term: Improved oral health amongst participants, and their families and friends

Assumptions

  • Co-developing an approach with young people will lead to more effective engagement with a large cohort of young people
  • Showcasing the research findings, and practical ways individuals could act on them, will contribute to young people taking positive action around their oral health
  • Researchers and young people will want to engage with the project
  • Training will equip young people and researchers to effectively run the activities in schools
  • Schools will want to engage with the programme
  • Participants will share their learning with family and friends, who will change their oral health care.

External factors

  • Pressures facing schools in the target area may lead them to prioritise other things
  • Families may not want to adopt new practices, especially if they seem unnecessary or costly

Evaluation Questions 

Questions might include:

  • Has the programme improved the oral health of the participants? Has it had an influence beyond the participant group?
  • Did the co-produced activities meet the needs of the target group?
  • How have the researchers’ attitudes towards engagement changed as a result of participating in the programme?
  • To what extent has the inclusion of research findings improved participants’ approach to oral health?
  • Was the training appropriate for researchers and young people to deliver the activities in the schools?

Data collection

Once you have settled on your questions, it is worth considering how you might address them, and the approaches you might take. Here are some suggestions, pairing each evaluation question with potential data collection methods.

Has the programme improved the oral health of the participants? Has it had an influence beyond the participant group?

  • Participant survey – run before the intervention, immediately after it, and six months later to capture longer term change
  • Focus group – could be done with families of the young people involved in the programme, again before and after the intervention, to explore if and how the approach influenced the families of those involved

Did the co-produced activities meet the needs of the target group?

  • Participant survey
  • Independent observer
  • Graffiti wall at activity days for participant feedback

How have the researchers’ attitudes towards engagement changed as a result of participating in the programme?

  • Researcher questionnaire
  • Researcher focus group
  • Researcher log books

To what extent has the inclusion of research findings improved participants’ approach to oral health?

  • Interviews with key participants
  • Participant survey

Was the training appropriate for researchers and young people to deliver the activities in the schools?

  • Independent observer of activity delivery
  • Researcher focus group
  • Focus group with the young people

Data Analysis

The next stage is to analyse the data. It is worth considering how you will do this before you collect the data, so that you collect what you need to draw your conclusions. It is useful to ensure you collect some quantitative data, e.g. using scales for people to assess their enjoyment of an event, or whether they would recommend it to a friend. It is also helpful to collect qualitative data to explore why people have responded in the ways they have. For example, if 10% of participants didn’t enjoy the event, you would want to know why.

Many online survey tools can analyse the quantitative data for you, providing tables and graphs to illustrate how people have responded. You can also do the analysis using Excel spreadsheets, or specific evaluation software packages.

In addition, if you have collected lots of qualitative data you can create or use a coding framework to help you see patterns and trends in the data.
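As a minimal sketch, assuming you have already tagged each free-text comment with codes from your framework (the codes and comments below are invented for illustration), a simple tally can start to surface those patterns:

```python
from collections import Counter

# Hypothetical coded comments: each free-text response has been tagged
# with one or more codes from an (invented) coding framework.
coded_responses = [
    {"comment": "I brush twice a day now", "codes": ["behaviour_change"]},
    {"comment": "Didn't realise fizzy drinks were so bad", "codes": ["new_knowledge"]},
    {"comment": "I told my little brother about plaque", "codes": ["sharing", "new_knowledge"]},
]

# Tally how often each code appears across all responses.
tally = Counter(code for r in coded_responses for code in r["codes"])
for code, count in tally.most_common():
    print(f"{code}: {count}")
```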

Reporting

Once you have analysed the data, it is sensible to pull it together into a short report that you can share with your team. You may also need to include a summary of your evaluation and what you learnt in your report to funders.