Here we offer a selection of practical guides to evaluation.
Evaluating public engagement activities
This is a practical, ‘hands-on’ guide to help you draw out the information you need for your evaluation. Produced as part of the Beacons for Public Engagement Initiative, it is accompanied by podcasts that offer an introduction on how to use the support pack, plus stories and top tips from people who have used the guide.
Produced by People Science & Policy Ltd for the Research Councils UK, BIS and the NCCPE (2011), this evaluation guide is aimed at anyone who wants to engage with the public. The first section provides an overview of evaluation - what it's about, what it can do, what it can't do, how you should report your findings and how it can help strengthen funding applications - and looks at building an evaluation strategy. A range of evaluation methods are examined in the second section, with guidance on data analysis and writing evaluation reports.
Produced for DIUS and the ESRC, this revised version of the Tavistock Institute's Evaluation Framework is not a comprehensive manual on how to conduct evaluations (see the other technical resources listed in this section for this). Rather, it provides an overview of evaluation for Science and Society initiatives along with a set of procedures, guidelines, methods and examples to improve evaluation design. Its primary relevance is to those designing and implementing evaluations of Science and Society activities, from 'public awareness-raising' actions to educational initiatives aimed at encouraging young people to study science and related subjects in school. It is also aimed at policy-makers and programme managers to help them specify the kinds of evaluation approaches that need to be built into programmes.
Evaluation consultant Ben Gammon stresses that clarity is the key to effective evaluation, both in terms of planning and reporting. The pros and cons of different evaluation methods for museum exhibits - observation, interview/questionnaire and automatically logged data - are presented, along with the key features of a well-written evaluation report.
Evaluating community engagement and participation
This guide is intended to help those involved in running or commissioning engagement activities to understand the different factors involved in creating effective public participation. It helps planners set and measure attainable objectives, evaluate impact, and identify lessons for future practice. Using clear language, simple instructions, illustrative case studies and a glossary, this guide is a tool for anyone involved in public participation in central government and beyond.
Produced for the Joseph Rowntree Foundation (2000), this audit toolbox looks at ways of assessing levels of community involvement in area regeneration initiatives. There are tools and appraisal exercises for measuring: the history and patterns of participation; the quality of participation strategies; the capacity within partner organisations to support community participation; the impacts of participation and its outcomes; and the capacity within communities to participate.
Evaluating arts projects
The Arts Council's (2004, revised) Partnerships for Learning aims to help those involved in arts education projects to understand evaluation clearly and to evaluate effectively. In the long term, it aims to raise the standard of arts education projects. This guide provides a flexible framework, applicable to many different situations and useful for evaluating short or more extended projects. It will benefit those experienced in evaluation techniques as well as beginners, who may find the checklists and case studies useful.
Providing the best - Guidance for artists and arts organisations on assessing the quality of activities provided for children and young people
This Arts Council England (2006) guidance document provides a self-assessment framework for those running arts activities with children and young people. The defining characteristics of high quality arts opportunities are described with examples, and prompt questions to help you reflect on your own experiences. The document helps to define good practice and provide advice on developing and delivering arts experiences. The guidance could also be used for project planning and evaluation; analysing support, training and professional development needs, and helping to select artists for participatory work. Section 8 deals specifically with monitoring and evaluation.
Written in 2004, this toolkit aims to help voluntary and community arts organisations in Northern Ireland to carry out their own evaluations, especially of the social impact of their work on participants. Many arts organisations in Northern Ireland already have excellent evaluation systems, and this toolkit aims to increase the consistency of evaluation work so that individual arts organisations can better understand and explain the impact of their work. The toolkit is the first stage in a larger process that will a) extend evaluation to all sectors of the arts, and b) evaluate the impact of arts organisations' work on audiences as well as participants.
Evaluating work with children and young people
Participation Works have published a set of three evaluation guides that focus on work with children and young people. Evaluation in a Nutshell is a short introductory guide that breaks down the evaluation process into simple steps. More detail is given in The Guide. The Toolkit presents a compilation of sample forms and activities for use with children of all ages that you can adapt or copy to collect information during your evaluation. In addition to consent, feedback and monitoring forms, The Toolkit includes the National Youth Agency's Hear by Right Self Assessment Tool, to map and plan young people's participation.
Published by UNICEF (2005), this comprehensive toolkit includes: classical evaluation tools; tools for participatory research and for participant evaluations of workshops; ice-breakers and team-builders; and a note on body language. Each tool is presented in a user-friendly format to enable the practitioner to put it to use, with information given on: its purpose; instructions on how to use it; the time required; materials necessary; sample questions that it might be used to answer; its advantages and disadvantages; tips; and a real-life example. The toolkit also includes several tools for participants to evaluate the workshops in which they are participating.
Measuring the Magic: Evaluating and researching young people's participation in public decision making
Published by the Carnegie Young People Initiative, a research project that aims to improve the quality and increase the breadth of young people's participation in public decision-making, this report provides an overview of the existing evidence about what works in youth participation within the public realm. It examines both the impacts of involving young people and the processes involved. The cited evaluations and research illustrate common themes in existing practice, and where gaps are identified, recommendations for the planning of future research are examined.
Evaluating social impact
Produced by nef (the New Economics Foundation) for organisations with a social enterprise mission, this site will get you started on evaluating your work through a series of simple-to-use tools. The interactive tools and How To guides will help you measure impact and demonstrate the quality of what you do.
Written for recipients of Heritage Lottery Fund grants, this in-depth guidance paper (2008) will help you develop your plans for evaluation.
The W.K. Kellogg Foundation (WKKF) works to improve the lives of children and families in the Americas and Southern Africa through projects that focus on Education & Learning; Food, Health & Well-being; and Family Economic Security. Written primarily for WKKF grantees, the Evaluation Toolkit offers useful guidance for anyone seeking to design an effective, useful evaluation. Based on WKKF's mission and evaluation philosophy, the toolkit has seven sections: Where to start; Evaluation approaches; Evaluation questions; Evaluation plan; Budgeting; Hiring & managing evaluators; and Additional resources.
Inspiring Learning: An Improvement Framework for Museums, Libraries and Archives. Measuring Outcomes
The Museums, Libraries and Archives Council's Inspiring Learning Framework aims to help its members capture and evidence their impact by identifying generic learning and social outcomes for individuals and communities. Be sure to check out the invaluable download section with action planning templates and checklists, guides to selecting research methods, questionnaires and coding tools as well as the searchable directory of case studies. Guidance is also given on how the information generated from evaluation can be used to develop new partnerships and support a funding bid.
Evaluation consultant Ben Gammon draws on his experience as Head of Visitor Research at the Science Museum to present an engaging overview of questionnaire design. Examples are used throughout to illustrate good and bad practice and the common pitfalls, both in terms of the content of questions and in the way they are asked and scored. Although written for those developing visitor questionnaires, many of the issues discussed in this 11-step guide are widely applicable.
This guide sets out some of the main steps needed to plan and manage a project using an outcomes approach. Outcomes are the results of what you do, rather than the activities or services provided. Although useful for a range of applicants, this guide is primarily for those wanting to find out more about the outcomes approach. Further information about the issues raised, examples and sources of further support are available on the Evaluation & Research section of the BIG website.
This useful guide covers some of the basics of evaluation. It is written with NHS commissioners, managers and practitioners in mind, but contains helpful guidance on deciding whether to commission an external evaluation.