The BetterEvaluation Resource Library contains hundreds of curated and co-created resources related to managing, conducting, using, and strengthening capacity for evaluation.
You can use the search field and filtering on this page to find resources that you are interested in or you can browse our extensive list. An alternative way to find resources best suited to your needs is to explore the Rainbow Framework, where you can find resources relating to evaluation methods, approaches and tasks.
What counts as good evidence?
This paper, written by Sandra Nutley, Alison Powell and Huw Davies for the Alliance for Useful Evidence, discusses the risks of using a hierarchy of evidence and suggests more complex matrix approaches as an alternative.

Identifying evaluative criteria
This chapter details the use of a needs assessment to identify evaluative criteria.

Develop a communications plan
This web page from the Evaluation Toolkit website outlines a six-question approach that can be used to develop a communications plan for the dissemination of evaluation results.

Value for investment: A practical evaluation theory
This booklet by Julian King, a New Zealand public policy consultant, proposes a model for evaluating value for investment in social programs by integrating economic and evaluative thinking.

Introduction to qualitative research methodology
This manual, written by Karina Kielmann, Fabian Cataldo and Janet Seeley, aims to give readers from a non-scientific background an introduction to key theoretical concepts and methodologies in qualitative research.

Identifying the intended user(s) and use(s) of an evaluation
This guideline from the International Development Research Centre (IDRC) highlights the importance of identifying the primary intended user(s) and the intended use(s) of an evaluation.

A short primer on innovative evaluation reporting
This book by Kylie Hutchinson presents a number of innovative ways of reporting, including different methods for presentations, narrative summaries, presenting findings visually and making use of digital outputs.

Participant produced video: Giving participants camcorders as a social research method
This toolkit from Real Life Methods provides a guide to using participant-produced video to allow participants to record their everyday lives and reflect on the things that matter to them.

New trends in development evaluation
This working paper from UNICEF reviews the latest trends in evaluation with the aim of developing a new strategy for improving the evaluation function.

Mapping change: Using a theory of change to guide planning and evaluation
This guide, written by Anne MacKinnon and Natasha Arnott for GrantCraft, describes the process of developing a theory of change to support planning and evaluation.

Making evaluations matter: A practical guide for evaluators
This guide, written by Cecile Kusters with Simone van Vugt, Seerp Wigboldus, Bob Williams and Jim Woodhill for the Centre for Development Innovation, presents a framework for planning and managing an evaluation.

Recommendations in evaluation
This presentation, given by Lori Wingate of the Evaluation Center, Western Michigan University, looks at the 'why, what and how' of making recommendations in an evaluation.

Machine learning in evaluative synthesis: Lessons from private sector evaluation in the World Bank Group
An exploration of the potential to use machine learning techniques to enhance the efficiency of analyzing, classifying, and synthesizing extensive amounts of text in evaluation research.

Advanced content analysis: Can artificial intelligence accelerate theory-driven complex program evaluation?
This paper presents the methodology and results of an assessment of the applicability and utility of artificial intelligence for advanced theory-based content analysis.

Leveraging imagery data in evaluations: Applications of remote-sensing and streetscape imagery analysis
This paper discusses the use of imagery data in evaluations and the advantages and limitations of the relevant methodologies.

Qualitative research & evaluation methods: Integrating theory and practice
The fourth edition of Michael Quinn Patton's Qualitative Research & Evaluation Methods: Integrating Theory and Practice, published by Sage Publications, provides clear guidance and advice for using qualitative methods.

How to manage an evaluation and disseminate its results
This guide from the United Nations World Food Programme (WFP) outlines the roles and responsibilities of evaluation managers during an evaluation and after it has taken place.

Using qualitative comparative analysis to explore causal links for scaling up investments in renewable energy
This paper illustrates how qualitative comparative analysis (QCA) was used to identify causal pathways for scaling renewable energy to meet sustainable development and climate goals.

Process tracing and contribution analysis: A combined approach to generative causal inference for impact evaluation
This article, written by Barbara Befani and John Mayne for the IDS Bulletin (Volume 45, Number 6), outlines how the combined use of contribution analysis (CA) with process tracing (PT) can shift the focus of impact evaluation.

Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis
This book, by Schneider and Wagemann, provides a comprehensive overview of the basic principles of set theory for modelling causality and the applications of Qualitative Comparative Analysis (QCA), the most developed form of set-theoretic method.

The art and craft of bricolage in evaluation
This CDI Practice Paper, by Tom Aston and Marina Apgar, makes the case for 'bricolage' in complexity-aware and qualitative evaluation methods.

From narrative text to causal maps: QuIP analysis and visualisation
This paper focuses on analysing raw data to produce useful visual summaries, describing in detail the processes involved in a QuIP analysis.

Rethinking rigour to embrace complexity in peacebuilding evaluation
This 2024 open-access journal article presents the inclusive rigour framework and applies it to three cases of peacebuilding evaluation.

The book of why: The new science of cause and effect - Book review
This review of The Book of Why: The New Science of Cause and Effect examines its case for reclaiming causality from the perspective of an influential statistician and thinker.

Joint Committee on Standards for Educational Evaluation (JCSEE) program evaluation standards
This resource from the Joint Committee on Standards for Educational Evaluation provides statements describing each of the standards developed to support program evaluation.

Broadening the range of designs and methods for impact evaluations
This working paper, written by Elliot Stern, Nicoletta Stame, John Mayne, Kim Forss, Rick Davies and Barbara Befani for the UK Department for International Development (DFID), describes how theory-based, case-based and participatory designs can broaden the range of approaches used in impact evaluation.

Making rigorous causal claims in a real-life context: Has research contributed to sustainable forest management?
This article discusses an impact evaluation that examined the contribution of two forestry research centres: the Centre for International Forestry Research (CIFOR) and the Centre de Coopération Internationale en Recherche Agronomique pour le Développement (CIRAD).

Lost causal: Debunking myths about causal analysis in philanthropy
This 2021 paper, updated in 2024, advocates for more causal analysis in philanthropic evaluation: not just describing actions taken and changes observed, but also learning how and why change occurred.

A guide to assessing needs
This book, written by Ryan Watkins, Maurya West Meiers and Yusra Laila Visser for The World Bank, provides detailed guidance on needs assessment during the early stages of project development.

Budgeting for developmental evaluation (DE)
An interview with internationally recognised evaluation expert Michael Quinn Patton, conducted by Heather Britt for BetterEvaluation, April 2012.

QuIP and the Yin/Yang of Quant and Qual: How to navigate QuIP visualisations
This discussion paper reviews how quantitative and qualitative processes are used in the analysis and presentation of QuIP data.

Cracking causality in complex policy contexts
This blog post by James Copestake addresses the challenge of making credible causal claims and discusses experiences from developing the Qualitative Impact Assessment Protocol (QuIP).