The BetterEvaluation Resource Library contains hundreds of curated and co-created resources related to managing, conducting, using, and strengthening capacity for evaluation.
You can use the search field and filtering on this page to find resources that you are interested in or you can browse our extensive list. An alternative way to find resources best suited to your needs is to explore the Rainbow Framework, where you can find resources relating to evaluation methods, approaches and tasks.
- 391 results found
Innovations in monitoring & evaluating results
This discussion paper, written by Thomas Winderl and edited by Jennifer Colville for the United Nations Development Programme (UNDP), provides a detailed inventory of innovative practices used for monitoring and evaluation in a development context.
Feminist evaluation and gender approaches: There’s a difference?
The purpose of this article is to provide readers with a historical overview and description of feminist evaluation and gender approaches.
Democracy, governance, and randomised media assistance
This briefing was commissioned by BBC Media Action to investigate the potential to use experimental and quasi-experimental designs to create a counterfactual, which is one option for investigating causal attribution and contribution.
Theory of change software
There are a number of options when it comes to using software to help create a logic model.
Logic model development workshop
Susan Cottrell’s Logic Model Development Workshop, sponsored by American University’s Measurement and Evaluation Program, is aimed at audiences who are new to developing logic models, as well as those who need a refresher.
Conducting evaluations virtually
This webinar, hosted by American University faculty Beverly Peters, Ph.D., and Kavita Mittapalli, Ph.D., discusses the opportunities and challenges of conducting virtual evaluations.
Evaluation report layout checklist
This checklist from Stephanie Evergreen distills best practices in graphic design and was created specifically for use on evaluation reports.
Reporting style guide template
This style guide template is designed to ensure consistency in formatting across various project documents, including evaluation plans, reports, and presentations.
Credentialed evaluator competencies template
This competency template is designed to support individuals pursuing the Credentialed Evaluator designation through the Canadian Evaluation Society.
Information request checklist
This resource is a checklist designed to guide evaluators when starting a new project, ensuring they gather essential information to support their evaluation efforts.
Evaluation status update template
This resource is a status template designed to keep stakeholders informed about the progress of an evaluation project.
Lived and perceived space during lock-down in a sensitive map approach
This paper describes the use of sensitive mapping to explore individuals' experiences during the COVID-19 lockdown in France.
Navigating competing demands in monitoring and evaluation: Five key paradoxes
In this article, Marijn Faling, Sietze Vellema, and Greetje Schouten report on five paradoxes in monitoring and evaluation, each encompassing two competing logics. This resource was contributed by Marijn Faling.
Feedback workshop checklist
This checklist from the Evaluation Checklists Project supports the planning, conduct, and follow-up of feedback workshops when they are used as evaluation tools.
Where we go from here: The mental sketch mapping method and its analytic components
This paper by Jack Jen Gieseking discusses the mental mapping method, which helps us understand how people think about and interact with space.
Using SenseMaker in child-centred research
This paper is one of two documents submitted by Becca Smith related to the use of the SenseMaker approach to evaluate attitudes towards girls’ education in Ethiopia.
Learning to Make All Voices Count - Leveraging Complexity-Aware MEL to Pursue Change in Complex Systems
The emergence of government evaluation systems in Africa: The case of Benin, Uganda and South Africa
This article documents the experiences of three countries – South Africa, Benin and Uganda – in deepening and widening their national evaluation systems, and some of the cross-cutting lessons that can be drawn from them.
Using M&E to improve government performance and accountability: A glance of 6 countries’ NES
This report on lessons learned from the Twende Mbele program compares the experiences of six different countries – Benin, Ghana, Kenya, Niger, South Africa and Uganda – and discusses issues relating to leadership and linkages, and capacity.
Outcome monitoring and learning in large multi-stakeholder research programmes: lessons from the PRISE consortium
This discussion paper outlines the key lessons to emerge from designing and applying an outcome monitoring system to the Pathways to Resilience in Semi-arid Economies (PRISE) project.
Ushahidi
Ushahidi is an open-source mapping and crowdsourcing tool that can be used by organizations to collect, manage and analyse crowdsourced information.
Impact assessment of financial market development through the lens of complexity theory
This example of complexity theory from FSD Kenya focuses on evaluating the impact of two financial programs implemented by the organisation, one of which was the development of a credit information-sharing system.
Conversations about measurement and evaluation in impact investing
This article documents issues emerging during discussions of impact investing and social impact measurement amongst participants of the Innovations in Evaluation strand at the 8th African Evaluation Association Conference.
A stakeholder view of the development of national evaluation systems in Africa
This journal article compares developments in National Evaluation Systems in Ghana, Kenya, Rwanda, South Africa, Uganda and Zambia.
Handbook on poverty and inequality
This book from the World Bank provides a range of tools which allow the user to measure, describe, monitor, evaluate, and analyze poverty.
What counts as good evidence?
This paper, written by Sandra Nutley, Alison Powell and Huw Davies for the Alliance for Useful Evidence, discusses the risks of using a hierarchy of evidence and suggests an alternative based on more complex matrix approaches.
Diagnostic tool for a monitoring and evaluation systems analysis (MESA) - Guidance Note
The MESA is a diagnostic tool created by the Global Evaluation Initiative that guides country stakeholders in gathering, structuring and analyzing information on the current capacity of their country's M&E ecosystem.
Using the Rainbow Framework (Portuguese)
This compact version of the Rainbow Framework asks you to think through a series of key questions.
Using the Rainbow Framework (French)
This compact version of the Rainbow Framework invites you to think through a series of key questions. It is important to consider all of these questions, including reporting, from the start of an evaluation.
Using the Rainbow Framework (Spanish)
This compact version of the Rainbow Framework asks you to think through a series of key questions.
Using the Rainbow Framework for better evaluation (Arabic)
The Rainbow Framework for better evaluation prompts you to think through a series of key and important questions. It is important to consider all of these topics from the start of the evaluation, including the reports that will be produced.
Using the BetterEvaluation Rainbow Framework (German)
The BetterEvaluation Rainbow Framework prompts you to think through a series of key questions.