Search
3447 results
Rigour
Rigour involves using systematic, transparent processes to produce valid findings and conclusions. There are significant differences in what this is understood to mean in evaluation. (Method)

AEA guiding principles for evaluators
This webpage from the American Evaluation Association (AEA) outlines the guiding principles evaluators should follow to promote ethical practice in evaluation. (Resource)

Outcome harvesting
Outcome Harvesting collects ("harvests") evidence of what has changed ("outcomes") and, working backwards, determines whether and how an intervention has contributed to these changes. (Approach)

52 weeks of BetterEvaluation: Week 16: Identifying and documenting emergent outcomes of a global network
Global voluntary networks are complex beasts with dynamic and unpredictable actions and interactions. How can we evaluate the results of a network like this? Whose results are we even talking about? (Blog)

Cases in outcome harvesting
This report from The World Bank documents the pilot of a program examining the use of outcome harvesting alongside the Bank's results-management approach to understand how change happens in complex environments. (Resource)

Retrospective 'outcome harvesting': Generating robust insights
This paper describes the use of the Outcome Harvesting approach to evaluate a global voluntary network. (Resource)

Discussion note: Complexity aware monitoring
USAID's Office of Learning, Evaluation and Research (LER) produced this discussion note for those seeking cutting-edge solutions for monitoring complex aspects of strategies and projects. (Resource)

Cosecha de alcances
Cosecha de Alcances (Outcome Harvesting) is a utilisation-focused and highly participatory tool that enables evaluators, donors, and project and programme managers to identify, formulate, verify, and make sense of the outcomes they have… (Resource)

Outcome harvesting
This 27-page brief, written by Ricardo Wilson-Grau and Heather Britt, introduces the key concepts and approach used in Outcome Harvesting (published by the Ford Foundation in May 2012; revised in November 2013). (Resource)

Contribution analysis: An approach to exploring cause and effect
This brief from the Institutional Learning and Change Initiative (ILAC) explores contribution analysis and how it can be used to provide credible assessments of cause and effect. (Resource)

Realistic evaluation bloodlines
This article by Ray Pawson and Nick Tilley analyses six social science inquiries from around the globe that use a variety of methods and strategies to draw conclusions about realistic evaluation. (Resource)

How to manage an evaluation and disseminate its results
This guide from the United Nations World Food Programme (WFP) outlines the roles and responsibilities of evaluation managers during and after an evaluation. (Resource)

Validation workshop
A validation workshop is a meeting that brings together evaluators and key stakeholders to review an evaluation's findings. (Method)

Ethical guidelines
Ethical guidelines are designed to guide ethical behaviour and decision-making throughout evaluation practice. (Method)

Human rights and gender equality
Human rights and gender equality refer to the extent to which an evaluation adequately addresses human rights and gender in its design, conduct, and reporting. (Method)

Strengthening national evaluation capacities
Strengthening national evaluation capacities refers to the ways in which an evaluation can have broader value beyond a single evaluation report by increasing national capacities. (Method)

Validity
Validity refers to the extent to which evaluation findings are correct. (Method)

Respect for people
Respect for people during an evaluation requires those engaged in an evaluation to respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders. (Method)

Budgeting for developmental evaluation (DE)
An interview with internationally recognised evaluation expert Michael Quinn Patton, conducted by Heather Britt for BetterEvaluation, April 2012. (Resource)

Evaluation ethics, politics, standards, and guiding principles
This module is taken from the International Program for Development Evaluation Training (IPDET) program. (Resource)

Causal mapping
Causal mapping helps make sense of the causal claims (about "what causes what") that people make in interviews, conversations, and documents. (Method)

Clearing the fog: New tools for improving the credibility of impact claims
This IIED briefing paper shows how process tracing and Bayesian updating can facilitate a dialogue between theory and evidence that allows the degree of confidence in 'contribution claims' to be assessed in a transparent… (Resource)

Guidelines to avoid conflict of interest in independent evaluations
These guidelines from the Asian Development Bank (ADB) set out the procedures to be followed when assessing the independence of evaluation and audit functions to ensure that ethical requirements are met. (Resource)

The African evaluation guidelines: 2002
This paper from the African Evaluation Association (AfrEA) provides a brief description of the guidelines and a series of checklists to assist with planning, implementing, and completing the evaluation process. (Resource)

Analyse data
Decide how to analyse the data that have been collected or retrieved in order to answer the Key Evaluation Questions. (Framework/Guide; Rainbow Framework)

Asset mapping
Asset mapping is a process of identifying existing assets within a community, organisation, or network. It complements the "deficit focus" of needs analysis. (Method)

Needs analysis
A needs analysis identifies the current needs of an individual, organisation, or community. A classic 1972 paper by Bradshaw identified four different types of need… (Method)

Process tracing
Process tracing is a case-based, theory-driven method for causal inference that applies specific types of tests to assess the strength of evidence for concluding that an intervention has contributed to changes that have been observed or… (Method)

Power analysis
A power analysis identifies the main types of power in a system of interest. (Method)

Upfront evaluation design
An upfront evaluation design is done before or near the beginning of the evaluation, then implemented as designed or as revised at the end of the inception period. (Method)

Commissioner-led evaluation design
The organisation commissioning an evaluation develops the evaluation design as part of setting out the terms of reference for the evaluation. (Method)

Joint evaluation design
The evaluation is designed collaboratively, which might involve an implementing agency, an evaluation team, and/or a community working together. (Method)