Accuracy
Accuracy refers to the correctness of the evidence and conclusions in an evaluation. It can also imply precision. (Method)

Accessibility
Accessibility of evaluation products includes consideration of the format and access options for reports, including plain language, inclusive print design, material in multiple languages, and material in alternative formats (such as online, …). (Method)

Competence
Competence refers to ensuring that the evaluation team has, or can draw on, the skills, knowledge and experience needed to undertake the evaluation. (Method)

52 weeks of BetterEvaluation: Week 34: Generalisations from case studies?
An evaluation usually involves some level of generalising of the findings to other times, places or groups of people. (Blog)

AEA guiding principles for evaluators
This webpage from the American Evaluation Association (AEA) outlines the guiding principles evaluators should follow to promote ethical practice in evaluations. (Resource)

Outcome harvesting
Outcome Harvesting collects ("harvests") evidence of what has changed ("outcomes") and, working backwards, determines whether and how an intervention has contributed to these changes. (Approach)

52 weeks of BetterEvaluation: Week 16: Identifying and documenting emergent outcomes of a global network
Global voluntary networks are complex beasts with dynamic and unpredictable actions and interactions. How can we evaluate the results of a network like this? Whose results are we even talking about? (Blog)

Validation workshop
A validation workshop is a meeting that brings together evaluators and key stakeholders to review an evaluation's findings. (Method)

Human rights and gender equality
Human rights and gender equality refer to the extent to which an evaluation adequately addresses human rights and gender in its design, conduct, and reporting. (Method)

Strengthening national evaluation capacities
Strengthening national evaluation capacities refers to the ways in which an evaluation can have broader value beyond a single evaluation report by increasing national capacities. (Method)

Validity
Validity refers to the extent to which evaluation findings are correct. (Method)

Respect for people
Respect for people requires those engaged in an evaluation to respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders. (Method)

Evaluation ethics, politics, standards, and guiding principles
This is a module taken from the International Program for Development Evaluation Training (IPDET) program. (Resource)

Causal mapping
Causal mapping helps make sense of the causal claims (about "what causes what") that people make in interviews, conversations, and documents. (Method)

Guidelines to avoid conflict of interest in independent evaluations
These guidelines from the Asian Development Bank (ADB) set out the procedures to take into account when assessing the independence of evaluation and audit functions, to ensure ethical requirements are met. (Resource)

Analyse data
Decide how to analyse the data that have been collected or retrieved in order to answer the Key Evaluation Questions. (Frameworks/Guides: Rainbow Framework)

Asset mapping
Asset mapping is a process of identifying existing assets within a community, organisation or network. It complements the "deficit focus" of needs analysis. (Method)

Needs analysis
A needs analysis identifies the current needs of an individual, organisation, or community. A classic paper by Bradshaw (1972) identified four different types of need: … (Method)

Process tracing
Process tracing is a case-based and theory-driven method for causal inference that applies specific types of tests to assess the strength of evidence for concluding that an intervention has contributed to changes that have been observed or … (Method)

Power analysis
A power analysis identifies the main types of power in a system of interest. (Method)

Upfront evaluation design
An upfront evaluation design is done before or near the beginning of the evaluation and then implemented as designed, or as revised at the end of the inception period. (Method)

Commissioner-led evaluation design
The organisation commissioning an evaluation develops an evaluation design as part of setting out the terms of reference for the evaluation. (Method)

Joint evaluation design
The evaluation is designed collaboratively, which might involve an implementing agency, an evaluation team and/or a community working together. (Method)

Single evaluation approach design
The evaluation design is based on selecting a single existing evaluation model or approach and using it for the evaluation. (Method)

Evaluator-led evaluation design
An evaluation team develops an evaluation design in response to an evaluation brief that sets out the purposes of the evaluation. (Method)

Community-led evaluation design
A community develops an evaluation design, sometimes with facilitation or technical support. (Method)

Develop a design for the evaluation
An evaluation design sets out how data will be collected and analysed, in terms of the methods used and the research design. (Frameworks/Guides: Rainbow Framework)

What counts as good evidence?
This paper, written by Sandra Nutley, Alison Powell and Huw Davies for the Alliance for Useful Evidence, discusses the risks of using a hierarchy of evidence and suggests an alternative in which more complex matrix approaches … (Resource)

Impartiality
Impartiality in evaluation refers to conducting an evaluation without bias or favouritism, treating all aspects and stakeholders fairly. Key aspects of impartiality in evaluation can include: … (Method)

Identifying evaluative criteria
This chapter details the use of a needs assessment to identify evaluative criteria. (Resource)

Understand the situation
A situation analysis examines the current situation and the factors contributing to it. This might include identification and analysis of needs, resources, strengths, weaknesses, opportunities, threats, and/or power. (Frameworks/Guides: Rainbow Framework)

Example outcome journal template
This template is based on the original Outcome Mapping guidance, incorporating elements from Outcome Harvesting (e.g. significance) and allowing tagging to particular progress markers (rather than listing them all, as the original does). (Resource)