Search results (3447)
Iterative evaluation design (Method)
An iterative evaluation design involves setting out an initial overall evaluation design or process at the beginning of the evaluation.

Single evaluation approach design (Method)
The evaluation design is based on selecting a single existing evaluation model or approach and using it for an evaluation.

Evaluator-led evaluation design (Method)
An evaluation team develops an evaluation design in response to an evaluation brief which sets out the purposes of the evaluation.

Community-led evaluation design (Method)
A community develops an evaluation design, sometimes with facilitation or technical support.

Develop a design for the evaluation (Framework / guide; Rainbow Framework)
An evaluation design sets out how data will be collected and analysed, in terms of both the methods used and the research design.

What counts as good evidence? (Method)
This paper, written by Sandra Nutley, Alison Powell and Huw Davies for the Alliance for Useful Evidence, discusses the risks of using a hierarchy of evidence and suggests an alternative in which more complex matrix approaches for identifying…

Qualitative comparative analysis (Approach)
Qualitative Comparative Analysis (QCA) is an evaluation approach that supports causal reasoning by examining how different conditions contribute to an outcome.

Review evaluation quality (Framework / guide; Rainbow Framework)
Evaluating the quality of an evaluation can be done before it begins (by reviewing the plan) or during or after the evaluation (by reviewing the evaluation products or processes). This is sometimes called a quality review or meta-evaluation.

Common good and equity (Method)
Consideration of common good and equity involves an evaluation going beyond using only the values of evaluation stakeholders to develop an evaluative framework, to also consider common good and equity more broadly.

Impartiality (Method)
Impartiality in evaluation refers to conducting an evaluation without bias or favouritism, treating all aspects and stakeholders fairly. Key aspects of impartiality in evaluation can include:

Identifying evaluative criteria (Method)
This chapter details the use of a needs assessment to identify evaluative criteria.

20 years of outcome mapping: Evolving practices for transformative change (Method)
This paper reflects on the evolving use of Outcome Mapping 20 years after the first publication on the approach. The resource also provides a "set of guiding practices to support transformative change":

A guiding framework for needs assessment evaluations to embed digital platforms in partnership with Indigenous communities (Method)
This open-access journal article describes a needs assessment with a subarctic Métis community in Saskatchewan, Canada.

Outcome mapping + equity, gender, and social justice (Method)
This paper introduces OM+, a new approach to thinking about and using Outcome Mapping (OM) to support transformative change through a focus on inclusion and leadership for equity, gender, and social justice.

Understand the situation (Framework / guide; Rainbow Framework)
A situation analysis examines the current situation and the factors contributing to it. This might include identification and analysis of needs, resources, strengths, weaknesses, opportunities, threats, and/or a power analysis.

Power analysis: A practical guide (Method)
This guide was developed in response to a recommendation of the Swedish Government Policy on Democratic Development and Human Rights that power be analysed as part of context-specific poverty analysis.

Quick guide to power analysis (Method)
This two-page guide, developed by Oxfam in 2014 and updated in 2021, provides an overview of key concepts in power analysis and why it is useful, with links to additional resources.

Process tracing as a practical evaluation method: Comparative learning from six evaluations (Method)
This 2020 paper by Alix Wadeson, Bernardo Monzani and Tom Aston presents and reflects on six evaluations where process tracing was used, and identifies some key learnings.

Power analysis briefing: Review of tools and methods (Method)
This paper, developed for WaterAid in 2012, provides an overview of different types of tools and methods for power analysis, including the use of lists, network maps, and classification systems.

Example outcome journal template (Method)
This template is based on the original Outcome Mapping guidance, incorporating elements from Outcome Harvesting (e.g. significance) and allowing tagging to particular progress markers (rather than listing them all, as the original does).

Causal map app (Method)
This site includes a range of resources on causal mapping and the use of the causal map app, including a guide that covers basic and advanced coding and analysis.

Reflecting: journals and learning (b)logs – Using journals and learning (b)logs to assess learning (Method)
The "Reflecting: Journals and Learning (b)logs" resource from the University of Warwick provides comprehensive guidance on the use of reflective writing for assessment and personal development.

Diaries and logs (Method)
This article provides comprehensive guidance on using diaries and logs to assess physical activity.

Key informant attribution (Method)
A method for testing causal reasoning by asking key informants.

The art and craft of bricolage in evaluation (Method)
This CDI Practice Paper, by Tom Aston and Marina Apgar, makes the case for 'bricolage' in complexity-aware and qualitative evaluation methods.

Decide who will conduct the evaluation (Framework / guide; Rainbow Framework)
Clarify who will actually undertake the evaluation. This might include people who are involved in what is being evaluated (such as implementers, clients and community members), an internal or external evaluator, or some combination of these.

Identify potential unintended results (Framework / guide; Rainbow Framework)
It is useful and ethical to consider possible negative impacts (those that make things worse, not better), and how they can be identified before an intervention (project, programme, or policy) is implemented and addressed in an evaluation or M&E system.

Develop theory of change / programme theory (Framework / guide; Rainbow Framework)
A programme theory or theory of change (TOC) explains how an intervention (a project, a programme, a policy, a strategy) is understood to contribute to a chain of results that produce the intended or actual impacts.

Lost causal: Debunking myths about causal analysis in philanthropy (Method)
This 2021 paper, updated in 2024, advocates for more causal analysis in philanthropic evaluation: not just describing actions taken and changes observed, but also learning how and why change occurred.

Rethinking rigour to embrace complexity in peacebuilding evaluation (Method)
This 2024 open-access journal article presents the inclusive rigour framework and applies it to three cases of peacebuilding evaluation.

Causal Pathways 2023 Symposium and 2024 introductory sessions (Method)
This series of webinars was first presented at the Causal Pathways Symposium 2023, which focused on "connecting, learning, and building a shared understanding of the evaluation and participatory practices that make causal pathways more visible".

Causal Pathways introductory session: How do I mix and combine methods? (Method)
This session of the Causal Pathways Symposium 2023 explores when to use different methods and how to combine them to better understand and visualise causal pathways.
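One of the results above, Qualitative Comparative Analysis (QCA), works by tabulating the configurations of conditions observed across cases and checking how consistently each configuration is associated with the outcome. As a rough, hypothetical illustration only (the cases, condition names, and consistency scores below are invented, not drawn from any resource listed here), a minimal crisp-set truth-table sketch in Python:

```python
# Hypothetical crisp-set QCA sketch: each case records binary conditions
# (1 = present, 0 = absent) and whether the outcome occurred.
cases = [
    {"funding": 1, "leadership": 1, "community": 1, "outcome": 1},
    {"funding": 1, "leadership": 0, "community": 1, "outcome": 1},
    {"funding": 0, "leadership": 1, "community": 0, "outcome": 0},
    {"funding": 1, "leadership": 1, "community": 0, "outcome": 1},
    {"funding": 0, "leadership": 0, "community": 1, "outcome": 0},
]

conditions = ["funding", "leadership", "community"]

def truth_table(cases, conditions):
    """Group cases by their configuration of conditions and report, for
    each configuration, the share of cases in which the outcome occurred
    (a simple consistency score)."""
    counts = {}
    for case in cases:
        config = tuple(case[c] for c in conditions)
        n, positive = counts.get(config, (0, 0))
        counts[config] = (n + 1, positive + case["outcome"])
    return {config: positive / n for config, (n, positive) in counts.items()}

for config, consistency in truth_table(cases, conditions).items():
    print(dict(zip(conditions, config)), "->", consistency)
```

In real QCA the truth table is only the first step: configurations that consistently produce the outcome are then minimised (via Boolean logic) into simpler causal recipes, which this sketch does not attempt.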