Search

139 results

  • A critical review of applications in QCA and fuzzy-set analysis and a ‘toolbox’ of proven solutions to frequently encountered problems

    This paper by Patrick A. Mello reviews current applications of Qualitative Comparative Analysis (QCA) to take stock of what is available and to highlight best practice in this area.
    Resource
  • Set-theoretic methods for the social sciences: A guide to qualitative comparative analysis

    This book, by Schneider and Wagemann, provides a comprehensive overview of the basic principles of set theory to model causality, along with applications of Qualitative Comparative Analysis (QCA), the most developed form of set-theoretic method, for the social sciences.
    Resource
  • Using qualitative comparative analysis to explore causal links for scaling up investments in renewable energy

    This paper illustrates how qualitative comparative analysis (QCA) was used to identify causal pathways for scaling renewable energy to meet sustainable development and climate goals.
    Resource
  • An introduction to applied data analysis with qualitative comparative analysis

    This article by Nicolas Legewie provides an introduction to Qualitative Comparative Analysis (QCA). It discusses the method's main principles, advantages, and key concepts.
    Resource
  • Compasss: Comparative methods for systematic cross-case analysis

    COMPASSS (Comparative methods for systematic cross-case analysis) is a website designed to develop the use of systematic comparative case analysis as a research strategy by bringing together scholars and practitioners who share an interest in these methods.
    Resource
  • What is qualitative comparative analysis (QCA)?

    This slide show from Charles C. Ragin provides a detailed explanation, with examples, that clearly answers the question 'What is QCA?'.
    Resource
  • Outcome mapping: Building learning and reflection into development programs

    This book by Sarah Earl, Fred Carden and Terry Smutylo takes an original approach to assessing development impacts by focusing on the way in which people relate to each other and to their environment, rather than simply evaluating the products of a program.
    Resource
  • Outcome mapping: A method for tracking behavioural changes in development programs

    This guide published by the Institutional Learning and Change (ILAC) Initiative provides a detailed overview of using outcome mapping as an evaluation tool.
    Resource
  • Outcome Mapping Learning Community

    This website from the Outcome Mapping Learning Community is a hub for sharing resources and ideas related to outcome mapping.
    Resource
  • 10 years of outcome mapping

    This webinar from the Outcome Mapping Learning Community (OMLC) presents the key findings from research conducted into the extent of Outcome Mapping use and the support required for its implementation.
    Resource
  • Understanding process tracing

    This 2011 paper from David Collier outlines a new framework for process tracing aimed at greater systematisation of qualitative methods. This version includes reflections, added in 2019, on subsequent developments.
    Resource
  • Assessing rural transformations: Piloting a qualitative impact protocol in Malawi and Ethiopia

    This working paper reports on findings from four pilot studies of a protocol for qualitative impact evaluation of NGO-sponsored rural development projects in Malawi and Ethiopia.
    Resource
  • QuIP in action: Save the Children case study

    This resource provides an example of the use of the Qualitative Impact Assessment Protocol (QuIP) approach in evaluations of Save the Children's programmes.
    Resource
  • Attributing development impact: The qualitative impact protocol (QuIP) case book

    This freely available online book brings together case studies using the Qualitative Impact Protocol (QuIP), an impact evaluation approach that does not rely on a control group and instead draws on narrative causal statements elicited directly from intended project beneficiaries.
    Resource
  • Joint Committee on Standards for Educational Evaluation (JCSEE) program evaluation standards

    This resource from the Joint Committee on Standards for Educational Evaluation provides statements that describe each of the standards developed to support program evaluation.
    Resource
  • Credibility

    Credibility refers to the trustworthiness of the evaluation findings, achieved through high-quality evaluation processes, especially rigour, integrity, competence, inclusion of diverse perspectives, and stakeholder engagement.
    Method
  • Integrity

    Integrity refers to ensuring honesty, transparency, and adherence to ethical behaviour by all those involved in the evaluation process.
    Method
  • Cultural competency

    Cultural competency involves ensuring that evaluators have the skills, knowledge, and experience necessary to work respectfully and safely in cultural contexts different from their own.
    Method
  • Feasibility

    Feasibility refers to ensuring that an evaluation can be realistically and effectively implemented, considering factors such as practicality, resource use, and responsiveness to the programme's context, including its culture and politics.
    Method
  • Inclusion of diverse perspectives

    Inclusion of diverse perspectives requires attention to ensure that marginalised people and communities are adequately engaged in the evaluation.
    Method
  • Independence

    Independence can include organisational independence, where an evaluator or evaluation team can independently set a work plan and finalise reports without undue interference, and behavioural independence, where evaluators can conduct and report evaluations without undue influence.
    Method
  • Evaluation accountability

    Evaluation accountability relates to processes in place to ensure the evaluation is carried out transparently and to a high-quality standard.
    Method
  • Transferability

    Transferability involves presenting findings in a way that allows them to be applied in other contexts or settings, considering the local culture and context to enhance the utility and reach of evaluation insights.
    Method
  • Utility

    Utility standards are intended to increase the extent to which program stakeholders find evaluation processes and products valuable in meeting their needs.
    Method
  • Professionalism

    Professionalism within evaluation is largely understood in terms of high levels of competence and ethical practice.
    Method
  • Propriety

    Propriety refers to ensuring that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in it and those affected by its results.
    Method
  • Systematic inquiry

    Systematic inquiry involves thorough, methodical, contextually relevant and empirical inquiry into evaluation questions. It is one of the guiding principles of the American Evaluation Association.
    Method
  • Transparency

    Transparency refers to ensuring that the evaluation's processes and conclusions can be scrutinised.
    Method
  • Ethical practice

    Ethical practice in evaluation can be understood in terms of designing and conducting an evaluation to minimise any potential for harm and to maximise the value of the evaluation.
    Method
  • Accuracy

    Accuracy refers to the correctness of the evidence and conclusions in an evaluation; it can also imply an appropriate degree of precision.
    Method
  • Accessibility

    Accessibility of evaluation products includes consideration of the format and access options for reports, including plain language, inclusive print design, material in multiple languages, and material in alternative formats (such as online, audio, or braille versions).
    Method
  • Competence

    Competence refers to ensuring that the evaluation team has or can draw on the skills, knowledge and experience needed to undertake the evaluation.
    Method