Utilization-Focused Evaluation

Synonyms: 
Use-focused evaluation, Utilisation Focused Evaluation, Utilization Focused Evaluation

Utilization-Focused Evaluation (UFE), developed by Michael Quinn Patton, is an approach based on the principle that an evaluation should be judged on its usefulness to its intended users. Evaluations should therefore be planned and conducted in ways that enhance the likely use of both the findings and the evaluation process itself to inform decisions and improve performance.

UFE has two essential elements. First, the primary intended users of the evaluation must be clearly identified and personally engaged at the beginning of the evaluation process so that their primary intended uses can be identified. Second, evaluators must ensure that these intended uses of the evaluation by the primary intended users guide all other decisions made about the evaluation process.

Rather than focusing on general and abstract users and uses, UFE focuses on real and specific users and uses. The evaluator's job is not to make decisions independently of the intended users, but rather to facilitate decision making among the people who will use the evaluation's findings.

Patton argues that research on evaluation demonstrates that: “Intended users are more likely to use evaluations if they understand and feel ownership of the evaluation process and findings [and that] [t]hey are more likely to understand and feel ownership if they've been actively involved. By actively involving primary intended users, the evaluator is preparing the groundwork for use.” (Patton, 2008, Chapter 3) 

UFE can be used for different types of evaluation (formative, summative, process, impact) and it can use different research designs and types of data.

The UFE framework can be applied in a variety of ways depending on the context and the needs of the situation. Patton's original framework was a five-step process, which can be seen in the example below. A 12-step version also exists (see the Utilization-Focused Evaluation (U-FE) Checklist in the resources below), and the latest update, comprising 17 steps, is outlined below.

The 17 Step UFE Framework

  1. Assess and build program and organizational readiness for utilization-focused evaluation
  2. Assess and enhance evaluator readiness and competence to undertake a utilization-focused evaluation
  3. Identify, organize, and engage primary intended users: the personal factor
  4. Situation analysis conducted jointly with primary intended users
  5. Identify and prioritize primary intended uses by determining priority purposes
  6. Consider and build in process uses if and as appropriate
  7. Focus priority evaluation questions
  8. Check that fundamental areas for evaluation inquiry are being adequately addressed: implementation, outcomes, and attribution questions
  9. Determine what intervention model or theory of change is being evaluated
  10. Negotiate appropriate methods to generate credible findings that support intended use by intended users
  11. Make sure intended users understand potential methods controversies and their implications
  12. Simulate use of findings: evaluation's equivalent of a dress rehearsal
  13. Gather data with ongoing attention to use
  14. Organize and present the data for interpretation and use by primary intended users: analysis, interpretation, judgment, and recommendations
  15. Prepare an evaluation report to facilitate use and disseminate significant findings to expand influence
  16. Follow up with primary intended users to facilitate and enhance use
  17. Meta-evaluation of use: be accountable, learn, and improve

Example

This example uses the five-step UFE framework.

International Network for Bamboo and Rattan (INBAR)
In late 2006, the International Network for Bamboo and Rattan engaged one of the authors (Horton) to evaluate its programmes. Headquartered in Beijing, INBAR's mission is to improve the wellbeing of bamboo and rattan producers and users while ensuring the sustainability of the bamboo and rattan resource base. The Dutch Government had initially requested and funded the evaluation as an end-of-grant requirement.

Step 1. Identify primary intended users. The first task was to ascertain the 'real' purposes and potential users of the evaluation. This process began with a face-to-face meeting with INBAR's Director and a call to a desk officer at the Dutch Ministry of Foreign Affairs, which revealed that the intent of both parties was for the evaluation to contribute to strengthening INBAR's programmes and management. During an initial visit to INBAR's headquarters, additional stakeholders were identified, including INBAR board members and local partners.

Step 2. Gain commitment to UFE and focus the evaluation. From the outset, it was clear that key stakeholders were committed to using the evaluation to improve INBAR's work, so the main task was to identify key issues for INBAR's organizational development. Three approaches were used: (1) a day-long participatory staff workshop to review INBAR's recent work and identify main strengths, weaknesses and areas for improvement; (2) interviews with managers and staff members; and (3) proposing a framework for the evaluation that covered the broad areas of strategy, management systems, programmes and results.

Step 3. Decide on evaluation options. After early interactions with the Dutch Ministry of Foreign Affairs on the evaluation Terms of Reference (ToR), most interactions were with INBAR managers, staff members and partners at field sites. It was jointly decided that INBAR would prepare a consolidated report on its recent activities (following an outline proposed by the evaluator) and organize a self-evaluation workshop at headquarters. The evaluator would participate in this workshop and make field visits in China, Ghana, Ethiopia and India. INBAR regional coordinators proposed schedules for the field visits, which were then negotiated with the evaluator.

Step 4. Analyze and interpret findings and reach conclusions. At the end of each field visit, a debriefing session was held with local INBAR staff members. At the end of the field visits, a half-day debriefing session and discussion was held at INBAR headquarters; this was open to all staff. After this meeting, the evaluator met with individual staff members who expressed a desire to have a more personal input into the evaluation process. Later on, INBAR managers and staff members were invited to comment on and correct a draft evaluation report.

Step 5. Disseminate evaluation findings. The evaluator met personally with representatives of three of INBAR's donors to discuss the evaluation's findings, and the final report was made available to INBAR's donors, staff members and the Board of Trustees. A summary of the report was posted on the INBAR website.

Utility of the evaluation. The evaluation process helped to bring a number of issues to the surface and explore options for strengthening INBAR's programmes. For example, one conclusion of the evaluation was that INBAR should seek to intensify its work in Africa and decentralize responsibilities for project management to the region. There has been a gradual movement in this direction as new projects have been developed: INBAR has recently opened a regional office for East Africa in Addis Ababa and is putting more emphasis on collaboration with regional and national partners.

Source: Patton, M.Q. and Horton, D. (2009). Utilization-Focused Evaluation for Agricultural Innovation. Institutional Learning and Change (ILAC) Initiative Brief No. 22. ILAC, Bioversity International, Rome.

Advice

Advice when USING this approach

  • While in principle there is a straightforward, one-step-at-a-time logic to the unfolding of a utilization-focused evaluation, in reality the process is seldom as simple or as linear as this. For example, the evaluator may find that new users become important as the evaluation proceeds, or that new questions emerge in the midst of methods decisions. Nor is there necessarily a clear and clean distinction between the processes of focusing evaluation questions and making methods decisions: questions inform methods, and methodological preferences can inform questions.
  • U-FE requires active and skilled guidance from and facilitation by an evaluation facilitator.
  • Time resources available for the evaluation must be clearly negotiated, built in from the beginning, and stated in the agreement. The essentially collaborative nature of U-F evaluation demands time and active participation, at every step of the entire process, from those who will use the evaluation results. Additional resources may be needed if new uses or users are added after the evaluation has begun.
  • Financial resources available for the evaluation must be clearly stated, and must cover more than analysis and reporting alone: resources that facilitate use must also be available.
  • In conducting a U-F evaluation the evaluator must give careful consideration to how everything that is done, from beginning to end, will affect use.

Resources

Tools

  • Utilization-Focused Evaluation (U-FE) Checklist: designed by Michael Quinn Patton in 2002, this is a comprehensive checklist for undertaking a utilization-focused evaluation. It draws attention to the necessary tasks and associated challenges of conducting UFE.

Sources

Patton, M.Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage.

Patton, M.Q. (2011). Essentials of utilization-focused evaluation. Thousand Oaks, CA: Sage.

Patton, M.Q. and Horton, D. (2009). Utilization-Focused Evaluation for Agricultural Innovation. Institutional Learning and Change (ILAC) Initiative Brief No. 22. ILAC, Bioversity International, Rome.

Comments

David McDonald

Thanks, great material here already.

The List of References cum Further Reading could include Patton, MQ 2011, Essentials of utilization-focused evaluation: a primer, Sage Publications, Thousand Oaks.

Patricia Rogers

Thanks, David. We've added this great resource, as you suggested.
