Thanks for taking part in our user survey - here's what we've learnt

By Nick Herft and Alice Macfarlan

We've now completed the first component of our user research - the user survey.  

Thank you for all the helpful feedback, which came from 50 different countries and from a wide range of users, including evaluators, people who sometimes do evaluation, evaluation managers, users of evaluation, people involved in evaluation capacity strengthening, students, and others. (If you missed the chance, we're always pleased to get feedback through our contact form.)

Next week we will draw the names of the three lucky survey respondents who will win one of these books:

  1. Developing Monitoring and Evaluation Frameworks (2016, Anne Markiewicz and Ian Patrick)
  2. Principles-Focused Evaluation: The GUIDE (2018, Michael Quinn Patton)
  3. Interactive Evaluation Practice: Mastering the Interpersonal Dynamics of Program Evaluation (2013, Jean King and Laurie Stevahn)

We're now moving on to the other components of our user research: observing people as they use the site, conducting interviews, and gathering feedback panel reviews of early prototypes.

What we've discovered...

We'll be using the survey responses to guide these next components and, in some cases, to identify improvements we can make to the site straight away. Here are some of the early findings:

User survey findings

We learnt:
Whilst many users said they find the website easy to navigate and information easy to find, others said they find it difficult to search the site and to navigate its bigger sections, such as the Rainbow Framework and the Manager's guide to evaluation.
What we're doing about it:
We're conducting interviews and observation studies to learn more about the barriers that make navigating the site and finding information difficult, and we'll develop a range of options for making useful information easier to find.

We learnt:
Many users encounter broken links, especially resources that link to external websites that no longer work or to documents that have gone missing.
What we're doing about it:
A big challenge here is that many of the broken links are caused by resources hosted on other websites going missing. Where we can, we keep a back-up copy of downloadable resources, but unfortunately that's not always possible. We're going to explore how we can deal with this better.

We learnt:
Users reported that it's difficult to find the Terms of Reference GeneraTOR.
What we're doing about it: 
We will address this systematically as part of the review of the whole site's information architecture. In the meantime, we've created a new landing page for the GeneraTOR and added it under Resources in the main menu.

We learnt:
Users have reported difficulty in submitting information on the site due to the Captcha (anti-spam tool).
What we're doing about it: 
The Captcha is meant to be difficult for bots to complete but easy for humans. We've now replaced the previous Captcha system with Google's reCAPTCHA, which is designed to be much simpler for humans and even more challenging for bots.

Common evaluation challenges our users face

We learnt:
When asked what challenges they face in their work, many users mentioned issues related to the MANAGE cluster of tasks (using the Rainbow Framework's categorisation). These included:

  - explaining what evaluation is to those who either don't know much about evaluation or think it's just a survey
  - working with and within organisations without a strong evaluative culture
  - managing the politics of evaluations
  - working within constraints (budgets, time, staff capacity)
  - managing stakeholder expectations in line with these constraints

In addition, a number of people named creating evaluation plans and frameworks as a challenge - in particular finding new methods and choosing which methods would be most appropriate.  Two common concerns were ensuring evaluation quality and being stuck using the same methods again and again regardless of their suitability. Other big challenges people mentioned were to do with collecting and analysing data, supporting the use of findings, and using different and engaging methods for reporting to stakeholders.
What we're doing about it: 
Over the next month, we're going to look closely at what we can do on the site to assist with these challenges. This will include looking through our existing information to find useful options and resources that we can highlight and share, or, if there's a gap on the site, working with the BetterEvaluation community to plug it - either by finding new resources, revisiting the Rainbow Framework to add new options, or documenting experiences from practice to share real-world solutions (something a number of people said would be valuable).

We're also keeping these challenges in mind as we revisit the user interface of the site, as one of the goals of this work is to make it easier to navigate through the evaluation tasks and options and find information to make informed choices.

This is just a small sample of what we've discovered so far, so please don't think your voice hasn't been heard if you don't see anything relevant to your survey submission! We'll continue to share what we find and explore these issues, and how we might respond to them.

Feedback panels

Another big thanks to everyone who opted in to be part of our feedback panels. We'll be getting in touch with some of you to take part in interviews and observation studies so we can learn more about how people with different roles and levels of expertise use and navigate the website, and better understand the usability barriers they face. Following this, we'll develop solutions and prototypes for site improvements and distribute them to the entire feedback panel for input. We're really excited to be able to work with you in the coming months (and beyond) to keep making the site easier to use and more useful for everyone.

Stay tuned for updates - and thanks again to everyone who participated!