
Participatory Analysis

A report released by Public/Private Ventures in March 2011, titled "Priorities for a New Decade: Making (More) Social Programs Work (Better)," discussed a critical problem with the evaluation process for non-profit programs: evaluators often do not collaborate with a program, so programs are evaluated passively. In addition, funders may not ask for the right evidence, and an often impractical report is typically produced months later. This gives the non-profit no voice in the evaluation process and no time to make adjustments or improvements to its program.

The authors recommended that evaluators collaborate with program staff (i.e., the stakeholders) on the evaluation design, and that stakeholders be given real-time, actionable feedback that allows the program to _improve its effectiveness_. They believe evaluators need to help non-profits better understand evaluative data and use that data more effectively to improve their programs.

In a recent daily alert email, the American Evaluation Association (AEA) discussed a potential solution to this disconnect: involving stakeholders in the analysis and interpretation of data. In a subsequent white paper, the same authors (Veena Pankaj and Myia Welsh of Innovation Network) recommended three questions for determining whether this type of participatory analysis is a good fit for an evaluation:

  1. How would including the stakeholders in the analysis improve the quality of the findings?
  2. How would including the stakeholders in the analysis improve their engagement with the evaluation?
  3. Will including the stakeholders in the analysis fit with a project's timeline and resources?

In an evaluation that uses participatory analysis, the evaluator leads the stakeholders through a constructive discussion of the data and the results. This may include:

  1. Presenting the data to stakeholders for their feedback, or
  2. Providing drafts of reports to stakeholders and including their comments in the final report.

This novel approach helps keep the client engaged and interested in the evaluation process and ensures that their voice is included in the discussion and the evaluation. It also produces more relevant results that are more likely to be put to use. In line with the Public/Private Ventures report, this process helps the organization better understand the data, which ultimately leads to them actually using the results.

These reports prompted me to reflect on the evaluations I've conducted with non-profit organizations. I was struck by the impact that collaboration has had on my experience conducting evaluations. When I'm fortunate enough to have a close partnership with the staff of the program I'm evaluating, the results I provide are much richer and more comprehensive.

These partnerships give me an insider's perspective on program implementation. By taking the time to build these close relationships, I can provide real-time feedback on the data trends I notice, which may in turn help improve program implementation and effectiveness. The program gets a more thorough and insightful evaluation, since partnering means I have a better understanding of what's really going on. In return, program staff often ask me for data or insights when they are making decisions about their own program implementation.

What is your experience with evaluators as strategic partners?

Let’s work together!

Most nonprofits spend days putting together reports for board meetings and funders. The Inciter team brings together data from many sources to create easy, effortless reports. Our clients go from spending days on their reports to just minutes.