
Theory-driven & process evaluation: The art of getting inside and beyond the “black box”

Before working in program evaluation, I received education and training as a clinician, specifically as an art therapist. My work as an art therapist was grounded in a personal belief in, and more importantly, empirical observations supporting, the mental health-promoting effects of making art. That work made me curious about how to demonstrate art’s impacts to the general public (including skeptical funders and policymakers), and I found a lack of relevant research to back up what I’d seen in practice. At the same time, I talked to numerous art therapists and community artist-practitioners who were doing good work with vulnerable populations in my city of Baltimore, all around the U.S., and throughout the world. The individuals I talked to all voiced their need to prove the value of their work, but many were unsure that research could do justice to the creative process and were leery of evaluators who might try to force-fit what they do into little boxes. I’ve even heard it said (more than once!), “There’s no way to measure what we do!” On that sweeping point, I have to respectfully disagree with some of my art world colleagues.


After having supported program evaluation projects for some time now, I can understand the common wariness, held by practitioners of all sorts, about evaluation’s ability to capture the richness and complexity of their work. (And granted, depending on the nature of the evaluation project, achieving that lofty goal may not even be the point!) However, a program’s very survival may rest on its ability to demonstrate that its interventions are high-quality and cost-effective, and program stakeholders have increasingly sought these assurances. Most of us might agree that program quality can and should be measured, but the issue is complicated by the continued hold of outcomes-oriented thinking on even the most unconventional of community artists, which constrains ideas of what measurement means and offers, and forces a focus on outputs before the intervention design itself is clear. More broadly, a tenacious, prevailing focus on outcomes measurement continues to dominate stakeholders’ expectations of program evaluation, even though outcomes measurement alone neither assesses nor informs the development of programs’ theories of change, which are precisely what allow one to look inside the “black box” of intervention effectiveness and assess quality.

Evaluations known as “black box,” or input-output, evaluations have the primary goal of assessing the relationship between an intervention and its outcomes. They seek information about a program’s merits but do not systematically evaluate the change processes that turn interventions into outcomes. If evaluators and stakeholders, including practitioners, need to understand both the merits of a program and how its processes can be tailored to improve the intervention, then another strategy, such as theory-driven and process evaluation[1], is a better choice (Chen, 2005). Such efforts allow for a more in-depth examination of program components to show which areas have been more or less effective (Harachi, Abbott, Catalano, Haggerty, & Fleming, 1999; Linnan & Steckler, 2002). Looking inside the black box, in order to get beyond it, still recognizes the role of outcomes measurement, but it also examines implementation fidelity and other issues to determine whether it is the intervention or entire program, or merely aspects of it, that actually succeeded or failed.

Process evaluation serves an important role both when interventions produce significant outcomes and when they do not produce their intended impacts. When outcomes are significant, it is important for stakeholders to have some way of knowing which intervention components actually contributed to those outcomes; when outcomes are not significant, process evaluation can help explain why they were modest or insignificant (Linnan & Steckler, 2002; Susser, 1995). Programs can also learn whether their theories of change clearly specify the intervening processes or mechanisms that link activities to intended outcomes. Such evaluative information can be readily applied by practitioners, and it is just such information that programs, including community arts programs, most need _right now_. It can also contribute to a broader body of research about the social impacts of the arts, one that will make the adoption of more useful and appropriate outcomes-oriented measurement possible for this field in the future.

[1] Theory-driven evaluation (TDE) was devised to address problems encountered in traditional evaluation strategies that are limited by before-after and input-output thinking. Proponents of TDE hold that for any intervention, a theory that explains how the intervention is expected to work can be described. These theories are often made up of implicit assumptions that guide the design of an intervention. Process evaluation, also known as _implementation assessment_, essentially analyzes the quality, and sometimes even the effectiveness, of program operations, implementation, and service delivery.

References:

Chen, H. T. (2005). Practical program evaluation: Assessing and improving planning, implementation, and effectiveness. Thousand Oaks, CA: Sage Publications.

Harachi, T. W., Abbott, R. D., Catalano, R. F., Haggerty, K. P., & Fleming, C. B. (1999). Opening the black box: Using process evaluation measures to assess implementation and theory building. American Journal of Community Psychology, 27, 715–735.

Linnan, L., & Steckler, A. (2002). Process evaluation and public health interventions: An overview. In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 1–23). San Francisco: Jossey-Bass.

Susser, M. (1995). Editorial: The tribulations of trials—interventions in communities. American Journal of Public Health, 85, 156–158.

Let’s work together!

Most nonprofits spend days putting together reports for board meetings and funders. The Inciter team brings together data from many sources to create effortless reports. Our clients go from spending days on their reports to just minutes.