
Design & Evaluation: Radical Collaboration


_(Pardon our silence over these past several months! After our unintentional hiatus, we're getting back into our blogging routine, sharing evaluation-related news, tips, and tricks on a roughly monthly basis. We're starting with today's post, the first in a series about design and evaluation…)_

Introduction

Over the past few years, as CRC has explored and embraced visual thinking, information visualization, and the use of technology in evaluation, I've gotten a real-world education in design, technology, and design thinking. I've done a lot of reading, run even more experiments, and gotten a formal education in these areas by completing a Master's in Information Visualization from the Maryland Institute College of Art.

Researchers have plenty to do. Keeping up with current trends in their field(s), new data collection methods, the latest in propensity scoring, and the changing context of neighborhoods and communities, to say nothing of keeping an eye on shifting funding priorities from foundations and government agencies … it's a lot of work. So I wanted to make one part of it, incorporating design thinking, a little easier for you. Through this series, I'll be sharing some insights to help you think differently about how you work, with the hope of starting a conversation about what the social sciences can learn from the world of design. (These thoughts are informed by many books, conversations, and conferences, but especially by the work of Don Norman and others in the human-centered design field, the Institute of Design at Stanford, and Tim Brown and his colleagues at IDEO.)

This first post focuses on one principle of design thinking. Those that follow will talk about how we might adopt design frameworks, and take a look at Norman's work on human-centered design, which examines how to make things that people can use, will use, and will actually enjoy using. (And if you've ever made a form that people hate, you know we can certainly learn a thing or two from Norman.)

Part 1: Radical Collaboration

There are many, many types of evaluation and research, and some explicitly focus on being collaborative, participatory, and/or empowering. But research is often a top-down endeavor. It may require people like us: researchers and evaluators with advanced degrees and specialized skills. We know things others don't. We know how to write survey questions, how to draw representative samples, how to conduct focus groups and analyze data. But non-researchers know things that we don't. Radical collaboration involves acknowledging that, while we have some important specialized skills, we don't hold all the knowledge (in broad strokes, it means collaborating in a solutions-focused, action-oriented way rather than a problems-focused one). In fact, this is true even when we think we have all the relevant information in front of us.

Particularly when working with a program, we know that program staff have amazing insights into the research process. We do our best work when we find out why and how program staff interact with clients, especially around collecting information. They are the ones who can tell us whether our questions make sense, whether we are asking the right questions, or why no one is filling out that one field on the one form. They often know the best way to get information to us, and they also know what information they need from us.

They're also excellent at helping us interpret our data analysis. For example, we presented school-based health staff with a chart showing when students visited the health center. There was a huge spike in September. We all assumed that was because students hadn't been getting needed health care during the summer, so when they came back to school they went to the school nurse to get their health needs taken care of.

Fortunately, we kept our mouths shut and asked the staff what they thought the data pattern meant. They knew immediately, and we were very off-base with our assumption (again, good thing we kept our mouths shut): "Oh, there is always a new school nurse in the fall, and all the kids ask to go to the nurse to see if she will let them out of class. She sends them right back to class, and by the end of September they stop trying that trick."


But program staff are just the beginning. Working with survey participants can also be an opportunity for radical collaboration. Recently, we were working on a community survey, and we used cognitive interviewing to help us craft the questions. We had potential respondents think out loud while they answered our initial questions, telling us what they thought we were asking. This process highlighted areas where we thought a question meant one thing but respondents interpreted it completely differently. By simply listening to people, we were able to craft a much more valid survey BEFORE we sent it out to hundreds of people. A little collaboration saved us a lot of headaches.

Foundations also hold unique perspectives. Because they often have the resources and the staff to dig deep into difficult social problems, they have insights into long-term and national trends and the many intersecting factors that shape a particular issue. They may know the literature, the players, and all the sites nationwide that are working on the same problem, along with what has worked for them and what has not. These insights can fuel program design, development, implementation, interpretation of research findings, and more. This deep knowledge can help smaller programs learn from larger efforts and avoid reinventing the wheel.

Radical collaboration means recognizing that everyone holds part of the answer. It also means recognizing that everyone occupies a unique position and sees things from where they stand. We may not always be wholly accurate, but everyone can offer a piece of the puzzle. Radical collaboration requires openness: a willingness to entertain new ideas and try out new strategies, even when they may not work out. But together, the perspectives of everyone involved can create a more accurate picture of what the problem is and more innovative ideas about how to address it.

Check back soon for Part 2 in this series!
