Why the Hell Was Taj at a Design Conference and What Did She Learn There?
Last week I was at the Interaction Design conference. Now, you probably know that evaluation and design don't exactly go hand-in-hand, so I understand if your next thought is a befuddled, "Huh? You don't usually find evaluators at a design conference."
So, was I just hopping a plane to San Francisco in February because of the awesome weather and the cool town? Well, not exactly, although I have to say those were both nice perks of being there. Did I travel to San Fran for some much-needed R&R and to clear my head? Not entirely, but I do always feel inspired and innovative when I'm there. (And it's a bit surreal to go by the Uber headquarters in your Uber.)
The truth is, I've been doing a ton of reading and thinking about design lately. The more I learn, the more common ground I find between the way we do our work here at CRC and the work of design firms. I'll be writing a series of blog posts about how we use design thinking in our approach to research and evaluation, but for now I'll just quote Phi Hong Ha, who said in her conference talk, **"Design is about helping people to make sense of the world."**
That's what we evaluators do, too! Help make sense of the world. It's our job to help people understand what is going on: what their clients are thinking and feeling, what the realities are in the communities they work in, and what impact their programs are having.
Design thinking, combined with CRC's love of beautiful data, makes a design conference the perfect place for me to learn new things, embrace old ways of doing things with a new language, and continue to be inspired. As Tim Brown (Tim Brown!) said, "Information is the material we are most using in design today."
If that doesn't sound relevant to what we do here at CRC, I don't know what does.
So, what did I get out of this conference?
Creative innovation is risky. Smaller firms and individuals are often able to innovate, try new things, and play more than larger firms can, because we smaller firms have a higher tolerance for risk and less bureaucracy to fight. One barrier to creativity, as Dan Saffer pointed out, is that "efficiency is the enemy of creativity." Creativity takes time, and it's not always linear. Researchers love linear: linear relationships, linear regression. Here at CRC we have a certain tolerance for a bit of meandering if it means a better process, a better relationship, and a better outcome.
Things never turn out like you think they will. So stop expecting them to. It's been our experience that projects never go exactly as intended. That's because our work involves people, not widgets (or pharmaceuticals). And people change, make decisions, and adapt constantly. Jan Chipchase said, "Once you begin, assume that everything you planned is not relevant."
While this is an extreme perspective, I have found it very helpful to let go of the idea that nothing will change. Being flexible in our thinking has served us well, and it's a constant in the design world.
Silicon Valley is full of people telling you failure is great. "Fail fast," they say. Foundations especially struggle with this. They often hold onto approaches or initiatives long after they should let go and move on to new and innovative ideas. Everyone struggles with change, individuals and organizations alike. But Saffer made a good point. "Failure sucks!" he said. "Learning from failure is where it's at." We are constantly trying out new approaches and new tools. Some of them succeed, some of them fail. But I have yet to try something new that we didn't learn from, whether it was software, a process, or a research method.
Another theme of design work is *empathy*. The idea of empathy often makes researchers uncomfortable, but we embrace it here at CRC. Watching people work with their clients, fill out forms, or struggle with databases gives us empathy for how we might develop solutions to make their work easier. Indi Young did a great job of defining empathy in a way that I think even evaluators and researchers can embrace. She differentiated between emotional empathy (feeling what someone is feeling) and cognitive empathy, which is about understanding how people are thinking, why, and how they react.
I would argue that we need some of both, because we also need to understand the emotional state people are in when we ask them to fill out forms, take attendance, complete surveys, and so on. Sometimes they are upset, frustrated, or anxious (forms can bring that out in the best of us). But cognitive empathy applies too. How do people see and understand our data collection tools? How are they interpreting the questions we ask on surveys (hello, cognitive interviewing)? What makes them skip over some parts of the forms they fill out? Observing and talking with people about these things specifically can be very fruitful.
Danielle Malik wins the award for best presentation title with "Go Home Data, You're Drunk." She talked about how future trends will be all about analytics and customization (even more so than now). Because of the visual storytelling we do here, I was excited to hear that "data points are the words, and it's up to us to construct the sentences." Her presentation resonated with me, as she focused on the fact that data is not neutral: you have to constantly think critically about where it comes from, why you collect it, and how you intend to use it. And we'll definitely be looking for ways to use the hashtag from her presentation, #drunkdata.
I think evaluators have much to learn from the design community. Design is user-centered, process-oriented, and collaborative. The design process requires that you empathize with your users, understand what their problems are, come up with creative solutions, test them, and then build in a process of iteration and tweaking until you get where you need to be.
I think evaluation should be more like design. I will admit that I often have doubts about the way we design and implement evaluations in this field. Evaluators often do not take their users into account; some work in virtual isolation from their end users. How many actually spend time in the schools, public health clinics, and program sites where their data collection takes place (besides us, of course)? How many actually talk to the people filling out the surveys to find out what they think was meant? Karl Fass, professor of user interface design, said, "We shouldn't be using the vocabulary of natural science," and I agree. I often question why we use the vocabulary AND the methods of the natural sciences in a context that is very human-centered and (let's face it) often chaotic.
So I leave you with these questions.
What would happen if we used human-centered design principles and user testing to develop and evaluate social programs instead of evidence-based practices?[1]
What if we began every program implementation with empathy, planned it collaboratively with the end user in mind, and assumed that some iteration would take place before it got to where it needed to be?
What if we conducted our evaluations not like science experiments but like user analytics?
What if we were careful to collect only what we (or our clients) needed, to constantly review the data and collaboratively make decisions with it, and to not assume there was a beginning or an end point? How revolutionary would that be?
Well, if I'm going to be truthful, we know that good evaluation practice already does these things, even though we don't always like to talk about it. But in my upcoming posts on design thinking, I plan to do just that.
[1] It's a radical notion, but it's one worth considering. Large, elaborate studies are conducted using the principles of the natural sciences. Then we plop those programs into a variety of locations as if context doesn't matter. Designers know that context always matters. And that users can be almost infinitely segmented.