Building a research practice is intimidating. You’re basically embarking on a path of creating a consistent discipline of seeking, synthesizing, and communicating knowledge that is either unknown or inaccessible to most. Nonetheless, this is something that thousands perform daily across a multitude of fields.
While I loathe suggesting there are shortcuts, I do believe there are tools and techniques to make the journey more efficient. We can figure out the right gearing to spend our energy more efficiently, and we can look for opportunities to recapture some of that energy along the way.
This is the first in a series of posts applying the reflective tools of design against different parts of the research process. Design has evolved into a somewhat generalized and transdisciplinary field, albeit one which requires considerable depth in craft to perform well. My hope is that this transdisciplinary quality can surface some of the opportunities to do this work better through new tools, methods, and languages, to do what good research does best: seeking, synthesizing, and communicating new knowledge.
Of course, first, you need to figure out what you want to figure out. Your research questions are usually the starting point for this, and there’s plenty of good literature around this (I personally quite like this book). But if we’re to engage not just with a single study, but with constructing the starter for a personal practice of repeatably high-quality research, we need to investigate why we believe a given question is the salient one and where those beliefs are grounded.
Let’s say we were investigating how individuals get engaged with retail investing applications like Robinhood, and how they transition from low-to-no knowledge beginners to advanced users. We might pose the question “How do retail investors come to understand the stock market through low-barrier tools like Robinhood?”
This question is problematic for a variety of reasons. First, it presupposes a group defined by the tool of access, instead of knowledge and traits inherent to the individual. It assumes that what people are engaging with from a knowledge perspective is the market itself, vs. the tool or communities situated around the market. It suggests that understanding and expertise are defining traits of retail investing, vs. other dynamics (such as those we recently witnessed with WallStreetBets and GameStop). Ultimately, it’s a pretty poor question, but it exposes some useful ways of improving our inquiry.
We can look at the definition of Retail Investor and expand that dramatically by asking WHO and what types of people engage with retail investing applications. What resources do they have access to? Who is excluded?
We might reframe the market not as something people look to understand, but rather as the medium through which some other goal is sought. Instead of focusing on the market as a monolith, we focus on the act of “investing” or “betting.”
What’s important is that we’re building tools to reflexively reflect on HOW and WHY we believe what we believe when we construct this earliest tool of inquiry. Our first action when structuring a research question should not be to look into the logistics of implementing it, but rather to deconstruct the components of that question. We may well arrive at the same result after a few interviews, but catching some of these assumptions early can save us plenty of time and resources as we plan our research.
Many people believe — I think erroneously — that reflection is in one’s nature. But this dismisses the rigour that reflective practice often requires. It can manifest as forced periods of isolation and focus (a friend once said she did her best work on planes, and so had come to spend dedicated chunks of time locked in a phone-booth-sized workspace in our office), or as protected chunks of time on a calendar that otherwise has a high billable rate. What sets researchers apart is that the reflective time necessarily emerges in response to the experiences and data we collect in the field.
If our reflection is always something that is triggered and responsive, then what we have to build is a practice that is reflexively reflective — and structure our environment to that end. Ideally, after a rich and eye-opening interview, you have a moment to stop with your team and debrief. You share your own impressions and responsive notes, capture any responses or impressions from the note taker, flag any additions or weak points in the interview guide, and schedule time to start transcribing and synthesizing that data quickly. When planning the research, you were careful to protect that time post-interview — limiting the number of interviews per day both to conserve your own energy and to buy time for reflection.
Ideally, you’ve cultivated a habit somewhere in your life that is quotidian and flow-inducing. That might be running, meditating, lifting, walking alone to a particular coffee shop, commuting to work, or journaling. Whatever that is, it is the unseen and unscripted crucible for your accumulated research experiences and data to alloy into something grander than before.
Of course, the “ideal” is rarely what manifests. It’s trite, but one simply has to deal with what comes and respond accordingly. The trick that’s worked for me — and that I think is supportive of a reflexively reflective practice — is to build a degree of automation into the procedural tasks that follow your research. There is a huge amount in a qualitative research process that can’t be automated. But we CAN automate the triggers and procedural actions surrounding a research task.
Let’s say we’ve dropped the ball on planning, and end up with interviews that are too tightly packed and don’t leave enough room for proper reflection. A way to automate some level of reflection and debriefing is to create some tools that force the point.
An example: we might set up two automations in Airtable and Zapier. The first watches our calendar; immediately following a calendar event tagged “Interview”, both researchers are sent a form asking us to quickly summarize our impressions, upload any audio recordings we have taken, and note any feedback on the interview guide. We fill this out quickly on the way to the next interview. The second automation looks for an uploaded audio recording and, if present, sends it to a service for transcription. Other team members get quick access to the transcript, and at the end of a tough day we have a head start on review, along with notes and impressions that might otherwise have gone unrecorded if we were focused exclusively on prepping for the next interview.
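The plumbing here lives in Zapier and Airtable, but the branching logic itself is simple. As a minimal sketch — where the event fields, `DEBRIEF_QUESTIONS`, and the action names are all hypothetical stand-ins for whatever your own tools expose — it might look like:

```python
# Hypothetical sketch of the post-interview trigger logic described above.
# In practice Zapier handles the calendar watching and Airtable the forms;
# this only illustrates the decisions we automate.

DEBRIEF_QUESTIONS = [
    "Summarize your overall impression of the interview.",
    "Upload any audio recordings you took.",
    "Any feedback or weak points in the interview guide?",
]

AUDIO_EXTENSIONS = (".mp3", ".wav", ".m4a")

def actions_for_event(event):
    """Return the follow-up actions for a just-finished calendar event."""
    actions = []
    if "Interview" in event.get("tags", []):
        # Automation 1: send each researcher the short debrief form
        # immediately after the interview ends.
        for person in event.get("attendees", []):
            actions.append(("send_form", person, DEBRIEF_QUESTIONS))
        # Automation 2: queue any attached recording for transcription
        # so the team has a head start on synthesis.
        for attachment in event.get("attachments", []):
            if attachment.endswith(AUDIO_EXTENSIONS):
                actions.append(("transcribe", attachment))
    return actions

# Example: an interview with two researchers and one recording yields
# two form sends and one transcription job; untagged events yield nothing.
interview = {
    "tags": ["Interview"],
    "attendees": ["lead", "notetaker"],
    "attachments": ["session-03.m4a"],
}
```

The point of keeping the logic this dumb is that the trigger fires without anyone deciding to reflect; the decision was made once, at planning time.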
Building a practice is not just about the skills and discipline involved in performing research at a high level, but also taking the time to reflect on those points where you might be a bit weak. These areas of weakness are great opportunities to ask for help: either from others on your team or from the various robots that make up our digital ecosystem.
Our research infrastructure should fit us like an old and well-worn leather jacket: a second skin that protects us, fits in with everything else, and makes us look and feel good at the end of the day. So I encourage you to look for those opportunities to “wear in” your tools and processes with each new project. If you drop the ball on debriefing, figure out how to fill that gap. If your interviewing is overly formulaic, practice taking non-linear notes and techniques for tracking tangents. If you find yourself overwhelmed by synthesis at the end of your field research, look at scheduling and blocking time differently in the future — or build automation into how information from your interviews is processed as you go.
These types of considerations help us build a reflexively reflective practice that empowers continuous, introspective growth in our journey as design researchers.
Monthly updates from Andrew Lovett-Barron, mostly writing about design practice, theory, and projects. Occasionally, I may link out to a new project.