Design-Thinking Research Design

How can we combine the speed and user-orientation of commercial research methods with the trustworthiness of traditional academic research? Management researchers face increasing pressures from hyper-competitive schools hungry for new programs and hyper-innovative consulting clients hungry for new advice. This article presents a study that treated the grounded-theory framework being developed as a design object and borrowed Design Thinking (DT) tools and techniques to supplement Qualitative Research (QR) methods. DT and QR have something to offer each other, and researchers should be able to draw from both.

KEYWORDS

research methods; design methodology; design problem(s); conceptual design; academic research

ACKNOWLEDGEMENTS

I would like to thank my ever-patient research assistant, Dr. Lee Poh Chin, and our executive, Mr. Shareff Uthuman, for their enlightened collaboration. i2i is us, not me.

FUNDING

The author received full financial support for this research from SP Jain School of Global Management. The funding source did not influence the collection, analysis, or interpretation of data; writing; or decision to submit the article to the current venue for publication.

Highlights

· Commercial research speed and user-focus can supplement academic research validity

· A new theory/framework can be a design object and use Design Thinking methods

· This research benefitted from supplementing traditional methods with Design Thinking

· Benefits include speed, “10 for exploration” sampling, generalizability, and usefulness

· The particular methods used, the reasons for them, and their impact are presented

ARTICLE

After 17 years outside academe as a consultant and entrepreneur, I walked back into a very different world. I had been trained 20 years earlier in lengthy methodologies for grounded theory (and other methods) at an 88-year-old business school within a well-endowed 360-year-old university. I now sat in a 10-year-old business school without university or endowments in a hyper-competitive education market in a hyper-innovative business world. Even grounded theory and other methodologies had advanced, becoming leaner and quicker. My new commercial pressures were keen, as was pressure from the school authorities, who wanted peer-reviewed academic publications, in a new push to extend beyond teaching and into research.

The reality of starting a new profit center inside the school was that I had to establish sellable programs and a consulting practice quickly. Endowments were impractical for a variety of reasons, and research grants wouldn’t contribute to my profitability targets. Research would contribute if it generated new knowledge for new education and consulting programs, with us as the originators (so providers of choice).

So, having partnered with well-branded organisations to offer trendy programs while I did research, I consulted my old academic friends. The few who remained in academe advised me that, yes, I could do a rapid study which would result in article submissions in only one and a half to two years, with publication a year after that. I complained to a new colleague and was shocked to find her still adding new material to a submission that had been in the review process for five years.

So I needed the speed and marketability of commercial research and the trustworthiness of traditional academic research. In fact, I had been hired for my unusual background in teaching, consulting, research, and entrepreneurship, in order to meld and draw on them all.

As luck would have it, one of the new programs I started used design thinking (DT) — a successful commercial approach in product and service development. While in the throes of teaching DT and struggling with research design, I realized they have similar roots, and I might be able to leverage the strengths of both.

Could I borrow and blend methods from DT into a qualitative research (QR) study? After thorough consideration, I did, and found the approach useful. I offer this case study of my research with the hope that it will benefit other researchers who face pressures of time, resources, teaching, and consulting (so, basically everyone). Key questions I address here include:

• Can a new management theory or framework be treated as a design object?

• Are DT methods sufficient for an academic study?

• Can DT methods enhance academic research (and if so, which ones and when)?

Below, I outline the relationship between DT and QR, highlight some similarities and differences, and propose that each has something to learn and use from the other, especially in grounded theory. I present how I used DT to enhance my QR design and found that, indeed, it was helpful for speed, generation of insight, practicality of outcomes, and generalizability.

Design Thinking (DT) and Qualitative Research (QR)

Relationship, similarities and differences.

Design thinking is basically qualitative research in a commercial context, with the aim of discovering needs and designing solutions, and it has a long-standing relationship with QR. DT draws heavily on ethnology and anthropology, and its teams often include classically-trained ethnologists, anthropologists, and other social scientists. Both DT and QR are human-centric, involving organized and documented immersion in the lives of the people being studied, as well as practiced methods of social investigation. Both involve non-linear (messy) processes, and both DT and QR (especially grounded theory) use abductive reasoning (see the section below).

However, DT is newer. Its roots can be traced back to Peter Rowe’s book Design Thinking (1987) and Herbert Simon’s The Sciences of the Artificial (1969). Qualitative Research dates back to the rejection of positivism in the early 1900s and beyond (Willis, 2007). Although DT is rooted in QR, it continued to evolve separately as a commercial discipline, while QR methods continued to evolve in academe.

Why did they diverge? Why don’t academic researchers use DT? Part of the reason may be that DT is not taught in doctoral programs. Since DT uses tools and techniques from QR, the original QR tools are taught. In addition, since design is often a messy process aimed more at useful artefacts than rigorous proof, and because of its implicit nature, “researchers are embarrassed by not being able to show evidence of the same kind of control, structure, predictability, and rigorousness in doing design as they are able to show in other parts of their research” (Fallman 2003, p. 230). Indeed, according to Easterday, Lewis and Gerber (2014, p.317), “There seems to be no accepted precisely described DBR [Design-Based Research] process at the level of specificity dedicated to other methodologies such as experiments or grounded theory.”

DT tests its artefact outputs with prototyping and experimentation, since the aim is to come out with a product, service, or solution to a problem. QR researchers wanting to develop a deeper understanding of a phenomenon without necessarily “fixing” it need only the QR tools from which DT draws, plus additional QR tools to establish the trustworthiness of the insights. So, today, DT and QR emphasise different aspects of the research process, as seen in Table 1.

Do these differences render the two approaches incompatible, or do today’s pressures on management researchers mean it’s time to meld the two and draw on both? Can they even be melded, or do they inherently support different types of reasoning and inquiry?

Induction, deduction, abduction, analysis and synthesis.

When comparing and contrasting DT with the physical sciences and the scientific method, clear differences emerge in researchers’ thinking patterns. In the discovery phase of the scientific method, a researcher observes phenomena and forms a hypothesis with inductive reasoning. When testing (the “justification” or “validation” phase), the problem is tightly defined, and deductive reasoning prevails. Lawson’s studies (as cited in Cross 1982, p. 223) on problem-solving by scientists vs. architects (designers) suggest that scientists generally approach problems with analysis, whereas designers move more quickly toward synthesis.

The social sciences and QR adopted the scientific method from the physical sciences, but over the years evolved beyond it. Grounded theory within QR, for example, can blend, overlap, and iterate the discovery (exploratory) and justification (validation) phases. It begins with “an inductive logic but moves into abductive reasoning as the researcher seeks to understand emergent empirical findings” (Charmaz 2008, p. 157). Indeed, iterative sampling and analysis leading to emergent findings is a hallmark of grounded theory (Suddaby, 2006).

In a similar progression from linear to non-linear, the design fields prior to 1960 embraced an approach in which problem definition via analysis was followed by problem solution via synthesis. As an alternative, Rittel and Webber (1973) proposed the “wicked problems” approach, in which designers simultaneously discover and define problems and solutions — a particularly useful approach when both the problem and solution are essentially unknown and interdependent, as most design problems are. Having embraced this approach, today’s practice of DT has abductive reasoning as its hallmark (Hassi & Laakso 2011; Dorst, 2011), and both analysis and synthesis occur throughout (ExperiencePoint, 2011).

So, since modern DT and grounded-theory QR share an emphasis on abduction and use analysis and synthesis throughout, can they borrow methodological tools from each other? Does DT, which grew out of QR methods, have something to contribute back?

Framework/Theory as a Design Object

If a management theory (or a new framework) is a useful thing for creating change, not just deeper understanding, can we consider it a design object and use DT methods to develop it? Easterday et al. (2014, p. 321) seem to agree, asserting, “Scientific findings are also products created (or discovered) by a design process.” In describing Research through Design (RtD — as opposed to Research for Design or Research into Design), Frayling recast design as research when the goal is knowledge and understanding, rather than the making of an artefact (Gaver, 2012; Godin & Zahedi, 2014).

Creation of a new framework or theory might also be understood as a “wicked problem” per Rittel and Webber (1973). Zimmerman et al. (2007) noted DT’s usefulness both in discovering theory gaps to research and in creating prototypes to test new theory. DT also fits well with QR in exploratory situations “… when the researcher does not know the important variables to examine” (Morse, 1991, as cited in Creswell, 2014, p. 50).

In some regard, the “going-in” model of my research study was a prototype, and discussing it with participants throughout the research process was analogous to continuous prototyping in DT (Hassi & Laakso, 2011). Nonetheless, prototyping and experimentation are more heavily emphasized towards the end of a DT project and are similar to action research in QR (addressed in a separate section below). Overall, I have taken a pragmatist (Patton, 1999; Tashakkori & Teddlie, 2003; Morgan, 2014) and constructivist approach (Charmaz, 2008; Creswell, 2014), which fits well with DT.

Question, Model and the Need for Design

My lateral-innovation research question is, “Why and how do world-leading innovators create across domains, and why do others in the same circumstance not do so?” My aim is to help other individuals (and eventually teams and organizations) do more of the same.

After a thorough literature review, I found no particular model or theory that addressed this question (Gioia, Corley & Hamilton, 2013), and so took a grounded-theory approach to learning from high-value innovators around the world. I did not know the important variables to investigate — indeed, I wanted to discover the important themes and threads that might tie together a disparate group of innovators who have done something in common — created high-value innovations that crossed significant boundaries of industry, social class, technology, country, etc.

Like the innovators in my study, I took a cross-disciplinary approach, since various disciplines do address various aspects of the phenomenon under study (e.g. creativity, innovation, mindfulness, empathy, etc.). I drew on grounded-theory studies within these disciplines for research-design inspiration and support (e.g. Csikszentmihalyi, 2013) and leveraged theoretical and methodological triangulation where possible (Fossey, Harvey, McDermott & Davidson, 2002).

Team orientation, co-creation, and actively seeking advice from experts and non-experts outside the team all attest to DT’s democratic orientation towards knowledge and the generation of insights and solutions. The DT researcher is a listener, learner, and integrator of good ideas, not just a source.

In keeping with this basic approach, I supplemented Charmaz’ (2008) process of coding, memoing, and sorting with deconstruction while coding and reconstruction during analysis and writing. Instead of simple memoing, I profiled the research participants, serving both discovery (memoing) and blog publication. I shared the profiles not only with the participants for validation, but also with a broader (blog) audience, for broader feedback and faster sharing with practitioners.

In keeping with DT’s practical focus, and since I had co-founded an innovation lab fifteen years earlier, my practice-based observations of innovators informed my research question, going-in model, and semi-structured interview questions — already a mix of practice and academe. So, although Glaser and Strauss’ classical work on grounded theory espoused beginning research without a pre-existing model, I followed Charmaz’ (2008) approach, i.e. to allow the use of preconceived ideas before going into the field, but to remain open to discovery. Interviews were therefore flexible, and we coded with participants’ terms (not just the literature’s terms) (Gioia et al., 2013).

Design Thinking for Exploration

Figure 1. IDEO’s approach to Design Thinking (reprinted with permission)

A variety of DT methodologies and Design-Based Research approaches are available, e.g. from Stanford’s d.school, the Daylight Design consultancy, and researchers such as Easterday et al. (2014). However, I chose IDEO’s approach since IDEO is a well-respected pioneer in DT, and I am most familiar with it. Although both DT and QR feature iterative data gathering and analysis throughout, IDEO’s approach (above) can roughly be translated to QR grounded-theory development as follows:

The first three phases are fairly self-evident, but the rest warrant some explanation. “Frame Opportunities” in DT is the time to define the future state that will be built. In grounded-theory QR, the future state is one that contains a working theory or framework, and after preliminary analysis, the researcher will have a better idea of what it should be. Brainstorming in DT is meant to generate ways to address the framed opportunity (i.e. create ways to solve a problem or build a future); in QR, the researcher would instead revise the preliminary theory/framework, re-focus, and adjust the research design. Finally, designers prototype and test their artefacts or solutions with potential users, while grounded-theory researchers continue to investigate, analyse, gather feedback, and revise the new theory/framework until no longer necessary (i.e. until no substantial changes are required and there is a reasonable body of supporting data and analysis). Participant validation in QR is analogous to prototype testing and user feedback in DT.

Movement from the concrete to abstract back to concrete (axis at the left of the above graphic) or Inspire-Ideate-Implement (top of graphic) basically follows the grounded-theory QR progression from real-world observation to theory/framework development and real-world validation.

Given the overall alignment of research aims and processes between DT and QR, I decided to incorporate particular DT techniques into my QR design. Although the phases are messy and sometimes overlapping in both DT and QR, I present them separately below for clarity.

Exploratory sample of 10.

One of the most common concerns when designing an exploratory study in QR is how many participants to include (Guest, Bunce & Johnson, 2006). This was especially true in my case with commercial pressures, elite interviews (see below), and international travel.

In DT, 10 is the generally accepted exploratory sample size that most practitioners find effective (Brian Ling, personal communication, April 19, 2015), and 10–20 is mentioned by Coughlan in ExperiencePoint (2011). So, I planned for 10 and, indeed, reached saturation with 10.

I used ethnographic techniques common to both DT and QR, including shadowing each of the 10 innovators (day-in-the-life approach, with field notes), semi-structured and open-ended interviewing, and interviewing three or more colleagues (or friends and family) per innovator, for triangulation.

Interviews were recorded and transcribed in full. This differs from DT practice, where full transcription is rare because of time and cost constraints and because there is less need for that level of reliability when potential solutions will be prototyped and tested before release. Common to both DT and QR, interviews were coded, and categories emerged through inductive analysis (Miles & Huberman, 1984; Patton, 1999; Charmaz, 2008).

Interviewing elites (QR) and extremes (DT).

Unlike many market research approaches that investigate the mainstream (since they are the intended users of what will be developed), DT focuses on extremes (people who totally love or totally hate your product), because they more easily give voice to new ideas and problems the mainstream may not notice. Extreme sampling is both effective and faster than sampling a large mid-population group.

At the high extreme are participants QR calls “elites” (Marshall & Rossman, 2016). Interestingly, although the current study’s participants were highly-successful cross-domain innovators, not all could contribute theoretical concepts for framework development. They knew their creative experiences but didn’t always know much about creativity. One participant said he didn’t think he was a good choice for the study since he’s not creative. His colleague said, “Ridiculous — look at how much he’s created,” referring to the business empire he founded, worth over a billion dollars.

At the low extreme, I also interviewed innovators’ colleagues, friends, and family not only for a different perspective on the innovator, but also to explore why they did not create the innovation, despite being in the same place and time, sometimes with the same information. I have not seen this technique specifically mentioned in QR methodologies but found it useful to borrow from DT.

Studying highly-successful (elite) innovators exposed my study to some of the same difficulties as other elite interviewing, including difficulty getting participants (Csikszentmihalyi, 2013), hard-to-find time in over-busy schedules, heightened concerns for confidentiality, etc. At first, no one wanted to join the study, so after clearing my research design and protocols with our ethics committee (a step not taken in DT), I started with seven friends I had encountered during my years as a business practitioner. Similar to Csikszentmihalyi’s research (2013), I eagerly asked for referrals, and participation “snowballed,” adding 12 more. My research assistant found eight more on the internet, and I actively networked at conferences and events, gaining three more.

Teams, co-creation and researcher-as-instrument.

DT and constructivist grounded-theory research within QR both stress the importance of personal immersion in the context of the participant and acknowledge the researcher as a research instrument, along with unavoidable biases. Although DT is usually done by a team, my study included only myself and a research assistant. Since it is not possible to remove all bias from a researcher, we divided the work consistently: I, the principal investigator, conducted all the interviews and did all the in-person shadowing, while my research assistant did all the transcription and initial coding. I re-coded, profiled, did the axial coding and writing, and we collaborated on interpretation.

So, although the study was not conducted with a DT team, we used traditional QR to provide some of the benefits of a team through peer debriefing and analyst triangulation (Patton, 1999), as well as participant review (i.e. member checking and respondent validation) during the validation phase. All participants (innovators and colleagues) have a chance to review the work before publication. In fact, many of the colleagues could themselves have been innovator participants, e.g. the Pulitzer-Prize-winning composer or the cross-disciplinary scientist working with the cross-disciplinary entrepreneur.

Embedded survey.

Some DT methodologies include nested surveys, so we included a concurrent nested survey (Tashakkori & Teddlie, 2003; Creswell, 2014) — the Multicultural Personality Questionnaire (MPQ) short form (van der Zee, van Oudenhoven, Ponterotto & Fietzer, 2013). Two of the anticipated themes of the study were empathy and open-mindedness, and the well-grounded MPQ assesses those, plus three more.

Design Thinking for Validation

At the end of the exploratory phase, having shadowed, interviewed, transcribed and coded (most, though not all, of the interviews), and written preliminary results, we found the going-in model basically valid, but with a host of additional findings we were delighted to discover.

Validation phase.

For verification, we targeted 20 innovators, making 30 overall, in keeping with QR sample-size guidance (Baker & Edwards, 2012). I no longer shadowed the participants, and I interviewed only two colleagues per innovator instead of three, but the innovator interviews were all still done in person (colleague interviews could be by phone or Skype). I continued with the dialectic approach and participant review, which is somewhat co-creative, like DT.
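To make the scale of the stated sampling plan concrete, the short Python sketch below computes a rough lower bound on the number of interviews it implies, assuming exactly three colleague interviews per exploratory-phase innovator and two per validation-phase innovator (the plan’s stated minimums). It illustrates the arithmetic only; it is not a record of the actual interview count, which varied.

```python
# Rough lower bound on interviews implied by the sampling plan described above.
# Assumes exactly 3 colleague interviews per exploratory innovator and 2 per
# validation innovator (the stated minimums); actual numbers varied.

EXPLORATORY_INNOVATORS = 10     # "10 for exploration"
VALIDATION_INNOVATORS = 20      # targeted in the validation phase (30 overall)

COLLEAGUES_PER_EXPLORATORY = 3  # "three or more colleagues ... per innovator"
COLLEAGUES_PER_VALIDATION = 2   # "only two colleagues instead of three"

exploratory_interviews = EXPLORATORY_INNOVATORS * (1 + COLLEAGUES_PER_EXPLORATORY)
validation_interviews = VALIDATION_INNOVATORS * (1 + COLLEAGUES_PER_VALIDATION)

print(f"Exploratory phase: at least {exploratory_interviews} interviews")   # 40
print(f"Validation phase:  at least {validation_interviews} interviews")    # 60
print(f"Total:             at least {exploratory_interviews + validation_interviews}")  # 100
```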

Emergent design and participant variety in arts/technology/business.

Both DT and grounded-theory QR are creative processes whereby part of the design can evolve while preliminary findings are emerging (Brown, 2008; Charmaz, 2008; ExperiencePoint, 2011). Once themes began to emerge in the current study (like global backgrounds and perspectives), we began to seek participants to “round out” the study (theoretical sampling) for those particular themes (e.g. global diversity), or to show enough representation (e.g. for age or sex) so our findings would not be sample-skewed.

Beyond adapting our participant pool (i.e. quickly adapting research design to themes emerging from the data), we also needed the emergent framework and findings to be extra-trustworthy since they would be released before active prototyping (DT) or action research (QR). We used a combination of DT and QR to establish the four areas of trustworthiness identified by Lincoln and Guba (1985): credibility, dependability, confirmability, and transferability.

First, to enhance credibility, the current study used prolonged engagement in the form of “shadowing” (similar to the ethnographic technique “a day in the life” common in DT) for exploratory-phase (and some validation-phase) participants. Persistent observation had occurred for two participants, who have been my friends for years. Triangulation was included in the design by interviewing others about the participants, as well as gathering third-party data. Peer debriefing and member-checking were also used.

Second, for dependability and confirmability, both DT and QR approaches were used. Field notes were kept, all interviews were recorded, and we maintained organized records of primary and secondary data for all participants, as well as designing for triangulation (as above) and reflexivity (Charmaz, 2008).

Figure 2. IDEO’s innovation perspectives (reprinted with permission)

Finally, for transferability, I drew on DT’s innovation perspectives (see above). DT begins with investigating desirability but then of course aims to create solutions that are feasible and viable, ultimately aiming to create innovations in the middle — the sweet spot that integrates all three. DT team members (like other innovation teams I have worked with over the years) are often chosen to include the desirability perspective (people from arts, humanities, design, etc.), feasibility (IT, manufacturing, science, etc.), and viability (business strategy, economics, etc.). So, I chose a pool of participants spread over arts/humanities, science/technology, and business/not-for-profit. This DT technique should supplement Lincoln and Guba’s (1985) recommendation of thick description (or thick narratives, per Cooper & Hughes, 2015) to enhance transferability.

Assigning participants to the above three perspectives was not clear-cut; the assignments roughly reflect their training (which shapes how we think) and the work that fed into the innovation. Some fields are not easily classifiable, e.g. anthropology, which is sometimes classified as humanities and sometimes as social science within the overall domain of science. Nonetheless, roughly 14 participants were counted in arts/humanities, 22 in science/technology, and 24 in business/not-for-profit. Obviously (given the numbers) and predictably (given their cross-domain work), most participants fit into more than one category.

“Country” was identified in the embedded survey as any place where a participant had lived for six months or more, a threshold relevant to cultural adjustment (Demes & Geeraert, 2015); together, participants’ countries spanned 31 nations on five continents. Age was not specifically collected, but the study did include participants in their 20s through their 80s, roughly categorized as six “young” (approximately 20s and 30s), nineteen “mid” (roughly 30s to 50s), and five “senior” (60s, 70s, and 80s).
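As a minimal sketch of why the three perspective counts above can sum to more than the number of participants, the Python snippet below tallies overlapping (multi-label) category assignments for a handful of invented participants. The people and label assignments are hypothetical illustrations, not the study’s data.

```python
# Minimal illustration of multi-label tallying: because a participant can be
# assigned to more than one perspective, the per-category counts sum to more
# than the number of participants. Participants and labels here are invented.
from collections import Counter

participants = {
    "P01": {"arts/humanities", "business/not-for-profit"},
    "P02": {"science/technology"},
    "P03": {"science/technology", "business/not-for-profit"},
    "P04": {"arts/humanities", "science/technology", "business/not-for-profit"},
}

perspective_counts = Counter(
    label for labels in participants.values() for label in labels
)

print("Number of participants:", len(participants))                 # 4
print("Per-category tallies:  ", dict(perspective_counts))          # each overlap counted per category
print("Sum of tallies:        ", sum(perspective_counts.values()))  # 8 > 4 because of overlaps
```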

Analysis.

I did not need to supplement grounded-theory QR with DT techniques during analysis, so I present here a few analysis concepts only for completeness. Both my research colleague and I treated this as data-driven research and searched for underlying meaning in the interviews (Boyatzis, 1998), as well as performing iterative analysis (re-reading/re-listening and continuous reflection and inquiry) (Creswell, 2014).

My research assistant transcribed verbatim for all innovators and exploratory-phase friends, and selectively for validation-phase friends. She created the initial open codes, and I revised them (Miles & Huberman, 1984; Charmaz, 2008) through reading and selective re-listening, identifying the more frequent codes and selecting the ones that best described the emerging insights. I did the axial coding, profiling, and writing, and she, the participants, and blog readers reviewed.

Action Research

A DT process is not complete without prototyping and experimentation, and the real test of any framework or theory is whether it is useful in the field and continues to generate new insights. Although this study’s framework and selected data will be released before the action research, QR methods not normally included in DT were used to provide additional support and validation. Indeed, Gaver (2012, p. 927) describes theory that emerges from Research through Design as “provisional, contingent, and aspirational.”

Both DT and QR approaches suggest shared responsibility (Fossey et al., 2002) for proof via testing, and action research is planned. Given the need for research that makes an impact on practice (Santini, Marinelli, Boden, Cavicchi & Haegeman, 2016), the current study will integrate participatory methods and action research with the book that outlines the new framework and findings. It will include free online self-diagnostics, development tools, and feedback venues, which will be integrated into paid educational programs and consulting projects designed to collect data as well as make an impact.

Results and Conclusions

DT and QR are, at their roots, human-centric, and both can be powerful forces for innovation, not only through rigorous understanding of the human condition, but also by producing useable results quickly enough to make a difference and embracing co-development. DT has always focused on usefulness and speed, and QR can do more by adopting some techniques and perspectives from DT. Per Easterday et al. (2014, p. 317), “Theory derives its purpose from application and application derives its power from theory. Our problem as DBR [Design-Based Research] researchers is to devise a means of conducting DBR that reliably produces both theory and interventions.”

This case study builds on the discussion of Design-Based Research and Research through Design, offering an example in which DT was a viable approach to draw from when an academic researcher faced commercial-type constraints, addressed a “wicked problem,” explored and prototyped, all while aiming to produce a useful framework (which was then treated as a design object). QR has leaned towards the physical sciences in the quest for truth, but what if it leaned towards design in the quest for usefulness?

So — to revisit our first question — can a new management theory or framework be treated as a design object? If the theory or framework is intended to be a useful tool, then the answer should be yes. It is a tool, to be designed for usability and effectiveness, like any other.

Second, are DT methods sufficient for an academic study? Not on their own. When a theory or framework will be released before action research or another form of rigorous testing, additional QR methods should be employed to strengthen the results, notwithstanding the researcher’s responsibility to provide rich description and the reader’s responsibility to make informed choices.

Finally, can DT methods enhance academic research (and if so, which ones and when)? I believe the answer is yes. I borrowed from DT in my QR study and found it helpful in the following ways:

• Fast learning through the use of extremes — by looking at extreme innovators and those with them who did not innovate (although some were collaborators or were innovators in other areas), I gained saturation and insights more quickly than by looking through a whole population or mid-population random sample. Innovators and their friends, for example, both talked about the importance of boundary-crossing — by not realizing boundaries existed (e.g. between disciplines), or by crossing them anyway, despite resistance. Hearing both extremes discuss boundary-crossing helped me realize it is an important theme and learn how they did so. Regarding speed, I collected most of my data around the world in six months and had management insights to share within another six months via early publications.

• “10 for Exploration” — This simple rule-of-thumb in DT allowed me to plan my participants and travel arrangements well enough in advance without either under-committing to my exploratory phase or wasting exploratory effort beyond what was needed for saturation. I did not need to interview and shadow each participant and their colleagues, check for saturation, then plan the next, which can take an enormous amount of time when interviewing CEOs, composers booked two years in advance, participants dispersed around the world, and scientists pushing towards their own research deadlines.

• Generalizability to arts/humanities, technology, & business — The very definition of DT (see above) shows that it integrates people, technology, and business, and for a new management framework or theory to be maximally useful, it should apply across all three types of human endeavour. In order to generalize my findings broadly, I used these three substantive/thematic dimensions to vary my sample population across innovators in the arts/humanities, science/technology, and business/organizations, instead of relying only on the more traditional dimensions of age, sex, etc. (which I also varied). Towards the end of the study, we found we did not have enough representation from the arts, enough women, or anyone from China, so we rebalanced the sample and gained valuable insights from the added participants.

• Theory as a design object, QR rigour, and action research — Viewing my intended new framework or theory as a product to be designed pushed me to develop it from the user’s perspective, not just that of fellow researchers who would review my work but never be truly impacted. This perspective has, ironically, caused me to be more careful to use QR techniques for reliability and validity, since the framework will be used both in and prior to the action research. This approach has also pushed me not to waste the experiences and learnings of future users, but to capture, learn, and continue developing via action research, which needs to be designed into the outcomes (book, software, and programs/projects). In fact, one of the participants introduced me to software I will use for the book to enhance its functionality — a productive research outcome beyond the research content. I introduced three of the innovators to each other, two to my school for recruiting interns, and 12 to others in my network to encourage them to collaborate, build social capital, and pursue potential business opportunities — a generative outcome in addition to the research content (Gaver, 2012).

On the whole, both DT and QR have evolved since their early connection with each other, and both of them today have something to offer the other. In “innovation-speak,” they have diverged, and now it is time to converge.

Perhaps, like neighbors borrowing each other’s tools, DT and QR can help bridge the gap between “thinkers and doers,” providing more useful research, enlightened practice, and relevant education (Santini et al., 2016). Integrating commercial and academic research methods can enhance both, combining the practicality, speed, and usefulness of DT with the rigor of QR.

Now, if I can only find a methodology to help me find time to write….

References

Baker, E., & Edwards, R. (2012). How many qualitative interviews is enough? Expert voices and early career reflections on sampling and cases in qualitative research. Southampton, UK: National Centre for Research Methods Review Paper. Retrieved from http://eprints.ncrm.ac.uk/2273/4/how_many_interviews.pdf

Boyatzis, R. E. (1998). Transforming qualitative information. Thousand Oaks: Sage.

Brown, T. (2008). Change by design. New York: Harper Collins.

Charmaz, K. (2008). Grounded theory as an emergent method. In S. N. Hesse-Biber, & P. Leavy (Eds.), Handbook of emergent methods (pp. 155–172). New York: The Guilford Press.

Cooper, K. A. & Hughes, N.R. (2015). Thick narratives: Mining implicit, oblique, and deeper understandings in videotaped research data. Qualitative Inquiry, 21, 1, 28–35. doi: 10.1177/1077800414542690

Creswell, J.W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks: Sage Publications.

Cross, N. (1982). Designerly ways of knowing. Design Studies, 3, 4, 221–227. Retrieved from http://www.makinggood.ac.nz/media/1255/cross_1982_designerlywaysofknowing.pdf

Csikszentmihalyi, M. (2013). Creativity: Flow and the psychology of discovery and invention. New York: Harper Collins.

Demes, K. A., & Geeraert, N. (2015). The highs and lows of a cultural transition: A longitudinal analysis of sojourner stress and adaptation across 50 countries. Journal of Personality and Social Psychology, 109, 2, 316–37. doi: 10.1037/pspp0000046

Dorst, K. (2011). The core of ‘Design Thinking’ and its application. Design Studies, 32, 521–532. doi: 10.1016/j.destud.2011.07.006

Easterday, M.W., Lewis, D. R., & Gerber, E. M. (2014). Design-based research process: Problems, phases, and applications. Proceedings of International Conference of the Learning Sciences, ICLS, 1, 317–324. Retrieved from https://www.researchgate.net/publication/288434629_Design-based_research_process_Problems_phases_and_applications

ExperiencePoint. (Interviewers). (2011). What’s design thinking really like? [Interview transcript]. Retrieved from http://thinkprimed.com/wp-content/uploads/design_thinker_tool_whatsDesignThinkingReallyLike_en.pdf

Fallman, D. (2003). Design-oriented human-computer interaction. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 225–232. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.4.4522&rep=rep1&type=pdf

Fossey, E., Harvey, C., McDermott, F., & Davidson, L. (2002). Understanding and evaluating qualitative research. Australian and New Zealand Journal of Psychiatry, 36, 6, 717–732. Retrieved from http://pathways.bangor.ac.uk/fossey-et-al-evaluating-qual-research.pdf

Gaver, W. (2012). What should we expect from research through design? CHI ’12 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 937–946. Retrieved from http://teaching.paulos.net/cs160_FL2013/images/d/de/P937-gaver.pdf

Gillham, B. (2000). The research interview. London: Continuum.

Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2013). Seeking qualitative rigor in inductive research: Notes on the Gioia methodology. Organizational Research Methods, 16, 1, 15–31. doi: 10.1177/1094428112452151

Godin, D., & Zahedi, M. (2014). Aspects of research through design: A literature review. Proceedings of the Design Research Society conference 2014, Umeå, Sweden, 16–19. Retrieved from http://www.drs2014.org/media/648109/0205-file1.pdf

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18, 1, 59–82. doi: 10.1177/1525822X05279903

Hassi, L., & Laakso, M. (2011). Design thinking in the management discourse: Defining the elements of the concept. Proceedings of IASDR2011, the 4th World Conference on Design Research, 1–10.

Lincoln, Y. S., & Guba, E. (1985). Naturalistic inquiry (1st ed). Newbury Park, California: Sage.

Marshall, C., & Rossman, G.B. (2016). Designing qualitative research (6th ed). Los Angeles, California: Sage.

Miles, M. B., & Huberman, A.M. (1984). Qualitative data analysis: A sourcebook of new methods. Newbury Park: Sage.

Morgan, D. L. (2014). Pragmatism as a paradigm for social research. Qualitative Inquiry, 20, 8, 1045–1053. doi: 10.1177/1077800413513733.

Patton, M. Q. (1999). Enhancing the quality and credibility of qualitative analysis. Health Services Research, 34, 5, 1189–1208. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1089059/pdf/hsresearch00022-0112.pdf

Rittel, H. W., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155–169. Retrieved from http://www.cc.gatech.edu/fac/ellendo/rittel/rittel-dilemma.pdf

Rowe, P. G. (1987). Design thinking. Cambridge: The MIT Press.

Santini, C., Marinelli, E., Boden, M., Cavicchi, A., & Haegeman, K. (2016). Reducing the distance between thinkers and doers in the entrepreneurial discovery process: An exploratory study. Journal of Business Research, 69, 1840–1844. Retrieved from http://www.sciencedirect.com/science/article/pii/S0148296315004890

Simon, H. A. (1969). The sciences of the artificial. Cambridge: MIT Press.

Suddaby, R. (2006). From the editors: What grounded theory is not. Academy of Management Journal 49, 4, 633–642. Retrieved from http://www.idi.ntnu.no/grupper/su/publ/ese/suddaby-groundedtheory-ednote06.pdf

Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social & behavioral research. Thousand Oaks: Sage.

Van der Zee, K., Van Oudenhoven, J.P., Ponterotto, J.G., & Fietzer, A. W. (2013). Multicultural personality questionnaire: Development of a short form. Journal of Personality Assessment, 95, 1, 118–124. doi: 10.1080/00223891.2012.718302

Willis, J. W. (2007). Foundations of qualitative research: Interpretive and critical approaches. Thousand Oaks, California: Sage.

Zimmerman, J., Forlizzi, J., & Evenson, S. (2007). Research through design as a method for interaction design research in HCI. CHI ’07 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 493–502. Retrieved from http://repository.cmu.edu/cgi/viewcontent.cgi?article=1041&context=hcii