What’s Missing from Your Design Toolkit?

This last month we have been reading about problem solving, the work of designers, and design processes. Although still in the domain of theory rather than practice, these authors are grappling with the question, “How do we do design?” Authors like Chris Pacione, Nigel Cross, and Horst Rittel have defined the designer’s process in contrast with the fields of mathematics, engineering, and economics, respectively. The designer’s toolkit is full of tools that enable us to leave behind the rules and traditions of scientific inquiry in exchange for a more humanized, multi-dimensional, and inclusive picture of the world.


Pacione says today’s fundamental educational competencies are not reading, writing, and arithmetic, but rather the fundamental skills of design: “creativity and innovation, critical thinking, problem solving, communication and collaboration.” Cross highlights intuition as the key differentiator between the problem solving performed by engineers and the problem solving done by designers. He quotes an engineering designer saying, “I believe in intuition. I think that’s the difference between a designer and an engineer.” Cross defines the core competencies of design as the abilities to “resolve ill-defined problems, adopt solution-focusing strategies, employ abductive/productive/appositional thinking, use non-verbal, graphic/spatial modeling media.”


Rittel discusses the economists’ application of classical physics in the pursuit of efficiency, and the elevation of efficiency to moralistic heights within industry and government. Yet he finds these methods falling short when applied to the social sciences or to governmental and societal planning: “We shall suggest that the social professions were misled somewhere along the line into assuming they could be applied scientists–that they could solve problems in the way that scientists can solve their sorts of problems.” In many ways, the tools of the designer are as much about what they are not as what they are.


Design has thus been defined in opposition to the empiricism of math and science. In advancing design as a superior method for solving societal problems, design theorists have rejected the tools of the engineer: the scientific method, statistical analysis, algorithms, and the like. The justification for casting aside problem-solving tools that societies have found invaluable for centuries lies in defining the types of problems those tools are well suited to solve. Walter Reitman first categorized problems as ill-defined or well-defined in the 1960s as a means of understanding human cognition and problem solving. A well-defined problem is one with a single, definite starting state, a single, definite solution state, and a finite set of ‘legal moves’ and constraints.


Herbert Simon builds on the idea of separating problems into well-structured problems (WSPs) and ill-structured problems (ISPs) in the interest of articulating what types of problem solving are best suited to each. But rather than honoring the binary that Reitman constructed, Simon reimagines it in two critical ways. First, he discards the binary in favor of seeing ISPs and WSPs as a continuum: one problem might be more well-structured than another without either being objectively well-structured. Second, he considers problem spaces that are fundamentally ill-structured, and yet composed of many sub-problems that are actually well-structured.


The particular lens through which Simon considers ISPs and WSPs is their implications for artificial intelligence as a problem solver. He limits the potential application of AI by positing that many problems commonly conceived as well-structured (such as a chess match) are actually ill-structured. Yet problems commonly conceived as ill-structured, such as an architect designing a house, are largely composed of well-structured sub-problems. So although other theorists have built a wall between design and scientific methodology, Simon’s construction of problems in which WSPs are embedded in ISPs calls for problem solvers with both the empiricist’s and the designer’s toolkits.


Richard Buchanan echoes this sentiment, saying, “The significance of seeking a scientific basis for design does not lie in the likelihood of reducing design to one or another of the sciences–an extension of the neo-positivist project and still presented in these terms by some design theorists. Rather, it lies in a concern to connect and integrate useful knowledge from the arts and sciences alike, but in ways that are suited to the problems and purposes of the present.” Buchanan doesn’t want to turn design into a science, but he argues that we need to thoughtfully consider where science and design intersect. The rise of design hasn’t meant (and shouldn’t mean) the fall of science, but for these two ways of problem solving to exist perpetually in parallel, rather than in conversation with each other, is a major missed opportunity. Buchanan synthesizes the work of Simon with John Dewey’s call for “new disciplines of integrative thinking.” Design work that cannot integrate with the sciences is a poorer realization of design.


We haven’t done enough to integrate what is valuable from the empiricist tradition into modern design methodology. Buchanan tells us that interactions between designers and the scientific community are problematic: “Instead of yielding productive integrations, the result is often confusion and a breakdown of communication, with a lack of intelligent practice to carry innovative ideas into objective, concrete embodiment.” This is unsurprising, given that design has historically defined itself as the antithesis of science, to say nothing of the continued skepticism that science and industry have of design.

Jocelyn Wyatt recognizes the reluctance of industry leaders to embrace design methodology, saying, “Nobody wants to run an organization on feeling, intuition, and inspiration.” Wyatt’s view is ultimately that design has already arrived at the perfect integration of the rational and analytical with the creative and intuitive. Yet she acknowledges that many organizations remain structured around “conventional problem solving practices,” and she posits that this may be due to fear of the failure inherent in prototyping and experimentation.


What if instead the explanation is that designers are still leaving too much on the table, focusing on the strength of their own process and failing to leverage the advantages of a more quantitative understanding of the problem space? In reflecting on this idea, my classmate Lauren and I considered the heightened value that the qualitative data we had gathered in contextual inquiry took on when set against quantitative data that differed from, or directly contradicted, the perceptions of our participants.


The gaps between what our participants perceive and what we know to be true are reliably interesting to us. How did they develop this different view of reality, these “alternative facts”? We may trust that our knowledge exceeds that of the person we are observing, or the disparity between our beliefs and those of our participants may lead us to question our own perceptions. In the data-gathering phase of design research, how do you respond to misrepresentations of reality? Do you accept them as pertinent and interesting distortions, or do they prompt you to interrogate your own beliefs and understanding?


There are a few ways that we have seen the perception-reality divide manifest in design research. First, the observed hypocrisy. While doing research on user behavior in public parks, a participant emphatically told us that off-leash dogs were not acceptable to him or his neighbors and that the neighborhood had a strong ethos of self-policing around this particular norm. Less than half an hour later, one of his neighbors walked by with two dogs, one of which was off-leash. The two had an amicable discussion that included observing how this elderly dog was inoffensively violating the leash rules. The strong self-policing ethos he had described was entirely absent. Such contradictions are common: the food service worker who mentions always washing his hands before starting work, and then doesn’t wash his hands; the preschooler who describes the universe of Dora the Explorer in detail after his mom has said he doesn’t watch any television. This inconsistency illustrates the gap between who we are and the idealized versions of ourselves.


However, we aren’t always lucky enough to catch a person in these contradictions during a one- to two-hour contextual observation. How important is it to differentiate between the behavior “parents of preschoolers don’t allow screen time” and the belief “parents of preschoolers don’t think their children should be interacting with screens”? The primary resolution to this problem is to prioritize observing behavior rather than eliciting opinions. We can ask questions about what we observe and, given what we know about people’s tendency to present an idealized self, take it with a grain of salt when the participant insists that we are observing something anomalous rather than routine.


Yet what about when the observed behavior isn’t rational for a given context? We’ll call this the irrational behavior. This is the person who travels out of their way to visit a particular farmers’ market because it doubles SNAP (food stamp) benefits, when actually every farmers’ market in town offers the same deal. If we know the rules about SNAP and farmers’ markets, we can identify this as an irrational behavior and glean some interesting insights from the misconception. Otherwise, this behavior might pass as rational, and we would miss the additional understanding that comes from identifying a gap between perception and reality.


As designers, we are seldom subject-matter experts in the fields we are designing for. There are ways in which this is an asset rather than a liability. Familiarity with the subject area means familiarity with a set of beliefs and judgments that may limit innovation and creativity: “This is how it’s always been done.” “This is the right way; this is the wrong way.” Additionally, it’s neither realistic nor an effective use of time to become a subject matter expert in each industry in which a designer works. Absent subject area expertise, how can designers increase their knowledge base to further develop their ability to spot pertinent and interesting gaps between perception and reality?


In the field of remote sensing and GIS, practitioners talk about “ground data.” When I did research using satellite imagery and GIS to identify areas of reforested and old-growth forest in the Pacific Northwest, I couldn’t rely on the images alone; I needed ground data: real-world points of reference known to be either reforested or old-growth. Using these known areas as a baseline, I could analyze the properties of those areas of the imagery and use correlation to identify which other areas on the map likely shared a common history of being logged or pristine.
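The idea can be sketched in miniature. This is not my original analysis pipeline (which used full GIS tooling), just a minimal illustration of the principle with invented spectral band values: average the signatures of the ground-data points for each class, then assign unlabeled pixels to the class with the nearest average signature.

```python
# Minimal sketch of ground-data classification. The band values and class
# labels below are invented for illustration; a real workflow would read
# raster data with a library such as rasterio or GDAL.

def class_means(ground_truth):
    """Average band values for each labeled ground-data class."""
    sums, counts = {}, {}
    for label, bands in ground_truth:
        acc = sums.setdefault(label, [0.0] * len(bands))
        for i, value in enumerate(bands):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(pixel, means):
    """Assign a pixel to the class whose mean spectral signature is nearest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(means, key=lambda label: dist(pixel, means[label]))

# Hypothetical ground data: (label, [band1, band2]) reference points.
ground = [("old_growth", [0.12, 0.80]), ("old_growth", [0.10, 0.78]),
          ("reforested", [0.30, 0.55]), ("reforested", [0.28, 0.58])]
means = class_means(ground)
print(classify([0.11, 0.79], means))  # classifies an unlabeled pixel
```

The ground data does the real work here: without those labeled reference points, the pixel values are just numbers with no connection to what is actually on the ground.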


How can designers employ ground data in their work? In the example of the farmers’ market, ground data might come in the form of existing knowledge or of additional research. How can we cultivate data pertinent to our areas of research and integrate it into our qualitative research processes? I’m not suggesting that designers become statisticians or scientists; I’m advocating for the integrative approach advanced by Simon and Buchanan. If we understand the problems we are solving to be complex enough to contain both well-structured and ill-structured problems, then surely some of the data or tools of the sciences can advance our understanding of problems in meaningful and actionable ways, particularly of the well-structured components of complex problems.


My recent parks research provides a potential example of the intersection of design research and quantitative data. One participant, who works in parks, emphatically described the ways that her organization seeks to approach its work with an equity mindset. She was thoughtfully aware of the history of racial inequality in Austin and the ways that history had manifested in parks. Yet this seemed like a possible example of an observed hypocrisy, as the methods the organization employs to direct resources are subjective and thus may reflect misconceptions or blind spots despite the best of intentions. Further, the process seems highly vulnerable to “squeaky wheel” bias that might favor those with the means and agency to advocate for themselves.


As a design researcher, I wanted ground data to validate or invalidate the claim that the organization was achieving the equity outcomes that are part of its mission. If its impact and its intentions were not aligned, this would be a fruitful problem space to explore; if the organization was effectively achieving its equity goals, there would be no problem at all. I could ask more park users about their perceptions of equity (which I did). This gave me important and valid data about users’ perceptions of equity (they didn’t think funding was equitable). But I still wanted to know whether we had a perception problem to solve, a systems problem to solve, or both.


Serendipitously, we ended up talking to another person who works in parks and shares a passion for equity and a concern about resource allocation. She shared with us a tool that she uses to map potential investments in Austin parks: “It has all these different layers. You can turn on a master layer that puts together an aggregate layer of things like low income, low food access, high obesity, high chronic disease–all these like high need things–children under eighteen, low socioeconomic status, all of that coming together.” I could map the recent investments the organization had made using its subjective funding methodology and see how those correlated with the empirically identified areas of high need in Austin. That would be great ground data to validate the organization’s claims.
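Even before joining actual GIS layers, the core of that overlay is a simple question: what share of recent spending landed in high-need areas? A minimal sketch, with invented area IDs, need scores, and dollar amounts standing in for the real map layers:

```python
# Hypothetical data: a composite need score per area (as in the mapping
# tool's aggregate layer) and recent investment dollars per area.
# All IDs and figures are invented for illustration.

need_score = {"area_a": 0.9, "area_b": 0.7, "area_c": 0.2, "area_d": 0.1}
investments = {"area_a": 50_000, "area_b": 0, "area_c": 200_000, "area_d": 150_000}

def share_to_high_need(need, spend, threshold=0.5):
    """Fraction of total spending that went to areas at or above the need threshold."""
    total = sum(spend.values())
    high = sum(amount for area, amount in spend.items() if need[area] >= threshold)
    return high / total if total else 0.0

# In this invented example only 12.5% of spending reached high-need areas,
# which would flag a gap between stated intentions and actual impact.
print(share_to_high_need(need_score, investments))
```

A single number like this doesn’t replace the richer map overlay, but it turns a subjective impression of misallocation into something you can check, track over time, and bring back into qualitative research.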

Where does this fit into my current framework for design research? The way our research process is organized looks like this:

[Figure: our research process, from contextual research through themes to insights and design ideas]

The first two stages (contextual research, themes) are focused on the processes of collecting and organizing perceptions. The latter two incorporate the designer’s intuition and knowledge to make meaning of the first two. I would argue that the space between theming and insight formation is the place to apply quantitative data. Adding different types of data at this stage fortifies the designer to approach the formation of insights and design ideas from a stronger vantage point.


The highly regarded philosopher of design and creativity Edward de Bono advocates literally putting on different hats when participating in creative work: a hat for criticism and skepticism, a hat for optimism and “blue sky” thinking, one for intuition and emotion, one for provocation and deviant thinking, and so on. Yet none of the hats on his hat rack is a statistics, data, and science hat. Perhaps that is because this lens is commonly seen as a creativity killer.


People may believe quantitative data is reductive, dry, lacking in personality or nuance. But when I hear those critiques, I think, “You’re just looking at the wrong data!” The right quantitative data for your problem will spark curiosity, express nuance, and prompt expansive thinking. With practice interpreting or visualizing it, quantitative data can tell a lively and highly specific story, or at least point you in the direction of one. If my overlay of the high-need GIS data with the non-profit’s recent project sites shows neglected areas of high need, I can explore why. I can visit those parks and talk to neighbors there to get a fuller picture of what is happening across the city, outside of my convenience sample.


First, to fully exploit the value of a data-informed design practice, we as designers need to let go of the idea that design is defined in opposition to science. While it is true that design methodology is distinct from scientific methodology, positioning the two disciplines at odds unduly influences designers to abandon both the tools and the products of scientific inquiry. Your design toolkit is powerful without any scientific resources, but it is even more powerful when you can thoughtfully incorporate scientifically derived data or methods.


Second, we need to examine the design tools and methodologies we rely on and consider how, when, and where we might integrate empirical data. Incorporating it at the beginning of the process might limit building empathy and understanding the problem through your users’ eyes; at the end, it might arrive too late to be of use. Wyatt describes three phases of design: “inspiration, ideation, and implementation.” Within this model, the most effective place to employ empirical data is somewhere within the ideation phase. Critically examine your design research process and consider where quantitative data best fits.


Finally, just as Pacione makes a case for basic design literacy for everyone, designers need to embrace basic data literacy. A data-informed designer knows what types of quantitative data sources are publicly available and what types her clients likely have access to. She also has at least basic competency in techniques for quantitative data gathering and processing. Data literacy means being able to interpret and visualize quantitative data sets and to identify bad or unreliable data sources. With practice, thinking with your statistics, data, and science hat will become second nature, and the results of your design research will be even more persuasive and powerful.