The Future of Design (Or, How Things May Not Be That Different From Now)
I would like to tell you about a book I’ve just finished. It’s one of the finest books I have ever read.
Reading this book also bolstered my belief that I’m in no position to make the previous claim. I’m a human, a fledgling designer, and a biological creature. Which makes me a product of evolution, prone to cognitive dissonance and recency bias.
The implications of these faulty human traits become clear in Yuval Noah Harari’s Homo Deus: A Brief History of Tomorrow. But more than proving we’re faulty creatures, Harari’s thesis should resonate with anyone who has a stake in design.
Humans, Just Another Animal?
Harari is a historian, a realist, and most likely an atheist or agnostic. He contends that humans have dominated the planet due to our creation of fictions.
Our fictions are the fuel that propels our dominance on the planet. These stories define our uniqueness within the animal kingdom: the ability to organize in groups far larger than a hundred. The utility of money is a tangible example of a shared global fiction. A $100 bill is a piece of paper, but we all agree on the fiction that it has value.
Harari details how our ability to organize under shared fictions has led to the Anthropocene, the current age where human activity is the dominant force on the Earth’s environment. But it’s not only the Earth we’re changing.
We’re beginning to modify our genetic code, we’ve extended our lifespans, and we can ingest mood-lightening pills to ease our worries. Harari writes that as we reshape our biology with greater precision, it’s the old and new fictions that will guide what we create.
Huxley, Orwell, and Atwood all warn us through their literature that as we gain greater biological control over ourselves, we’ll lose our humanness. But maybe those warnings aren’t predictions just yet. Today, mitigating depression with a precise dose of chemicals has made life livable for many people. The trouble, Harari writes, is that our developments in the life sciences (biology, behavioral economics, cognitive psychology) are quickly eroding the dominant shared narrative most of us live by.
You’re a Humanist, Like It or Not
The dominant narrative of our age is humanism. Most of us have grown up with a feeling that “I am a unique individual with a clear inner voice that provides meaning to the universe.” Humanism sanctifies life, happiness, and the power of Homo sapiens. It’s a story that says, “It’s up to me to choose what is right, what is art, what ice cream is best.” It also says, “It’s up to you to choose the same for yourself.”
“Feelings” can have a touchy-feely connotation, but this same humanist faith in feelings fuels capitalism, choice, and our sense of freedom. It’s what legitimizes voting, and it urges us to seek more equitable justice.
And it’s here that Harari’s warnings begin:
The humanist belief in feelings has enabled us to benefit from the fruits of the modern covenant without paying its price. […] What, then, will happen once we realize that customers and voters never make free choices, and once we have the technology to calculate, design, or outsmart their feelings? If the whole universe is pegged to the human experience, what will happen once the human experience becomes just another designable product, no different in essence from any other item in the supermarket?
We’re Walking, Justifying, Narrative Machines
If you’ve ever concluded that humans are irrational and fickle, you’ll not find much argument from Harari. What sets his writing apart is how he synthesizes scientific developments in light of our ancient human fictions.
A few of the unsettling scientific developments in irrationality:
- fMRI studies have shown that the brain initiates decisions before we’re consciously aware of making them
- Split brain experiments have shown that humans are experts in cognitive dissonance, sliding into rational explanations even under bewildering circumstances
- Behavioral economics has shown that humans’ “narrative self” consistently overpowers our “experiencing self,” turning the irrational, unpredictable choices we make into the illusion of a coherent, individual story (Daniel Kahneman calls this the “remembering self”)
Dear Algorithm, Tell Us What To Design
Most of us don’t design by religious guidelines or by a dictator’s demands; we design for an environment where individuals are free to choose: my barstool or Ikea’s, your app or Apple’s. We believe in choice, and we design knowing there is a choice on the user’s end (most of the time).
Harari foresees a conflict here. On one side, the humanistic legacy and individual choice. On the other, the developments of science and technology, along with the rise of algorithms which increasingly make choices for us. Harari writes:
Humans are relinquishing authority to the free market, to crowd wisdom and to external algorithms partly because we cannot deal with the deluge of data. In the past, censorship worked by blocking the flow of information. In the twenty-first century censorship works by flooding people with irrelevant information. […] Today having power means knowing what to ignore. So considering everything happening in our chaotic world, what should we focus on?
This leads me to imagine a loop that will impact designers at a high level:
- We’ll continue to design based on observed human behavior, but we’ll have to fight harder for a voice amid the clamor for large-scale quantitative data
- Big data collection and algorithms will increasingly shape what gets made
- What gets made = what gets used
- We’ll increasingly study human behavior that’s created or influenced by algorithms
Pessimistic enough? I suppose Harari brought it out of me. But his book isn’t simply pessimistic. It’s an exercise in reflection and an attempt at broad foresight. More optimistically, it helped me gain a sense of the big picture of creating things for fellow humans.
Are we in danger of designing for “data fiends” who may trust algorithms over individual feelings? In reaction, maybe we’ll design for the opposite: an “intentional ignorance,” a peace of mind that purposely avoids the prescriptiveness of data. Perhaps the data will show us our cognitive dissonance more clearly, and we’ll act differently, more efficiently, or even more ethically.
Whatever we’re called to design, we should recognize the fictions we’ve been operating on, and act on the stories we want told in the future.