Random finds (2019, week 14) — On getting tired of heroes, designing a more humane world, and the Asian century
“I have gathered a posy of other men’s flowers, and nothing but the thread that binds them is mine own.” — Michel de Montaigne
Random finds is a weekly curation of my tweets and, as such, a reflection of my fluid and boundless curiosity.
If you want to know more about my work and how I help senior executives and leadership teams find their way through complexity and change, please visit my ‘uncluttered’ website.
This week: Rebecca Solnit and when the hero is the problem; how design can make the human world more accessible, more inclusive and more humane; why the west’s two-century epoch as global powerhouse is coming to an end; productivity is all about attention management, not time management; the tyranny of metrics; why our brain hates slowpokes; sakura and cultural transformation; rebuilding an ancient Japanese shrine; and, finally, Daniel Kahneman, who thinks he hasn’t improved his thinking in any way.
Rebecca Solnit and why we need catalysts instead of heroes
“We like heroes and stars and their opposites, though I’m not sure who I mean by we, except maybe the people in charge of too many of our stories, who are themselves often elites who believe devoutly in elites, which is what heroes and stars are often presumed to be,” Rebecca Solnit writes in When the Hero is the Problem.
“There’s a scorching song by Liz Phair I think about whenever I think about heroes. She sang [in Soap Star Joe]:
He’s just a hero in a long line of heroes
Looking for something attractive to save
They say he rode in on the back of a pick-up
And he won’t leave town till you remember his name
It’s a caustic revision of the hero as an attention-getter, a party-crasher, a fame-seeker, and at least implicitly a troublemaker in the guise of a problem-solver. And maybe we as a society are getting tired of heroes, and a lot of us are certainly getting tired of overconfident white men. Even the idea that the solution will be singular and dramatic and in the hands of one person erases that the solutions to problems are often complex and many-faceted and arrived at via negotiations. The solution to climate change is planting trees but also transitioning (rapidly) away from fossil fuels but also energy efficiency and significant design changes but also a dozen more things about soil and agriculture and transportation and how systems work. There is no solution, but there are many pieces that add up to a solution, or rather to a modulation of the problem of climate change.”
The idea that all problems are personal and soluble by personal responsibility is another part of our rugged individualism and hero culture, says Solnit.
“It’s a framework that eliminates the possibility of deeper, broader change or of holding accountable the powerful who create and benefit from the status quo and its myriad forms of harm. The narrative of individual responsibility and change protects stasis, whether it’s adapting to inequality or poverty or pollution.
Our largest problems won’t be solved by heroes. They’ll be solved, if they are, by movements, coalitions, civil society. The climate movement, for example, has been first of all a mass effort, and if figures like Bill McKibben stand out — well, he stands out as the cofounder of a global climate action group whose network is in 188 countries and the guy who keeps saying versions of ‘The most effective thing you can do about climate as an individual is stop being an individual.’ And he’s often spoken of a book that influenced him early on, The Pushcart War, a 1964 children’s tale about pushcart vendors organizing to protect their own in a territorial war against truck drivers on the streets of New York. And, plot spoiler, winning.
I was thinking about all this when I was thinking about Sweden’s Greta Thunberg, a truly remarkable young woman, someone who has catalyzed climate action across the world. But the focus on her may obscure that many remarkable young people before her have stood up and spoken passionately about climate change. Her words mattered because we responded, and we responded in part because the media elevated her as they had not elevated her predecessors, and they elevated her because somehow climate change has been taken more seriously, climate action has acquired momentum, probably due to the actions of tens of thousands or millions who will not be credited with this change. She began alone, but publicly, not secretly, and that made it possible for her actions to be multiplied by more and more others.
A general is not much without an army, and social change is not even modeled on generals and armies, because the outstanding figures get others to act willingly, not by command. We would do well to call them catalysts rather than leaders. Martin Luther King was not the Civil Rights Movement and Cesar Chavez was not the farmworker rights movement and to mistake them for that denies the multitudes the recognition they deserve but more importantly denies us strategic understandings when we need them most. Which begins with our own power and ends with how change works.”
The universal design ideal
At every turn, the design of our environments either creates barriers or opens doors. Let’s design a more humane world, Anna Leahy argues in The universal design ideal.
In What Works: Gender Equality by Design, the behavioural economist Iris Bohnet writes, “There is no design-free world.” The material world in which we engage every day is “shaped and reshaped through design — for better or worse. When entering a space, people have long been expected not only to compensate for but also to overcome their disabilities. But sometimes, no amount of ingenuity can overcome the mismatch between a given individual and a given space — and then, exclusion is the result,” Leahy writes.
In the 19th century, the Industrial Revolution gave rise to standardisation. The fabrication of goods by hand was replaced by the manufacture of goods by machines. It benefitted production, trade and the bottom line. What people got through this mass production was something designed for an abstraction of an ideal male body. “In cars, radios, furniture, telephones, and other gadgets of the automated and electrified 20th century, small details of design echoed the assumptions that users of technology were generally spry, and male by default,” Bess Williamson writes in Accessible America: A History of Disability and Design. “Customarily, those whose bodies did not fit the standard, who were not spry or male, had to make do. Designers didn’t account for variations of the human body. Yet in order to design a more accessible human world, the mismatches must be recognised,” Leahy adds.
In 1985, the American architect and industrial designer Ronald Mace, who had polio and devoted his career to devising environments suitable for all people regardless of age or condition, “introduced the term ‘universal design’. Instead of addressing shortcomings with add-ons, universal design focuses on what works for everyone, where ‘everyone’ is a basic criterion, and the emphasis is on commonality rather than on difference.”
A classic example of early universal design is OXO Good Grips. OXO founder Sam Farber redesigned a vegetable peeler to make a specific task easier for a specific person — his wife had arthritis — but it also turned out to be a better design for people who didn’t have joint disease. “Addressing the mismatch his wife had with her kitchen revealed a mismatch that the rest of us didn’t even realise we had, which was terrific for the company’s bottom line. This type of universal design makes inclusion look easy and profitable.
But this product’s attention to disability was what Williamson in Accessible America calls ‘a silent contributor’. Commercially, universal design was promoted as aesthetically appealing and functional, and marketing it that way fuelled sales. The focus was on the ‘everyone’ of universal design, which exuded a feel-good quality that promised to bring people together instead of singling out anyone. The fact that such successful products addressed disability appeared as a nice side effect, rather than inherent in the process or goals. As marvellous as universal design ultimately pledges to be, and can be in practice, it addresses disability by hiding it.”
“Because a person experiences a tool or a space, whether physical or digital, via the senses, and because one’s senses are unique, the curators Ellen Lupton and Andrea Lipps advocate something they call sensory design. This is a version of universal design that fosters inclusion based on acknowledgment of all five human senses as our means of access to the material world. In their book The Senses: Design Beyond Vision, [Lupton and Lipps] write: ‘Sensory design supports everyone’s opportunity to receive information, explore the world, and experience joy, wonder, and social connections, regardless of our sensory abilities.’ They apply their sensory approach to the seven universal design principles established in 1997 by the Center for Universal Design at North Carolina State University: equitable use, flexibility in use, simple and intuitive use, perceptible information, tolerance for error, low physical effort, and size and space for approach and use.
Importantly, Lupton and Lipps focus not only on the isolated individual’s experience with the physical space but also on ‘social inclusion’ that allows ‘us all to interact and enjoy the offerings of an institution together’. Instead of responding to an individual access problem, Lupton and Lipps assert: ‘By addressing multiple senses, designers support the diversity of the human condition.’ Sensory design, then, attempts to blend universal design with acknowledgment of individuality so that, for instance, a mismatch between a visual cue and a person with a visual impairment doesn’t prevent access. In taking into account all five senses, designing for everyone comes closer to designing for (the variety of) individuals too because multisense wayshowing, to adapt Mollerup’s term [the Danish designer and author of Wayshowing > Wayfinding: Basic & Interactive, Per Mollerup, who defines wayfinding as ‘what we do when finding our way in unknown quarters’ and wayshowing as ‘the professional activity of planning and implementing orientation systems in buildings or outdoor areas’], accounts for a variety of sensory abilities and habits.”
Design establishes — and can revise — deeply embedded and often unspoken cultural and cognitive defaults. This makes it fundamental to more inclusive societal structures. It is, however, important to keep in mind that universal design and inclusive design are not necessarily the same thing, even though they can overlap. As Kat Holmes writes in Mismatch: How Inclusion Shapes Design, “The principles of universal design are focused on attributes of the end result, such as ‘simple intuitive use’ and ‘perceptible information.’ In contrast, inclusive design was born out of digital technologies of the 1970s and ’80s, like captioning for people who are deaf and audio recorded books for blind communities.”
It is also important, Leahy notes, “to acknowledge that there exists no truly universal design that works equally well for all individuals. Interactions with physical and digital environments differ depending on who’s navigating them.” The Australian-born geographer Reginald Golledge acknowledges a related shortcoming of universal design. In Wayfinding Behavior: Cognitive Mapping and Other Spatial Processes, he writes, “perfect knowledge of a complex environment cannot be achieved, because real-world environments change over time.”
“Every physical or digital space and every societal structure is a work in progress,” Leahy adds. “In the words of Lupton and Lipps: ‘Inclusion is a state of thinking and acting toward a shared purpose based on a commitment to iteration, refinement, and self-improvement.’
In that sense, inclusive design must incorporate ‘the participation of excluded communities’ in the design process itself; the excluded are the experts on mismatches. With this broader understanding of expertise, the emphasis shifts from designing for people to designing with people so that each of us can make our ways, individually and collectively, through the material world.”
“When I think of navigating the world, I think of maps. The foldout road map of my childhood vacations, the map app on my cellphone, the tsunami evacuation map on the California street corner — maps embody wayshowing. Here’s where you are, and here’s where you can go. ‘Every map tells a story,’ writes the English historian Simon Garfield in On the Map: a Mind-Expanding Exploration of the Way the World Looks. Maps tell stories of discovery and of empire, of voter turnout and of epidemiology, of where people have been and of what they do there — they suggest what’s possible for whom.
Like maps, designs are abstractions that tell stories. Who is here, and who is not? Who gets where they’re going, and how much time and effort does it take? A design is an idea, and the real-world implementation of that design — the ways people interact in the material world — is the story. We have the ability to change the story about humanity by changing the design of our physical, digital and societal environments. Through design that forces us to face and actively grapple with assumptions and biases, the human world can be made more accessible, more inclusive and more humane.”
The Asian century is set to begin
“The west’s two-century epoch as global powerhouse is at an end,” argues Kishore Mahbubani in his latest book Has the West Lost It?
Asia, the envy of Europe in the 17th century, dominated the world economy for most of human history until the 19th century. Its recent surge, which began in postwar Japan, represents a return to a historical norm, Valentina Romei and John Reed argue in The Asian century is set to begin.
“‘Around the end of the 17th century, Europe was looking with admiration and envy at a region of the globe which concentrated more than two-thirds of the world’s gross domestic product, and three-quarters of the world’s population,’ says Andrea Colli,” a professor of economic history at Bocconi University in Italy. Around the same time, India’s share of the world economy was as big as Europe’s, according to the Indian politician and author Shashi Tharoor.
But since then, Asia’s place in the world shrank as western economies took off. “What you are looking at is the great reversal,” says Joel Mokyr, a professor at Northwestern University. Powered by what academics refer to as the Scientific Revolution, then the Enlightenment and the Industrial Revolution, “between 1500 and 1750 Europe changed dramatically; the rest of the world did not.”
By the 1950s, Asia accounted for less than 20 per cent of world output, despite hosting more than half the world’s population.
In the 19th century, says Robert Allen, a professor of economic history at NYU Abu Dhabi, “Asia was transformed from the world’s manufacturing centre into classic underdeveloped economies exporting agricultural commodities.”
“But in recent decades that trend has been reversed. The dramatic rise of Japan and South Korea, the first countries in Asia to catch up with the west, has been ‘dwarfed’ by China’s take-off following the country’s introduction of market-oriented reforms under Chinese leader Deng Xiaoping in the late 1970s,” Romei and Reed write.
“In just a couple of generations, a ‘winning mix of integration with the global economy via trade and foreign direct investment, high savings rates, large investments in human and physical capital, and sound macro-economic policies’ contributed to Asia’s economic leap forward, according to the IMF’s latest regional outlook […].”
Although hundreds of millions of people have been lifted out of poverty over the past five decades, and many economies have graduated to middle-income or advanced economic status, Asia remains poorer than the rest of the world. But the gap is narrowing.
By any measure, Asia is about to reoccupy the centre of the global economic stage. When it does, “the world will have come full circle,” Allen said.
China’s economic growth and expansion seem unstoppable. Understanding its “old-new, holistic-monistic, hybrid ideological muscle that is at work will take us a long way toward responding smartly to [its] rise,” writes journalist and former MERICS Research Fellow Didi Kirsten Tatlow in a recent paper (2018), China’s cosmological Communism: a challenge to liberal democracies.
To fully understand China’s rise, she argues, we must look at deeply embedded norms of power and imperial statecraft, which are reproduced by the Chinese Communist Party (CCP) to project power and build legitimacy. In her paper, Tatlow traces the relevance of the terms ‘tianxia’ (‘all-under-heaven’), ‘tianchao’ (‘heavenly empire’), and ‘jimi’ (literally ‘bridling and feeding’ horses and cattle) for modern CCP politics.
And also this …
According to Adam Grant, productivity isn’t about time management. It’s all about attention management. “Prioritize the people and projects that matter, and it won’t matter how long anything takes. Attention management is the art of focusing on getting things done for the right reasons, in the right places and at the right moments.”
He also believes that our goal is not just to be more productive; we also want to be creative. This is where we struggle because productivity and creativity demand opposite attention management strategies. “Productivity is fueled by raising attentional filters to keep unrelated or distracting thoughts out. But creativity is fueled by lowering attentional filters to let those thoughts in,” he writes.
“I’m pretty sure there’s an eighth habit of highly effective people. They don’t spend all their time reading about the seven habits of highly effective people.”
“How do you get the best of both worlds? In his book When, Dan Pink writes about evidence that your circadian rhythm can help you figure out the right time to do your productive and creative work. If you’re a morning person, you should do your analytical work early when you’re at peak alertness; your routine tasks around lunchtime in your trough; and your creative work in the late afternoon or evening when you’re more likely to do nonlinear thinking. If you’re more of a night owl, you might be better off flipping creative projects to your fuzzy mornings and analytical tasks to your clearest-eyed late afternoon and evening moments. It’s not time management, because you might spend the same amount of time on the tasks even after you rearrange your schedule. It’s attention management: You’re noticing the order of tasks that works for you and adjusting accordingly.
Paying attention to timing management also means thinking differently about how you plan your work. I love Paul Graham’s suggestion to divide the week into ‘maker days’ and ‘manager days.’ On manager days, you hold your meetings and calls. On maker days, you block out time to be productive and creative, knowing you’ll be free from distractions that would normally interrupt your flow. Unfortunately, few of us have the luxury to manage every week that way, which means we need to find ways to carve out maker moments.”
Adam Grant’s article is part of The New York Times’s Attention Week in Smarter Living, a series about taking back your attention — and spending it wisely.
Here’s another one: Stop Letting Modern Distractions Steal Your Attention. Anna Goldfarb argues that making yourself inaccessible from time to time is essential to boosting your focus.
Jerry Z Muller, a professor of history at the Catholic University of America in Washington DC and the author of The Tyranny of Metrics (2018), wonders to what extent our ‘culture of metrics’ — with its costs in employee time, morale and initiative, and its promotion of short-termism — has itself contributed to economic stagnation.
“More and more companies, government agencies, educational institutions and philanthropic organisations are today in the grip of a new phenomenon. I’ve termed it ‘metric fixation,’” he writes in Against Metrics: how measuring performance by numbers backfires.
“The key components of metric fixation are the belief that it is possible — and desirable — to replace professional judgment (acquired through personal experience and talent) with numerical indicators of comparative performance based upon standardised data (metrics); and that the best way to motivate people within these organisations is by attaching rewards and penalties to their measured performance.”
Yet, contrary to commonsense belief, Muller writes, “attempts to measure productivity through performance metrics discourage initiative, innovation and risk-taking.”
“The source of the trouble is that when people are judged by performance metrics they are incentivised to do what the metrics measure, and what the metrics measure will be some established goal. But that impedes innovation, which means doing something not yet established, indeed that hasn’t even been tried out. Innovation involves experimentation. And experimentation includes the possibility, perhaps probability, of failure. At the same time, rewarding individuals for measured performance diminishes a sense of common purpose, as well as the social relationships that motivate co-operation and effectiveness. Instead, such rewards promote competition.
Compelling people in an organisation to focus their efforts on a narrow range of measurable features degrades the experience of work. Subject to performance metrics, people are forced to focus on limited goals, imposed by others who might not understand the work that they do. Mental stimulation is dulled when people don’t decide the problems to be solved or how to solve them, and there is no excitement of venturing into the unknown because the unknown is beyond the measurable. The entrepreneurial element of human nature is stifled by metric fixation.
[…] The more that work becomes a matter of filling in the boxes by which performance is to be measured and rewarded, the more it will repel those who think outside the box.”
In Why Your Brain Hates Slowpokes, Chelsea Wald writes about how the high speed of society has jammed your internal clock.
“The pace of our lives is linked to culture. Researchers have shown society’s accelerating pace is shredding our patience. In tests, psychologists and economists have asked subjects if they would prefer a little bit of something now or a lot of it later; say, $10 today versus $100 in a year, or two pieces of food now versus six pieces in 10 seconds.
Subjects — both human and other animals — often go for the now, even when it’s not optimal. One study showed that exposing people to ‘the ultimate symbols of impatience culture’ — fast-food symbols like McDonald’s golden arches — increases their reading speed and preference for time-saving products, and makes them more likely to opt for small rewards now over larger ones later.”
“Our rejection of slowness is especially apparent when it comes to technology. ‘Everything is so efficient nowadays,’ the psychologist Marc Wittmann says. ‘We’re less and less able to wait patiently.’ We now practically insist that Web pages load in a quarter of a second, when we had no problem with two seconds in 2009 and four seconds in 2006. As of 2012, videos that didn’t load in two seconds had little hope of going viral.
Of course, we’re not going to die if a website doesn’t load immediately. But in what is probably a hangover from our primate past — when we could starve if impatience didn’t spur us to act — it sure can feel like it. ‘People expect the payoff to come at some kind of rate, and when it doesn’t come, this creates annoyance,’ posits evolutionary anthropologist Alexandra Rosati, a primate expert, who is finishing a postdoc at Yale before joining the faculty at Harvard.”
“The web once made something of a biblical promise to give all of us a voice, but in the ensuing flood — and the ensuing floods after that — only a few bobbed to the top. With increased diversity, this hasn’t changed — there are more diverse voices, but the same ones float up each time,” Soraya Roberts writes in On Flooding: Drowning the Culture in Sameness.
“I read an article this weekend that I didn’t see being shared anywhere. You had to scroll down the Times pretty far to find it; it was in the arts section and it was about a group of black artists who were suddenly being recognized in their 70s and 80s. It was a frustrating read, a sort of too-little-too-late scenario because, sure, it’s always nice to get half a million dollars for your work, but where was the money when you were actually producing the work, while supporting a family and paying a mortgage, with many decades of life ahead of you? When you could run to speaking engagements instead of rolling to them? ‘The kind of elation I may have had back 30 years, I’m past that point,’ 75-year-old artist Howardena Pindell said. Where was all of this back in 1989? Oh, right, Robert Mapplethorpe, Jean-Michel Basquiat, Jeff Koons — guess they took up the allotted spots.”
“My frustration was for these overlooked artists, but also for the artists being overlooked now, the ones with interesting new ideas (if not necessarily revolutionary ones) that can inch the discourse forward in some way. We choose virality instead — repackaged, reshaped, shareable versions of what has come before — and equate it to quality because of its resonance. Which is itself resonant because the irony of the web is that even though everyone can have a voice, the ones that we project are projected over and over and over again. This isn’t quality, or real diversity; it’s familiarity. We model ourselves on fandom, where there is no sense of proportionality — there is everything, there is nothing, and there is little else — and the space between now and the future, the space in which critics used to sit, increasingly ceases to exist.
We need a mass realization that pulls us out of this flooding culture. That is: the acknowledgment by powerful organizations that we do in fact engage more with original stories — it’s a fact, look it up — that lasting conversations do not come out of Twitter trends, and that diversity means diversity — more that is different, not more of the same differences. As one curator told the Times in the piece about older black artists getting their due, ‘There has been a whole parallel universe that existed that people had not tapped into.’ Tap into it.”
(Watching cherry blossoms, things past whirl through my mind)
— Matsuo Bashou (1644–1694)
One of the most loved poems in Japanese culture revolves around the sakura or cherry blossom. It’s called the Iroha Song, Anna Garleff writes in Sakura and Cultural Transformation. Taught to children as their ‘A-B-Cs,’ the poem’s syllables together also form a story about life and change.
“Because the flowers don’t all blossom at the same time — some sooner, some later — the tree looks and smells like it is in full bloom all at once. But in fact, as it blossoms, it is losing the flower at the same time. Just like the syllables in the poem, they are there fleetingly, and only once.”
The Japanese poet Matsuo Bashou “was fascinated with change. He played with the tension between tradition, culture, and change — not only in his poetry, but also in his life as an itinerant traveller. He explored what things can and should change, and what things should not. The difference is learned through practice of tradition. In Japan, some examples of tradition are: the tea ceremony, martial arts, and poetry.
Like all traditions, they are rigorously practiced. Traditions are practiced in order to understand what can be changed. A true master can rise to the challenges of change because s/he knows what is important to retain as tradition, and what is not. The Master understands that if you change the wrong thing, or change too much, tradition is gone and it is impossible to pass along. Purpose is rendered obsolete.”
If you retain tradition, but lose the practice, culture dies, Bashou argued.
“Tradition is the past, making one unified body, enabling us to move forward. Culture is the future — it’s our stories of who we are striving to become. But practice is the present — and we are perpetually in the present. In fact, there is no other time in which we can exist. This is why Masters teach mindfulness.
So we cannot talk about kaizen without its counterpart, wabi sabi. Wabi is defined as ‘rustic simplicity’ or ‘finding mental richness (fulfillment) in poverty or material shortage.’ Sabi is ‘taking pleasure in the imperfect’ or ‘to find deep meaning (richness) in peaceful tranquility.’ Kaizen alone is like expecting the cherry tree to be in full flower, each blossom, all the time.
The sakura are a symbol of transience. We, too, are perpetually in a state of change, miraculously rooted in our pasts, even as we imagine our futures — breath by breath, blossom by blossom.”
In the West, we have become familiar with many Japanese aesthetic concepts such as wabi, sabi and kawaii. But kehai (気配), the idea of the existence of truth and beauty in the natural world, is relatively unknown. Kehai is a key influence on Yukihito Masuura’s work documenting masterpieces of Western sculpture and sacred Shinto architecture. “I would like to show the world the evidence that it is possible to continue beauty and maintain a sustainable society for more than 1,000 years,” the Tokyo-born photographer says.
Masuura has documented the reconstruction of one of Japan’s most sacred Shinto shrines, Izumo Oyashiro. The 12-year process, a tradition dating back some 1,300 years, takes place every 60 years (the Ise Jingu shrine is rebuilt every 20 years). Masuura was the only photographer to gain access to the sacred shrine in western Japan.
It is believed that the ‘shikinen sengū’ ritual, the repeated rebuilding, renders the sanctuaries eternal, reinvigorating spiritual and community bonds. Sengū also preserves the artisanal skills required for continuity of Japan’s traditional architectural aesthetics.
Via The Guardian
“I have been shifting positions all my life. I like changing my mind, and I look for ways of changing my mind. This is what I’m doing now in questioning the importance of biases. But as I said, I don’t believe — I’m certainly less smart than I was when I was younger. I’m in my 80s, so — but it’s not only that. I haven’t become more sensitive to biases. I really haven’t improved my thinking in any way, I think. And if I have, it’s accumulating experience. It’s not by learning better ways to think.” — Daniel Kahneman in an interview with Krista Tippett for On Being (2017)