Reading notes (2020, week 42) — On the rise and rise of creativity, why we can’t have billionaires and stop climate change, and why AI is an ideology

Mark Storm
Oct 17, 2020 · 25 min read
Shanghai studio Roarc Renew has slotted two sweeping brick corridors between a pair of disused granaries in Jiaxing, China, to create the TaoCang Art Centre. Located in the old village of Wangjiangjing in Zhejiang province, the art gallery was developed as a landmark for the area while demonstrating how new life can be given to old buildings, originally built in the 1950s to store grain. Source: ArchDaily (Photograph by Wen Studio)

Reading notes is a weekly curation of my tweets. It is, as Michel de Montaigne so beautifully wrote, “a posy of other men’s flowers and nothing but the thread that binds them is mine own.”

In this week’s edition: Once seen as the work of genius, how did creativity become an engine of economic growth and a corporate imperative?; if we are going to survive the 21st century, we need to distribute income and wealth more fairly; AI is a perilous belief that fails to recognize the agency of humans; does having a purpose in life make us happy?; the science of wisdom; ‘adab,’ the proper aesthetic and social forms; the beauty of Donatello; two sweeping brick corridors between a pair of disused granaries; and, finally, Tom Waits and the ‘deficit of wonder.’

The rise and rise of creativity

“Creativity doesn’t have a deep history,” Steven Shapin, the Franklin L. Ford Research Professor of the History of Science at Harvard University, writes in The rise and rise of creativity. The word was scarcely used until the 1920s. But that changed in the mid-20th century, especially after the end of the Second World War.

“In 1950, a leading psychologist lamented that only a tiny proportion of the professional literature was then concerned with creativity; within the decade, a self-consciously styled and elaborately supported ‘creativity movement’ developed,” Shapin writes. “Definitions of creativity were offered; tests were devised; testing practices were institutionalised in the processes of educating, recruiting, selecting, promoting and rewarding. Creativity increasingly became the specific psychological capacity that creativity tests tested. There was never overwhelming consensus about whether particular definitions were the right ones or about whether particular tests reliably identified the desired capacity, but sentiment settled around a substantive link between creativity and the notion of divergent thinking. Individuals were said to have creativity if they could branch out, imagine a range of possible answers to a problem and diverge from stable and accepted wisdom, while convergent thinking moved fluently towards ‘the one right answer.’ Asked, for example, how many uses there are for a chair, the divergent thinker could come up with very many; the convergent thinker said only that you sat on it. Convergent and divergent thinking stood in opposition, just as conformity was opposed to creativity.

There was an essential tension between the capacities that made for group solidarity and those that disrupted collective ways of knowing, judging and doing. And much of that tension proceeded not just from practical ways of identifying mental abilities but from the moral and ideological fields in which creativity and conformity were embedded. For human science experts in the creativity movement and for their clients, conformity was not only bad for science, it was also morally bad; creativity was good for science, and it was also morally good.”

“In the same year that Charles Darwin’s On the Origin of Species (1859) was published, a book called Self-Help by the Scottish author Samuel Smiles sold far more copies. Self-Help was a kind of entrepreneur’s guide to success through hard work, and Smiles made clear what he thought about the relative significance of genius versus disciplined application in making new knowledge.” — Self Portrait as Charles Darwin (2011), by Adrian Ghenie; oil on canvas, 202.5 x 238.3 cm. Sold at Sotheby’s in 2017 to a private collector.

“Is creativity’s history progressive? The word itself, and the expert practices for identifying it, became prominent in the Cold War but, if you’re not bothered about that, you might say that related categories — being creative, a creative person — transitioned over time from the sacred power to a secular capacity, from a sacred to a secular value, from categories belonging to the vernacular to controlled ownership by academic experts, from something no one pays to find out about to elaborately funded expert practices. From the 1950s to the present, creativity has been established as something that everybody wants: in 1959, the director of scientific research at General Electric began an address to government officials with the bland assertion: ‘I think we can agree at once that we are all in favour of creativity,’ and he was right. Creativity had become an institutional imperative, a value that was the source of many other values.

That linear story is almost, but not quite, right. Thomas Kuhn’s case for tradition and dogma was a reactionary move, and a condition for its being even thinkable was reaction to mid-century celebrations of individualism, free-thinking and market-place models of progressive enquiry. But Kuhn’s small-c conservativism was widely misunderstood and confused with iconoclasm, and some radical ‘Kuhnians’ oddly took his book as a licence for political revolution.

About the same time, there was a more consequential response to creativity-enthusiasm emerging from the heart of forward-facing capitalism. The Harvard Business Review published an essay by Theodore Levitt, a marketing expert at the Harvard Business School, called Creativity Is Not Enough (1963). Levitt didn’t doubt that there was an individual capacity called creativity, or that this capacity might be a psychological source of new ideas. Creativity was not, however, the royal road to good business outcomes, and conformity was being seriously undervalued. New ideas were not in short supply; executives were drowning in them: ‘there is really very little shortage of creativity and of creative people in American business,’ he wrote.

Many business people didn’t dwell on distinctions between creativity and innovation, but Levitt did: creativity was having a new idea; innovation was the realisation of an idea in a specific outcome that the organisation valued; and it was innovation that really mattered. Creative people tended to be irresponsible, detached from the concrete processes of achieving organisational ends: ‘what some people call conformity in business is less related to the lack of abstract creativity than to the lack of responsible action.’ Levitt’s views were widely noted, but in the 1960s he was swimming against the tide. Creativity had been incorporated.

From that time to the present, you can say that creativity rose and rose. Everyone still wants it, perhaps more than ever; politicians, executives, educators, urban theorists and economists see it as the engine of economic progress and look for ways to have more of it. The creativity tests are still around, most continuing to identify creativity with divergent thinking; they have gone through successive editions; and there is a greater variety of them than there used to be. There are academic and professional organisations devoted to the study and promotion of creativity; there are encyclopaedias and handbooks surveying creativity research; and there are endless paper and online guides to creativity — or how, at least, to convince others that you have it.

But this proliferating success has tended to erode creativity’s stable identity. On the one hand, creativity has become so invested with value that it has become impossible to police its meaning and the practices that supposedly identify and encourage it. Many people and organisations utterly committed to producing original thoughts and things nevertheless despair of creativity-talk and supposed creativity-expertise. They also say that undue obsession with the idea of creativity gets in the way of real creativity, and here the joke asks: ‘What’s the opposite of creativity?’ and the response is ‘Creativity consultants.’

Yet the expert practices of identifying people, social forms and techniques that produce the new and the useful are integral to the business world and its affiliates. There are formally programmed brainstorming sessions, brainwriting protocols to get participants on the same page, proprietary creative problem-solving routines, creative sessions, moodboards and storyboards, collages, group doodles, context- and empathy-mapping as visual representations of group thinking, lateral thinking exercises, and on and on. The production of the new and useful is here treated as something that can be elicited by expert-designed practices. Guides to these techniques now rarely treat creativity as a capacity belonging to an individual, and there are few if any mentions of creativity tests.

In the related worlds of high-tech and technoscience, Google (collaborating with Nature magazine and a media consultancy) sponsors an annual interdisciplinary conference called SciFoo Camp: it’s a freeform, no-agenda occasion for several hundred invited scientists, techies, artists, businesspeople and humanists — meant only to yield new, interesting and consequential ideas: Davos for geeks and fellow-travellers. […]

So SciFoo is a star-dusted gathering of creative people, intended to elicit creative thoughts. I might well have missed something, but when I Googled ‘SciFoo’ and searched the first several pages of results, I didn’t find a single mention of creativity. When Google — and other high-tech and consulting businesses — hire people in nominally creative capacities, typical interview questions aim to assess personality — ‘What would you pick as your walk-up song?’ — or specific problem-solving abilities and dispositions — ‘How many golf balls can fit into a school bus?’ or ‘How would you solve the homelessness crisis in San Francisco?’ They want to find out something concrete about you and how you go about things, not to establish the degree to which you possess a measurable psychological quality.

In educational settings, creativity testing continues to flourish — where it is used to assess and select both students and teachers — but it is scarcely visible in some of late modernity’s most innovative settings. Producing new and useful things is not less important than it once was, but the identity of the capacity called creativity has been affected by its normalisation. A standard criticism of much creativity research, testing and theorising is that whatever should be meant by the notion is in fact task-specific, multiple not single. Just as critics say that there can be no general theory of intelligence, so you shouldn’t think of creativity apart from specific tasks, settings, motivations and individuals’ physiological states.

Creativity was a moment in the history of academic psychology. As an expert-defined category, creativity was also summoned into existence during the Cold War, together with theories of its identity, distinctions between creativity and seemingly related mental capacities, and tests for assessing it. But creativity also belongs to the history of institutions and organisations that were the clients for academic psychology — the military, business, the civil service and educational establishments. Creativity was mobilised too in the moral and political conflict between supposedly individualistic societies and their collectivist opponents, and it was enlisted to talk about, defend and pour value over US conceptions of the free-acting individual. This served to surround creativity with an aura, a luminous glow of ideology.

The Cold War ended, but the rise of creativity has continued. Many of its expert practices have been folded into the everyday life of organisations committed to producing useful novelty, most notably high-tech business, the management consultancies that seek to serve and advise innovatory business, and other institutions that admire high-tech business and aim to imitate its ways of working. That normalisation and integration have made for a loss of expert control. Many techniques other than defining and testing have been put in place that are intended to encourage making the new and useful, and the specific language of creativity has tended to subside into background buzz just as new-and-useful-making has become a secular religion. Should this continue, one can imagine a future of creativity without ‘creativity.’”

We can’t have billionaires and stop climate change

When it comes to ecological impact, the richer you are, the more damage you do. This pattern is evident across a wide range of indicators. In We can’t have billionaires and stop climate change, Jason Hickel explains why.

“According to recent research published by scientists at the University of Leeds, it’s not only that rich people consume more stuff than everybody else, but also that the stuff they consume is more energy-intensive: huge houses, big cars, private jets, business-class flights, long-distance holidays, luxury imports and so on. And it’s not only their consumption that matters — it’s also their investments. When the rich have more money than they can possibly spend, […] they tend to invest the excess in expansionary industries that are quite often ecologically destructive, like fossil fuels and mining,” Hickel writes.

“Knowing how income correlates with ecological breakdown should make us think twice about how our culture idolises rich people. There is nothing worth celebrating about their excesses. In an era of ecological breakdown, excess is literally deadly.”

Another issue has to do with how our economy works. “We live in an economy that is organised around perpetual expansion, or ‘growth,’ which we measure in terms of Gross Domestic Product (GDP). GDP has to grow exponentially just so the system can stay afloat. This might be fine if GDP was just plucked out of thin air, but it’s not. On the contrary, it is tightly coupled to ecological impact; the more we grow the economy, the more pressure we put on planetary boundaries.

[…]

Over and over again, the evidence points to the fact that billionaires — and millionaires, for that matter — are incompatible with planetary boundaries. If we want to live on a safe and habitable planet, we need to do something about inequality. This argument might sound radical, but it is widely shared among researchers who study this issue. The French economist Thomas Piketty, one of the world’s leading experts on inequality and climate, doesn’t mince his words: ‘A drastic reduction in the purchasing power of the richest would in itself have a substantial impact on the reduction of emissions at global level.’”

“Knowing income correlates with ecological breakdown should make us think twice about how our culture idolises rich people. There is nothing worth celebrating about their excesses. In an era of ecological breakdown, excess is literally deadly.” (Illustrations by Reza Hasni for The Correspondent)

So what do we do?

“One approach would be to introduce a cap on wage ratios — what some have called a maximum wage policy. Sam Pizzigati, an associate fellow at the Institute for Policy Studies, argues that we should cap the after-tax wage ratio at 10 to one. This is an elegant solution that would immediately distribute income more fairly, and it’s not unheard of,” according to Hickel.

“But it’s not just income inequality that’s a problem — it’s wealth inequality too. […] One way to solve this problem is with a wealth tax — an idea that is presently gaining a lot of steam. The economists Emmanuel Saez and Gabriel Zucman have proposed a 10% annual marginal tax on wealth holdings over $1 billion. This would push the richest to sell some of their assets, thus distributing wealth more fairly and cutting rent-seeking behaviour. The upshot is that the rich would lose their power to force us to extract and produce more than we need, and as a result, remove pressure from the living world.”
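For the arithmetic-minded: below is a minimal sketch (my own illustration, not from Hickel’s article or the Saez–Zucman proposal itself) of how a flat 10% marginal levy above a $1 billion threshold would be computed. Because the tax is marginal, only the portion of a fortune above the threshold is taxed; a $5 billion fortune would owe $400 million a year.

```python
# Hypothetical illustration of a 10% annual *marginal* wealth tax
# above a $1bn threshold (the figures named in the article).
BILLION = 1_000_000_000
THRESHOLD = 1 * BILLION   # wealth above this line is taxed
MARGINAL_RATE = 0.10      # 10% per year, on the excess only

def annual_wealth_tax(wealth: float) -> float:
    """Return the yearly tax owed on the portion of wealth above the threshold."""
    excess = max(0.0, wealth - THRESHOLD)
    return MARGINAL_RATE * excess

# Example: $0.5bn owes nothing; $5bn owes $400m; $100bn owes $9,900m per year.
for fortune in (0.5 * BILLION, 5 * BILLION, 100 * BILLION):
    print(f"${fortune / BILLION:.1f}bn -> ${annual_wealth_tax(fortune) / 1e6:,.0f}m per year")
```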

“In the United States, for instance, the richest 1% have nearly 40% of the nation’s wealth. The bottom 50%, by contrast, have almost nothing: only 0.4%. On a global level, it’s worse still: the richest 1% own around half of all the wealth in the world.”

“Given the severity of our ecological crisis, perhaps we should be more ambitious than what Saez and Zucman propose. After all, nobody ‘deserves’ extreme wealth. It’s not earned, it’s extracted — from underpaid workers, from nature, from monopoly power, from political capture and so on. We should have a democratic conversation about this: at what point does hoarding become not only socially unnecessary, but actively destructive? $100m? $10m? $5m?

The ecological crisis — and the science of planetary boundaries — focuses our attention on one simple, undeniable fact: that we live on a finite planet, and if we are going to survive the 21st century, then we need to learn to live on it together. Toward this end, we can take lessons from our ancestors. Anthropologists tell us that, for most of human history, most people lived in societies that were actively and intentionally egalitarian. They saw this as an adaptive technology. If you want to survive and thrive within any given ecosystem, you quickly realise that inequality is dangerous, and you take special precautions to guard against it. That’s the kind of thinking we need.

There is an extraordinary opening for this right now. The Covid-19 crisis has revealed the dangers of having an economy that’s out of balance with human need and the living world. People are ready for something different.”


AI is an ideology, not a technology

“At its core, ‘artificial intelligence’ is a perilous belief that fails to recognize the agency of humans,” Glen Weyl and Jaron Lanier write in AI is an Ideology, Not a Technology.

“‘AI’ is best understood as a political and social ideology rather than as a basket of algorithms. The core of the ideology is that a suite of technologies, designed by a small technical elite, can and should become autonomous from and eventually replace, rather than complement, not just individual humans but much of humanity. Given that any such replacement is a mirage, this ideology has strong resonances with other historical ideologies, such as technocracy and central-planning-based forms of socialism, which viewed as desirable or inevitable the replacement of most human judgement/agency with systems created by a small technical elite,” Weyl and Lanier argue.

It is therefore not surprising that the Chinese Communist Party finds AI to be a welcome technological formulation of its own ideology. What is surprising, however, is that leaders of Western tech companies and governments have been so quick to accept this ideology.

“One reason might be a loss of faith in the institutions of liberal democratic capitalism during the last decade. (‘Liberal’ here has the broad meaning of a society committed to universal freedom and human dignity, not the narrower contemporary political one.) Political economic institutions have not just been performing poorly in the last few decades, they’ve directly fueled the rise of hyper-concentrated wealth and political power in a way that happens to align with the elevation of AI to dominate our visions of the future. The richest companies, individuals, and regions now tend to be the ones closest to the biggest data-gathering computers. Pluralistic visions of liberal democratic market societies will lose out to AI-driven ones unless we reimagine the role of technology in human affairs.”

China’s greatest advantage in AI is less surveillance than a vast shadow workforce actively labeling data fed into algorithms. (Photograph by Qilai Shen, courtesy of Bloomberg/Getty Images)

“Not only is this reimagination possible, it’s been increasingly demonstrated on a large scale in one of the places most under pressure from the AI-fueled CCP ideology, just across the Taiwan Strait. Under the leadership of Audrey Tang and her Sunflower and g0v movements, almost half of Taiwan’s population has joined a national participatory data-governance and -sharing platform that allows citizens to self-organize the use of data, demand services in exchange for these data, deliberate thoughtfully on collective choices, and vote in innovative ways on civic questions. Driven neither by pseudo-capitalism based on barter nor by state planning, Taiwan’s citizens have built a culture of agency over their technologies through civic participation and collective organization, something we are starting to see emerge in Europe and the US through movements like data cooperatives. Most impressively, tools growing out of this approach have been critical to Taiwan’s best-in-the-world success at containing the Covid-19 pandemic, with only 49 cases to date in a population of more than 20 million at China’s doorstep.

The active engagement of a wide range of citizens in creating technologies and data systems, through a variety of collective organizations, offers an attractive alternative worldview. In the case of Taiwan, this direction is not only consistent with but organically growing out of Chinese culture. If pluralistic societies want to win a race not against China as a nation but against authoritarianism wherever it arises, they cannot make it a race for the development of AI, which gives up the game before it begins. They must do it by winning on their own terms, terms that are more productive and dynamic in the long run than is top-down technocracy, as was demonstrated during the Cold War.

As authoritarian governments try to compete against pluralistic technologies in the 21st century, they will inevitably face pressures to empower their own citizens to participate in creating technical systems, eroding the grip on power. On the other hand, an AI-driven cold war can only push both sides toward increasing centralization of power in a dysfunctional techno-authoritarian elite that stealthily stifles innovation. To paraphrase Edmund Burke, all that is necessary for the triumph of an AI-driven, automation-based dystopia is that liberal democracy accept it as inevitable.”

And also this…

Does having a purpose in life make us happy?, Marina Benjamin wonders in the latest edition of New Philosopher (N29: Purpose).

“Vocation. A calling. From the Latin vocare. The term was originally applied to candidates moved to enter the priesthood in response to divine imperative: priests were called into service, and in servitude they found happiness and meaning. These days, when we talk about having a vocation we think of doctors and lawyers, teachers and artists, people who, in heeding their calling, find direction but also connection to a higher set of virtues — healing, justice, pedagogy, poiesis. In pursuit of these virtues, something of the self gets sacrificed, gets melted into the collective for the greater benefit of all.

I’ve always felt moved by this counter-intuitive approach to happiness, by the idea that only by giving stuff away (in this case one’s selfish desires, but it could just as easily be our material accoutrements) one is more, not less fulfilled. Now more than ever I want to cling to the idea that finding happiness bears some connection to purpose, or even duty. We inhabit a world in which too many people appear convinced that our first duty is to self and that happiness and fulfilment are consequently attendant on making sure that whatever the self desires, the self acquires. Enter consumerism, and the vicious cycle of chasing empty materialism. Enter the hollow pursuit of peak experiences — those fleeting highs that society hoodwinks us into believing are gateways to happiness, when in fact their very nature (and maybe, as a result, their beauty) is evanescent.”

“In researching his wonderful study of late life and human finitude, Being Mortal, the doctor and writer Atul Gawande interviewed dozens of elderly and terminally ill people in an attempt to discover what factors, mental and physical, contributed to their happiness and wellbeing as they confronted the very real prospect of dying. Time and again, he found that having purpose led to greater happiness than having grandchildren who visited, or enjoying social connection or material comfort,” Marina Benjamin writes in Does having a purpose in life make us happy?. (Photograph: Atul Gawande during his TED Talk, Want to get great at something? Get a coach, in 2017)

“Lockdown has paralysed so many of us these past months, suspending us inside our domestic bubbles and robbing us of purpose. The enforced passivity is infantilising: furloughed from our working lives, banned from caring for family and friends beyond our immediate households, debarred from public service — unless we’re ‘key workers’ — we’ve sunk into a kind of collective funk. In some ways this suspension of real-world activity turns us into a de facto experimental lab, where some of the latest ideas about goal-driven behaviour and happiness might be tested. At the Greater Good Science Center (GGSC) at the University of California, Berkeley, happiness experts are busy studying our lockdown behaviour. They want to see if, in these straitened times, small steps can fill the shoes of larger life goals.

It is no accident that people have been sewing masks, baking bread, taking online courses, embracing DIY. It’s exactly the kind of ‘intentional activity’ that GGSC psychologists such as Sonja Lyubomirsky recommend as part of what she calls the ‘architecture of sustainable happiness,’ which supports and builds feelings of subjective wellbeing.”

“As an ideological backbone to medieval Christian philosophy, Platonic and Aristotelian ideas about wisdom continued to dominate the scholarly debates for centuries. Theologians such as Augustine of Hippo and later Thomas Aquinas interpreted the Greeks’ writings in terms of Christian ideals when debating the foundations of reason, understanding or the nature of ethics,” Igor Grossmann, an associate professor of psychology and director of the Wisdom and Culture Lab at the University of Waterloo in Canada, writes in The science of wisdom.

The fascination with wisdom started to decline during the Renaissance and the Age of Enlightenment and didn’t re-emerge in mainstream philosophical circles until the 1970s, along with post-Second World War stability and the New Age zeitgeist, Grossmann writes. “A new focus on ‘optimal’ happiness and psychological fulfilment led to the rediscovery both of Aristotelian ideas about virtues and of non-Western perspectives on flourishing. In philosophy, Alasdair MacIntyre’s After Virtue (1981) led to renewed interest in the topic of wisdom from an Aristotelian perspective.

Behavioural scientists were fairly late to the game, with the first humble attempt to empirically study wisdom surfacing in the 1970s in a dissertation by Vivian Clayton. Today a geriatric neuropsychologist in California, Clayton aimed to explore the characteristics a group of 83 lay Americans associated with a wise person, and arrived at three commonly held elements: cognition, reflection and compassion.

Like Aristotle millennia before, the psychologists and social scientists who came after Clayton agreed that wisdom was oriented toward the pursuit of a good life. Yet, what does such pursuit entail? The meaning of the good life and ways to achieve it vary by philosophical school. Because of the long and diverse intellectual history of wisdom, scientists could pick and choose definitions. Whereas some scientists followed Platonic and Aristotelian ideas, others borrowed from Buddhism, Hinduism or Taoism — promoting worldviews that relate to the environment in quite different if not opposite ways. Some researchers sidestepped philosophy altogether, reserving the label ‘wisdom’ for the mature level of adult development. And some embarked on a Quixote-like quest to find wisdom in old age without a clear definition of either wisdom or ‘old age.’ After all, the meaning of ‘old’ has changed dramatically in the past few centuries, with lengthening average human lifespans.”

“For Pythagoras, numbers were an underlying feature of reality, mysticism, the harmonic balance of opposites, and wisdom itself. Building on some of Pythagoras’ ideas, Plato and his best student Aristotle considered wisdom an essential human virtue — ‘a habit of mind in harmony with reason and the order of nature,’ as the Roman philosopher and statesman Cicero would write a few centuries later. For Aristotle, as for Near Eastern thinkers millennia before him, wisdom was a key element on the path to achieving a good life, a path that required balance and moderation between extremes,” Igor Grossmann writes in The science of wisdom — Painting: Aristotle with a Bust of Homer (1653), by Rembrandt; oil on canvas, 143.5 x 136.5 cm. Collection of The Metropolitan Museum of Art, New York.

“In one of my first experimental studies on wisdom, my colleague Ethan Kross and I invited persons in a monogamous romantic relationship to come to our lab, randomly assigning them to two groups. One group reflected on a situation in which their partner admits being unfaithful. Another group reflected on a similar situation, but this time it’s a close friend who’s been unfaithful. When we assessed the extent to which participants recognised the limits of their knowledge, were willing to consider diverse viewpoints or find different ways in which the situation might unfold, we were surprised. Even though the participants reflected on the same type of situation, those who thought about a friend’s partner being unfaithful were substantially more likely to engage in each aspect of meta-cognition than the people whose own partner revealed they’d been unfaithful. This finding is not intuitive: don’t we know ourselves better than we know our friends? Aren’t we more motivated to work through an issue that concerns us personally? Despite possibly more in-depth insight and greater motivation when the situation concerns ourselves, our ability to reflect on interpersonal transgressions in our personal lives appears diminished.

Since I conducted this first study, asymmetry in wisdom was replicated in other laboratories, not only for infidelity but also for betrayal of trust. Yet other scholars have observed a similar asymmetry for creativity and loss aversion in decision-making. I coined the term ‘Solomon’s paradox’ to describe this asymmetry, named after the Jewish king. He showed great wisdom when it came to other people’s problems but also a great deal of folly when it came to his personal challenges.

As it turns out, wisdom doesn’t vary only between people who read about hypothetical scenarios in a laboratory. Even the same person typically shows substantial variability over time. Several years back, researchers asked a group of Berliners to report their most challenging personal issue. Participants also reported how they reasoned about each challenge, including meta-cognitive strategies similar to those described above. When inspecting the results, scholars observed a peculiar pattern: for most characteristics, there was more variability within the same person over time than there was between people. In short, wisdom was highly variable from one situation to the next. The variability also followed systematic rules. It heightened when participants focused on close others and work colleagues, compared with cases when participants focused solely on themselves.

These studies reveal a certain irony: in those situations where we might care the most about behaving wisely, we’re least likely to do so. Is there a way to use evidence-based insights to counter this tendency?

My team addressed this by altering the way we approach situations in which wisdom is heightened or suppressed. When a situation concerns you personally, you can imagine being a distant self. For instance, you can use third-person language (What does she/he think? instead of What do I think?), or mentally put some temporal space between yourself and the situation (how would I respond ‘a year from now’?). Studies show that such distancing strategies help people reflect on a range of social challenges in a wiser fashion. In fact, initial studies suggest that writing a daily diary in a distant-self mode not only boosts wisdom in the short term but can also lead to gains in wisdom over time. The holy grail of wisdom training appears one step closer today.

As the sun set over Toronto, the first Wisdom Task Force meeting was coming to an end. We made a strong start, finding a unified voice about wisdom’s psychological pillars, establishing a common language, and identifying best practices for assessment. Given the brevity of the meeting, many questions remained unanswered. Can there be artificial wisdom (and how would it be distinct from artificial intelligence)? Are the psychological pillars of wisdom always desirable? How exactly can insights about wisdom be applied during times of uncertainty and civic unrest? In the months that followed, the task force members began working on a report from the meeting, as the whole world started getting closer to midnight on the Doomsday Clock. The first half of the year 2020 has brought us bushfires in Australia, a worldwide pandemic, societal unrest, global economic fallout and counting. In such times, wisdom appears more needed than ever before. By deconstructing it, scientists can now turn an eye toward nurturing and sustaining wisdom in challenging times.”

The Persian ‘adab’ is often translated into English as ‘manners’ or ‘etiquette.’ “However, adab is about far more than politeness or ethics even,” Mana Kia, the author of Persianate Selves, writes in Persianate ‘adab’ involves far more than elegant manners. “It means proper social and aesthetic form and, across Persianate culture, form conveyed substance and, by extension, meaning.”

For six centuries, adab lived in the Persianate world (‘Persianate’ being the term scholars use to describe the culture of Persian as a transregional lingua franca) through its widely circulated texts, stories and poetry: the corpus of a basic education. To learn adab (these particular forms of writing, expression, gesture and deed), to identify their appropriate moments, and to embody them convincingly was to be an accomplished Persian.

“The widely read and influential Gulistan (1258), or Rose Garden, by the 13th-century poet Sa‘di, for instance, was an exemplar of beautiful prose writing but also a model of social conduct. From the Balkans to Bengal, from south-east Asia to Siberia, the Gulistan was the first Persian book that children read as part of their elementary education. Throughout Persianate Asia, it was also studied well into adulthood. Sa‘di’s stories and style entertained readers, but they also instructed them in adab. Many could be amused and grasp the lessons only in limited ways (or not at all). Some could prove themselves as the possessors of heart (sahibdil, the heart being the seat of understanding), and grasp the text’s underlying wisdom.”

Sa’di in a Rose Garden (Persian: سعدی در گلستان), from a manuscript of Sa’di’s Gulistan, Mughal dynasty, India (c. 1645); opaque watercolor, ink and gold on paper, 25.4 x 33.9 cm. Collection of The Freer Gallery of Art, Washington, D.C.

Adab was acquired through education. Learning enabled one to practise appropriate behaviours that, in turn, instilled desired dispositions. At the same time, not everyone could learn, or not to the same degree.

[…]

“To show possession of social adab, one had to know what to do and say in any given situation. This required knowing prized values, their relative relations to one another, and how to respond properly when positioned between more than one demand. In other words, to be a possessor of adab, a Persian had to know when to strive and when to accept, when to be silent and when to speak — and then how to speak the right way. Obviously, the adab — the conduct and speech appropriate in a given situation — wasn’t always straightforward, and sometimes the tension was the point. Those who could negotiate this tension most convincingly were the most admired (pointing to the always-present audience of other Persians). The terms were widely agreed upon among a wide swathe of people, from men of letters (such as poets, scholars and scribes) to men of power in this world (judges, officials, military leaders) and in the other, unseen world (such as Sufis), though there were frequent struggles and disagreements over interpretation on everything from politics to the meaning of history, or even what constituted good poetry.”

But just as adab could divide opinion, it could also bring people “together across lines of difference, both within and across the great multiconfessional and multiethnic empires of the early modern, eastern Islamic world. It was the hermeneutical ground of meaning (including morals) through which Persians across regions and religious communities understood and enacted themselves, and related to other people and to their social collectives.”

“Great art affirms our humanity, but also transcends it. From earliest days, Western and Byzantine art were bound up with iconography and faith. ‘Ad maiorem Dei gloriam’ was not just a conventional formula. In most cases, it was heart-felt. The achievements of the greatest masters were exercises in technique. They were also soul-deep,” Bruce Anderson writes in The great beauty of Donatello.

“That was never truer than in the case of Donatello. Outside the academy, Michelangelo has always enjoyed more fame. But considered purely as a sculptor, Donatello was at least his equal.”

Donatello’s bronze statue of David (circa 1440s, height: 158 cm) is famous as the first unsupported standing work of bronze cast during the Renaissance, and the first freestanding nude male sculpture made since antiquity. It depicts David with an enigmatic smile, posed with his foot on Goliath’s severed head just after defeating the giant. The youth is completely naked, apart from a laurel-topped hat and boots, and bears the sword of Goliath. Collection of the Museo Nazionale del Bargello, Florence.

Donatello’s best-known work is, undoubtedly, the nude statue of David. “The future ruler of Israel is not only portrayed as a beautiful boy, but also as one aware of his own looks. It is hard for us not to concentrate on the work’s homo-erotic qualities. Earlier generations sought to side-step this by references to the neo-Platonists. According to them, beauty could be admired without descending to the indecencies of sex. But that does not sound like Donatello: too ethereal. There is another possible aspect to this complex work. David matured to become a warrior, statesman, psalmist and libertine who could behave badly in pursuit of pleasure. Could that beautiful boy have been father to the man? Could he also have been grandfather to Absalom, who inherited his father’s self-indulgent qualities but not his formidable ones? How much did Donatello know about King David? It is impossible to know,” Anderson writes.

“There were many reasons, and not just sexual tolerance, why 15th Century Florence was a marvellous place to live. Few cities have ever known such artistic riches, within neighbouring streets and workshops. The young Donatello was there at the quickening of the Florentine Renaissance. He went to Rome with Brunelleschi, worked for Ghiberti and was a friend of Masaccio’s. Early Renaissance Florence only needed modern medicine to have been an earthly paradise. Donatello spent time in Siena, Padua and Venice, but remained in essence a Florentine, during the city’s most glorious years. We suspect that he generally revelled in all that. Thus inspired, with Terence, he could have proclaimed: ‘humani nihil a me alienum’ (I am human, and think nothing human is alien to me) — and humanity is eternally in his debt.”

The Shanghai architecture studio Roarc Renew has slotted two sweeping brick corridors between a pair of disused granaries from the 1950s in Jiaxing, China, to create the TaoCang Art Centre.

“The soul of every renewing project is to find out the hidden flow and go with it. It is as if the bright moon cannot be seen before clouds move away,” the architects write. “Certainly, the first thing is to identify which cloud should move away and how to move it away. This is the methodology Roarc Renew used in building renewing for the past years.”

“The two granaries witnessed the development of the whole town. So we want to protect this memory in an architectural way, rather than destroying everything,” Roarc Renew told Dezeen. “There are less and less ancient brick buildings in China now. We should learn and protect more.” (Photography by Wen Studio)
.
Tom Waits, performing live on stage at the Victoria Apollo, London, in 1981. (Photograph by David Corio, courtesy of Redferns/Getty)

“We live in an age when you say casually to somebody ‘What’s the story on that?’ and they can run to the computer and tell you within five seconds. That’s fine, but sometimes I’d just as soon continue wondering. We have a deficit of wonder right now.” — Tom Waits

Reading notes will be back next week, if fortune allows, of course. In the meantime, if you want to know more about my work with senior executives and leadership teams, please visit markstorm.nl. You can also browse through my writings and follow me on Twitter.


Mark Storm

Helping people in leadership positions flourish — with wisdom and clarity of thought