Post scriptum (2022, week 21) — Imaginology, memory, meaning, and the self, and Aristotle’s framework for transformation

Mark Storm
24 min read · May 31, 2022
“The composition of the new house keeps the spirit of Kaunas modernism alive as the circular windows designed in the concrete planes give the building the impression of modernism.” — House in Kaunas, Lithuania, by Architectural Bureau G. Natkevicius & Partners (Photograph by Lukas Mykolaitis)

Post scriptum is a weekly curation of my tweets. It is, in the words of the 16th-century French essayist and philosopher, Michel de Montaigne, “a posy of other men’s flowers and nothing but the thread that binds them is mine own.”

In this week’s Post scriptum: We need a new kind of approach to learning; memory, meaning, and the self; how Aristotle’s four domains of knowledge help us understand what transformation means; the myth of the white male scientific genius; made to measure; the risk of viewing sustainability through a market lens; how to build a culture that honours quiet time; a Roman object of our attention; and, finally, Jason Farago, critic at large for The New York Times, on the collectors’ hunger.

Imaginology

According to Stephen T Asma in Imaginology, it is time to initiate Imagination Studies, or ‘imaginology’, at every level of education. Studying the imagination is not only the most exciting and accurate way to heal the chasm that divides our view of human knowledge and human nature — between the sciences and the humanities — it also promises to reunite body and mind, reintegrate emotion and reason, and tessellate facts and values.

“According to the logic of [this] chasm, facts are the province of experimental science, while values are the domain of religion and art; the body (and brain) is the machinery studied by scientists, while the mind is a quasi-mystical reality to be understood by direct subjective experience; reason is the faculty that produces knowledge, while emotion generates art; STEM is one kind of education, and the liberal arts are wholly other.

These are no longer productive ways to organise knowledge in the 21st century.

Within the logic of the chasm, one way of thinking tends to be viewed as more capable of producing meaning: the scientific mind. But the literal, logical, scientific mind is the outlier — the weird, exceptional mode of cognition. It is not […] the dominant paradigm of human sense-making activity and yet it remains the exemplar of cognition itself and finds pride of place in our educational systems.

[…]

After years of working on the problem, and countless conversations, it seems to [Asma] that what is required is a third path: to enter the chasm itself, or descend deeper into a submerged mythopoetic cognition, and develop an entirely new way of understanding learning that embraces the true engine of the mind — imagination.”

But what exactly is imagination?

“Imagination is as imagination does,” Asma writes. “If we treat the imagination as merely a faculty of the mind, then we will miss the dynamic action-oriented aspect: it is part of the organism’s pragmatic attempt to get maximum grip on its changing environment. We are also likely to misunderstand the way it recruits from many brain-processing areas, such as perception, emotions, motivational/conative areas, memory, image representation, executive planning, and so on — ie, it is distributed. But though it would be wrong to view imagination as only a faculty of the mind, it is indeed a brain-based (embodied) system of capacities and applications. It has an involuntary mode (ie, mind-wandering and dreaming) and a voluntary mode (governed by conscious goal-direction).

Broadly stated, the imagination has five steps: mimicry; abstraction/decoupling; recombination; expression; and social feedback.

First, our neural mirror-system generates embodied mimicry of our perceptions. Then representational techniques such as drawing or language decouple those mimicked experiences from their original contexts. Next, our combinatory cognition blends and mashes novelty (involuntary or voluntary), and then — in the final two stages — those novel combinations are expressed and read against social feedback. In this way, imagination does not just redescribe a world, but regularly makes a new world. This world-making ability of imagination — its ability to generate Umwelten (perceptual worlds) — is why it should stand as the interdisciplinary foundation underlying both art and science. The more we understand imagination as core cognition, the more we recognise the artificiality of the ‘two cultures’ divide.”

“From the time of Sigmund Freud and Carl Jung and to our present System 1 theory of fast, instinctual cognition, psychology has acknowledged and explored the submerged irrational aspects of mind. But this has had little impact on education. The pre-rational mind is treated as a liability rather than a resource to be cultivated,” Stephen T Asma writes in Imaginology. (Painting: Victory Boogie Woogie, New York, 1942–1944, the last, unfinished work by Piet Mondrian; oil and paper on canvas, 127 × 127 cm. Collection of the Kunstmuseum, The Hague.)

“The core cognition of imagination is quite different from propositional cognition, that is, the ways we manipulate linguistic representations. It’s also distinct from the affective or emotional springs of mind that have been tracked by those studying embodied cognition or, more recently, affordances. Linguistic philosophers and computationalists have been moving from the top down, while affective neuroscience has been moving from the bottom up. A huge middle layer of cognition is missing between the lower conditioned associational mind and the upper symbolic representational mind. That middle layer is the imagination.

The imagination is where our cognitive architecture of imitation (eg, mirror neuron simulations, and matching vertical associations) is structured by narrative and image-based templates. These templates are sense-making tools that are imperatively (rather than indicatively) oriented, they are action-oriented representations. They do not just represent an historic event long ago. Nor are they symbolic in the way that mathematics signifies concepts. And they are not even like words that signify through denotative reference to people, places, things, events. Rather, these templates are imperative enactive symbols, demanding attention and action of us, or otherwise intervening in a causal fashion. A compelling character in a story or painting might inspire me — even if this inspiration is only just on the cusp of conscious awareness — to act differently by emulating or avoiding their behaviour. As such, these root templates of imagination are hard to see and examine. They are active in involuntary imaginings in dreams and mind-wandering (where agency and executive control are low), but they are also deeply embedded in the cultural forms we produce and consume, including folklore, religion, literature and film.

Imperatively oriented hot cognition is ancient, predating the rise of language, logic and even the expanded neocortex. It is closer to how animals get around in the world. It’s the limbic life of gut feelings and rapid responses, helping us detect quickly who is a friend, an enemy, a sexual partner, and more subtle social relations: who is a good hunter, who is reliable, who owes me, and how I should treat this approaching person right now. The mind, from this perspective, evolved to be a so-called ‘hedonic sharpener’ rather than an information processor. A hedonic sharpener reduces experiential noise, bringing each repetition of trial-and-error learning closer to pleasure or satisfaction (or, more broadly, homeostasis). The mind tries to maximise positive affect and reduce negative affect.

In my view, this is also the core of sense-making or meaning-making activity and, once recognised, we can see that imaginative work such as storytelling, image-making, song, dance and so on are some of the earliest and continually powerful forms of knowledge. An epistemology that cannot recognise this and pushes imagination to the peripheral territory of aesthetics has failed to understand the biological mind.

[…]

It’s time to give imagination its due as a core cognitive power, epistemic workhorse, therapeutic wellspring and maker of adventures. In the end, the institutionalised ‘chasm’ between forms of education is entirely of our own making and, ironically, a creation of our outdated imaginations. The yawning gulf resembles the fictional schism of the human into dualistic parts.

The chasm metaphor is no longer helpful as we consider new forms of education, but geological imagery is still valuable. Consider a different metaphor: imagination as the ‘plate tectonics’ of mind and culture. From this submerged mythopoetic perspective, the divisions between traditional disciplines shrink. Fractured territories on either side of a great divide become mere continents riding on the hidden motions of creativity below. Imaginology beckons us deeper.”

Memory, meaning, and the self

“It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied,” John Stuart Mill wrote in his 1861 treatise Utilitarianism [A Pig, a Fool, and Socrates]. Most of us are inclined to agree with him. But why?

According to the British philosopher Roger Scruton, “[t]he final end of every rational being is the building of the self” — a job that his fellow contemporary philosophers refer to as self-constitution, Jim Holt writes in This Is All.

“But who says we have to do this job? Why can’t we shirk it, living happily-go-luckily, like a pig, in the here and now?

Well, some people do claim to shirk it [but] most people seem to take the task of self-constitution seriously. They peg away at it even at great cost to their happiness. How do we know this? Listen to what people say when they talk about what it means to live well. People say they want to be happy. They want their needs filled; they want to be free from suffering; they want their life to be enjoyable from moment to moment, to have ‘positive hedonic tone.’ In this they are not dissimilar to pigs.

But people also say they want their lives to be meaningful. And judging from the way they talk about meaning, this seems to be distinct from happiness. Indeed, meaning and happiness can be in conflict — so the psychological evidence tells us. In surveys conducted by the social psychologist Roy F. Baumeister, people correlate the activities that fill their lives with these two goals. And it turns out that activities that increase meaning can reduce happiness — and vice versa. The most striking example is the activity of raising children, which reliably diminishes measured happiness, both from moment to moment and on the whole. Then why do people do it? This has been called the ‘parenthood paradox.’ And its resolution is simple: people have children because doing so gives meaning to their lives.

So happiness and meaning are our two masters. They are distinct wishes, since we associate them with different activities. And neither dominates the other: we sometimes sacrifice happiness for meaning, and we sometimes sacrifice meaning for happiness. We might put this conclusion into an equation:

Well-being = happiness + meaning

Exactly what do people have in mind when they talk about meaning? A familiar thought is that meaningfulness has to do with a feeling of being connected to ‘something larger.’ But Baumeister’s research suggests that it has equally to do with ‘expressing oneself and thinking integratively about past and future.’ Just thinking about the past and future tends to raise people’s sense of meaningfulness — while lowering their happiness.”

“At the outset, we supposed that what makes Socrates different from a pig is that Socrates has autobiographical memories, and that these memories furnish raw materials for constructing a self that endures over time, from birth to death. This construction is hard work, we are told, and worrying about it may detract from one’s piggish happiness in the moment. But it is necessary,” Jim Holt writes in This Is All. (Illustration: Rosette inscribed with the names and titles of Mughal emperor Shah Jahan, India, c. 1645; ink, opaque watercolor and gold on paper, 38.6 x 26.5 cm. Collection of The Metropolitan Museum of Art, New York)

“Which brings us back to memory and the self. Let’s say that autobiographical memory furnishes the raw materials for self-constitution. How do we make those raw materials into a meaningful self? The fashionable answer these days, among both psychologists and philosophers, is narrative. To be a self, the thinking goes, is to live a life that is structured like a story — a story of which you are both author and protagonist.

The idea that ‘living well’ means ‘living narratively’ is prefigured in Nietzsche, who wrote that ‘we want to be the poets of our lives.’ It can be traced back to the earliest times in which humans started worrying not just about eating and reproducing but about living a life that matters — a life like that of Achilles, which, though perhaps short and full of destructive emotions, was songworthy. […]

Some philosophers — among them Alasdair MacIntyre, Paul Ricoeur, and Charles Taylor — have insisted that if a narrative is to endow a human life with meaning, it must take the form of a quest for the good. But what makes such a quest an interesting story? There had better be some trouble in it, because that’s what drives a drama. If adversity doesn’t figure prominently in your autobiographical memories, your life narrative will be a bit insipid, and your sense of meaningfulness accordingly impaired.

The claim that big troubles are essential ingredients of a good narrative, and hence of a good life, is called by psychologists the ‘adversity hypothesis.’ If true, this hypothesis ‘has profound implications for how we should live our lives,’ observes the psychologist Jonathan Haidt: ‘It means that we should take more chances and suffer more defeats.’ It also means, Haidt adds, that we should expose our children to the same.

But adversity can be taken too far. We don’t want our lives to assume the form of tragedy (easy as it is to weave our autobiographical memories into such a grim narrative, especially when lying awake at three am). Meaning, after all, is not everything; happiness counts, too.”

A philosopher’s guide to messy transformations

In Metaphysics (Book 1) and Nicomachean Ethics (Book 6), the ancient Greek philosopher Aristotle made a distinction between expertise, science, wisdom, and prudence. Together, they provide a framework for understanding what Pia Lauritzen calls the ‘messy middle’ of transformation: “the unknown, uncertain space between past and future, between theory and practice, that characterizes organizations in the process of becoming something new.” To navigate this messy middle of transformation, leaders must understand how the different knowledge domains contribute to the transformation — and help their employees find a way to be part of the same conversation, she argues in A philosopher’s guide to messy transformations.

“Typically, when presented with four quadrants, the one you want to be in, or arrive at, is the upper right (so you’re indexing positively on both attributes). But, in this Aristotelian approach, you must resist the temptation to favor one quadrant over the others. Instead, you must engage and motivate those who are going to carry out the transformation by starting in the bottom left — the domain of expertise — and use this as a starting point to draw on knowledge from the other quadrants to drive the transformation forward,” Pia Lauritzen writes in A philosopher’s guide to messy transformations.

“In the domain of expertise, people base their understanding on practical insight into the history and culture of the company. A question from an attendee on the panel I conducted illustrated this nicely: ‘How do you get an organization with a legacy of being extremely risk averse to embrace agility, which can be perceived as a more risky, trial-and-error approach?’

The question acknowledges and accepts that the company needs to embrace agility but demonstrates neither insight nor interest as to why it needs to do so. Whether the questioner trusts senior management’s decision to embrace agility, or she has other reasons for ignoring the ‘why,’ it is obvious that she wants to know about the ‘how.’ Too often leaders forget about the how. And that can be a costly mistake. If leaders assume they know what questions their employees need answered to drive the transformation forward rather than listen to what they actually ask, their initiatives will have little or no impact.

[…]

In the domain of science, people ask questions that seek to elicit an understanding of why companies approach transformation the way they do. An example of such a question from the panel discussion was: ‘Agile methodologies were formed in the software development world and have eventually become a standard that companies and teams in almost any industry are driving towards. Why do you think it has had this proliferation, and do you think it makes sense?’

The person answering the question started out by saying, ‘I understand agile as a looser policy, while scrum and other specific methods may be more sector-specific.’ There is an assumption here that the person asking the question has a theoretical knowledge of universal frameworks and methods and knows what terms like scrum, SAFe, and ART mean. However, the answer doesn’t show any insight into the practical circumstances in which transformation happens. Leaders who use consulting lingo when answering their employees’ questions risk sending a signal that they are more interested in discussing theoretical problems than they are in finding practical solutions.

To help people co-create a shared language, leaders must bridge the gap between the employees who know the theoretical frameworks and those who don’t. One way for leaders to do that is to adjust their communication to the people they are talking to. Another is to use words that everyone understands. For example, instead of referring to concepts like agile, leaders can talk about the importance of understanding and adjusting to customers’ changing needs.

The domain of wisdom is different from the other knowledge domains in that it draws on a completely different vocabulary. To find wisdom in something that is new to everyone, like digital transformation, leaders sometimes look for inspiration among thought leaders who are discovering new ways of thinking and talking about transformation rather than teaching existing theories. Typically, people representing the domain of wisdom use and combine terms from professional areas that are foreign to the organization. Consider, for example, regenerative leadership, an expression developed by sustainability expert Laura Storm. In her TEDx Talk (2019), she described it this way: ‘I see the dawn of a new leadership logic, a new paradigm, a new generation of leaders. And their secret is that they apply the logic of life itself to how they run organizations.’ The term was born in the world of environmentalism but has since gained currency well beyond it.

Leaders who use consulting lingo when answering employees’ questions risk sending a signal that they are more interested in theoretical problems than they are in practical solutions.

[…]

This does not mean that leaders should not be inspired by abstract thinkers — if I believed that, I wouldn’t have introduced Aristotle. It just means they should be careful not to assume that it is as inspiring to their employees as it is to them.

In the domain of prudence (good judgment), people depend on neither theoretical frameworks nor translation of basic principles to make sense of complexity. Instead, they use the inspiration they get from everyday practice, consultants, and thought leaders to talk about the transformation in a language everyone understands.

An example of this from the panel was an attendee observing, ‘In my experience the greatest threat against change is being busy. There is such a resistance against slack in our schedules and plans. Downtime is still considered ineffective. We invest so heavily in change, but we don’t allow for time to learn. What is your take on this?’ The question demonstrated insight into the basic principles of effective transformation without mentioning either theoretical concepts such as agile and scrum or foreign concepts such as cynefin and teal.

The question shows how insights from the domain of science and wisdom can shed light on a problem in the practical domain. This person asking the question is, in essence, a catalyst for change. The question provides a solution in itself. And leaders who are listening carefully will respond by giving people the time to learn and adjust. While internal and external catalysts play a vital role in translating between the different domains, the leader is responsible for driving the transformation forward by giving people what they need to change.

Transformations often fail because leaders don’t acknowledge the importance of inviting people with different ways of thinking and talking about transformation to join a common conversation. None of the four knowledge domains can exist without the others, and leaders depend on all of them to navigate the messy middle of transformation. As Aristotle would say, ‘All learning comes about from already existing knowledge.’ Which is why leaders shouldn’t introduce and promote new frameworks when driving transformation. Instead, they must help their employees share their already existing knowledge and collaborate on turning it into new ways of working.”

In the margins

“Art cannot exist without its creator, whatever else they may have thought or done. But knowledge — especially the natural laws of physics and mathematics — is discovered. Why mark it with the personal lives or beliefs of the individuals who found it?

As the historian James Poskett points out in Horizons, which tells the story of the global roots of modern science, there are many good reasons to do just that. The call to ‘decolonise’ subjects by acknowledging their cultural context is seen by some as needlessly political, but Poskett argues that science was already politicised. The idea that scientific revolutions are the preserve of the European male genius — Newton, Darwin, Copernicus, Galileo, Einstein — is, he argues, a political project to reinforce the idea that people who support a particular system of government, or live on one side of a border, are more curious, inventive and adept than others.

[…]

What purpose does this myth serve? Science has always been an instrument of power — as Poskett explains, the ability to create a calendar or understand the pharmacology of a certain plant can have far-reaching implications. In the 20th century the power of science became increasingly evident, as ever more technical learning allowed for ever more destructive weapons. With the arrival of the Cold War it became necessary to pretend there was such a thing as Soviet science, or that Islamic science belonged to some past ‘golden age,’ or that Europe was the only place where a renaissance of knowledge happened in the 17th century (it happened everywhere from Timbuktu to Tibet, and the ‘renaissance’ wasn’t named until everyone involved had been dead for 200 years). The truth was far more complex, international and diverse, but the myth was easier to understand. The story of the apple tree is easier to explain than the inverse square law.

But if science is now constrained by a reverence for the past, it’s not the first time this has happened. During the medieval period, studying science or medicine meant reading ancient texts in Latin and Greek; it was the breaking of these traditions that enabled a new age of discovery. An honest conversation about the history of science is therefore not just of moral importance — it is part of what makes discovery possible.”

From: The myth of the white male scientific genius — and why its time is up, by Will Dunn (The New Statesman)

“When I think about what measurement means in today’s society, how it’s used and misused, and how we internalise its logic, I often end up thinking about a single figure: 10,000 steps. It’s often cited as an ideal daily target for activity, and built into countless tracking apps and fitness programmes. Walk 10,000 steps a day, we’re told, and health and happiness awaits.

This number is presented with such authority and ubiquity you’d be forgiven for thinking it was the result of scientific enquiry, the distilled wisdom of numerous tests and trials. But no. Its origins are in a marketing campaign by a Japanese company called Yamasa Clock. In 1965, the company was promoting a then-novel gadget, a digital pedometer, and needed a snappy name for their new product. They settled on manpo-kei, or ‘10,000-steps meter.’ But why was this number chosen? Because the kanji for 10,000 — and hence the first character in the product’s Japanese name, 万歩計 — looks like a figure striding forward with confidence. There was no science to justify 10,000 steps, it seems — just a visual pun.

If the 10,000 steps are an illusion, though, they are a useful one. Research into how many steps a day we should pursue offers more finely graded targets — they say 10,000 steps is too low for children and too high for many older adults. Still, it’s abundantly clear that any increased activity is good for us, and that people who do pursue a daily target of 10,000 steps have fewer signs of depression, stress and anxiety (even if they don’t hit that goal). In this light, the Quantified Self crowd seem to have a point: if you want to reach people, you need to speak in a language they understand.

When thinking about measurement in today’s world, the German sociologist Hartmut Rosa suggests it is characteristic of a particular 21st-century desire: to structure our lives through empirical observation, rendering our interests and ambitions as a series of challenges to overcome. ‘Mountains have to be scaled, tests passed, career ladders climbed, lovers conquered, places visited, books read, films watched, and so on,’ [Rosa] writes. ‘More and more, for the average late modern subject in the developed western world, everyday life revolves around and amounts to nothing more than tackling an ever-growing to-do list.’

This mindset, says Rosa, is the result of centuries of cultural, economic and scientific development, but has been ‘newly radicalised’ in recent years by digitalisation and the ferocity of unbridled capitalist competition. Measurement has been rightly embraced as a tool to better understand and control reality, but as we measure more and more, we encounter the limits of this practice and wrestle with its disquieting effects on our lives.

My interest in the history of metrology began as a simple curiosity about the origin of certain units of measurement. Why is a kilogram a kilogram, why an inch an inch? But these questions have a deeper resonance, too: if measurement is the mode by which we interact with the world, it makes sense to ask where these systems come from, and if there is any logic to them.

The answer I’ve found is that there isn’t any — not really. Or rather, there is logic, but, as with the 10,000 steps, it’s as much the product of accident and happenstance as careful deliberation. The metre is a metre because hundreds of years ago certain intellectuals decided to define a unit of length by measuring the planet we live on. As it happens, they made mistakes in their calculations, and so the metre itself is around 0.2mm short: a minute discrepancy that has nevertheless been perpetuated in every metre ever since. In other words: it is the way it is because we say it is. Measures, then, are both meaningful and arbitrary; iron guides in our lives that are malleable if we want them to be. If they don’t work — if they don’t measure up — then they too can be remade.”

From: Made to measure: why we can’t stop quantifying our lives, by James Vincent (The Guardian)

“On the level of language, which can guide behavior and outcomes, the term ESG is fairly meaningless. It’s an acronym for categories of things companies should work on. That’s likely a part of why investors like it — they can look like they’re talking about real progress on environmental and social issues without saying much at all. Efforts can easily drop into incremental approaches that may be worse than nothing. As Paul Polman, my coauthor on the book Net Positive, likes to say, ‘So if I killed 10 people before but only five now, am I a better murderer?’

When a company announces, in essence, ‘We’re doing ESG,’ what does that tell you? It’s like saying ‘We do HR.’ OK, so you have a human resources department and a senior vice president running it, but what are you doing with your people? Investing in them? Helping them find their purpose? Or maybe laying off all permanent staff — over Zoom! — to replace them with temps?

We clearly need to imbue ESG with meaning. We need sustainable or regenerative or net-positive ESG. Of course, these terms also need details behind them, but at least they tell you something about the direction in which you’re headed.

But I have a larger philosophical concern with investor-led language. Seeing all things through the lens of markets and the quest for shareholder maximization is largely how we got into this mess in the first place. We’ve put profits above literally all else, and it’s leading to ecological collapse and vast inequality. Framing a company’s commitments around battling climate disaster in investor terms turns it into an exercise of ‘Does this create shareholder value?’ — which is not beside the point but skews the world dramatically. Sure, shareholders should do well, but only after a company has served a purpose for stakeholders and helped protect the world and resources we all rely on to survive and thrive.

Investors aren’t well positioned for this approach. Just as fossil fuel companies should not lead the planning of our energy future, it seems unwise to let finance lead the journey to a humane, more just, less greed-filled form of capitalism.

This isn’t all just semantics. If we talk mostly in broad terms about what we’re doing and not in concrete, science-based ways about how fast we need to cut carbon or improve human rights, where are we, exactly? That said, even though I’m a writer, and words and rhetoric matter to me, I’ve always cared far more about outcomes. If your company’s carbon emissions are declining quickly and it’s paying living wages, working in its sector to find larger solutions, lobbying for the kinds of policies that help create systemic change, working to defend democracy and science, and so on … then you can call your efforts any number of things, so long as it works for you. Walmart, for example, has embraced becoming a ‘regenerative’ company — and if that motivates the organization, fantastic.

Either way, the moral and business imperative for leaders today is to focus on what really matters: action at the speed and scale we need to build a net-positive world.”

From: What’s Lost When We Talk ESG and Not Sustainability, by Andrew Winston (MIT Sloan Management Review)

“Across our society today, norms of noisiness run deep. Demands like constant connectivity and maintaining a competitive advantage still prevail in most office cultures. Few organizations prize or prioritize pristine human attention. But there are simple strategies we can employ in order to find our own personal sanctuaries and to shift broader cultures. By reclaiming silence in the workplace, we can create the conditions for reducing burnout and enhancing creative problem solving.”

“If we want organizational cultures that honor quiet, there are a few general principles we need to apply to make the transformation. The first is that we have to deliberately talk about it; we need to have clear conversations about our expectations around constant connectivity, when it’s permissible to be offline, and when it’s acceptable to reserve spaces of uninterrupted attention. These conversations can get into deeper cultural questions like whether it’s possible to be comfortable in silence together rather than always trying to fill the space, or whether it’s OK to be multitasking when another person is sharing something with you.

[…]

Starting a conversation about shared quiet doesn’t just mean seizing the opportunity to point fingers at other people’s noisy habits. The best starting point for a conversation on group norms is a check-in with yourself. How are you contributing to the auditory and informational noise facing the greater collective?

Maybe you unwittingly leave ringers and notifications on full blast. Maybe you ‘think out loud’ or habitually interrupt others. Perhaps you impulsively post on social media or send excessive texts or emails that require responses. Maybe you play music or podcasts in common spaces without checking in with others or jump on important work calls while your daughter is sitting next to you doing her homework.

Take some time to question whether any given habit that’s generating noise is necessary or if it’s really just an unexamined impulse — a default that needs to be reset. If your self-observation doesn’t yield clear insights, ask a truth-teller in your life for observations about how you could do better.”

From: ‘How to Build a Culture That Honors Quiet Time,’ by Justin Zorn and Leigh Marz (Harvard Business Review)

“The account has long been that a shipwreck in the ancient past had sent a precious Roman marble statue — a rare copy of a fifth century B.C. depiction of the ‘Doryphoros,’ or spear bearer — into the depths of the seas off Italy.

That was the account given in the late 1970s when the statue materialized out of the blue at the Glyptothek in Munich, the city’s museum of ancient Roman and Greek art. A dealer had lent it to the museum in anticipation of a possible sale, and the story he told then was of a statue that had been rescued from the ravages of seawater and held in a private collection, where for decades it escaped attention.

And that was the account endorsed by officials of the Minneapolis Institute of Art when they purchased the statue for $2.5 million in 1986 and installed it as a signature artifact in a showcase gallery.

But now Italian authorities are pushing the Minneapolis museum to return the work, asserting it was actually illegally excavated from a site near Pompeii in the 1970s.

[…]

‘The statue doesn’t show signs of having been under salty seawater for a long time,’ [Gabriel Zuchtriegel, the director of the archaeological site at Pompeii, told Elisabetta Povoledo], citing the impact such corrosives would have had on the marble. ‘This comes from the land.’”

From: ‘Italy Says Ancient Statue in U.S. Museum Was Stolen, Not Lost at Sea,’ by Elisabetta Povoledo (The New York Times)

The object of all this attention is one of the few ancient copies of a revered original by the Greek artist Polykleitos, which has long been embraced as one of the most important works of Classical antiquity, celebrated as an example of a perfectly proportioned body (detailed by the sculptor in ‘The Canon,’ a companion treatise to the statue). Of the copies that exist, the one in Minneapolis, thought to have been created between 27 B.C. and A.D. 68, is considered one of the best preserved — The Doryphoros (after Polykleitos), 27 BCE–AD 68, by an unknown Roman artist; Pentelic marble, copy of a Greek bronze statue of c. 450–440 BCE. Collection of the Minneapolis Institute of Art. Photography (above and below) by Jenn Ackerman for The New York Times.
“No longer does museum validation or scholarly attention determine a painting’s value. Now, the collectors’ hunger comes first, and institutions must follow,” Jason Farago argues in Catch a Rising Star at the Auction House. (Painting: Portrait of a Lady (After Louis Leopold Boilly), 2019, by Ewa Juszkiewicz; oil on canvas, 200 x 160 cm. Recently sold at Christie’s for $1.56 million, more than five times its high estimate. Photograph courtesy of Christie’s, New York)

“You remember Oscar Wilde’s aphorism from Lady Windermere’s Fan: The cynic knows the price of everything and the value of nothing. Culture was one of the last domains in neoliberal times that tried, at least a little, to hold up a distinction between the two, between, to put it bluntly, the market and our lives. The cynics of this digital age have had their ultimate victory by rendering price and value synonymous, and we’re in serious trouble if our cultural institutions, on the altar of inclusion and anti-elitism, accelerate their own capitulation to acclaim via algorithm.” — Jason Farago in Catch a Rising Star at the Auction House

Post scriptum will be back next week, if fortune allows, of course.

If you want to know more about my work as an executive coach and leadership facilitator, please visit markstorm.nl. You can also browse through my writings, follow me on Twitter or connect on LinkedIn.


Mark Storm

Helping people in leadership positions flourish — with wisdom and clarity of thought