Post scriptum (2022, week 16) — The carbon footprints of the rich, has neoliberalism come to an end?, and objective judgements of taste

Mark Storm
Apr 25, 2022
Oscar Niemeyer’s final building opens in a vineyard in Aix-en-Provence, France — “Designed in 2010, two years before Niemeyer’s death aged 104, the curved white pavilion was created to sit comfortably within the landscape,” Tom Ravenscroft writes. (Photography by Stéphane Aboudaram / We Are Content(s))

Post scriptum is a weekly curation of my tweets. It is, in the words of the 16th-century French essayist and philosopher Michel de Montaigne, “a posy of other men’s flowers, and nothing but the thread that binds them is mine own.”

In this week’s Post scriptum: Why we need to talk about the carbon footprints of the rich; has neoliberalism come to an end?; the question of subjectivity and objectivity in aesthetic judgements; a guide to growing older; we have a creativity problem; when an institution punishes internal dissent, it shoots darts into its own brain; a Japanese ‘House of Memories’; and, finally, Susan Cain’s Bittersweet and the idea of longing.

The carbon footprints of the rich

Dramatically unequal consumption lies at the heart of the climate crisis, the founding director of End Climate Silence, Dr. Genevieve Guenther, argues in We Need To Talk About The Carbon Footprints Of The Rich.

“[O]ne of the most powerful things you can do is talk about the climate crisis in your networks. But according to many climate activists, the one thing you should not do is discuss people’s personal carbon footprints. Talking about individual carbon footprints, these activists argue, is, at best, a distraction from the essential work of raising a climate movement and, at worst, a naive and counterproductive embrace of propaganda developed by oil and gas companies to dishearten people and divert them from building a movement for collective action. But this view of climate communication and carbon footprints rests on the mistaken idea that there is a universal individual whose personal carbon footprint is always an irrelevant distraction,” says Guenther.

“But there is not only one kind of individual in the world — not everybody is so inextricably entangled in the fossil fuel system that they have no choice but to emit too much carbon. Individuals are situated in their class; their identities are inflected by their privilege. ‘Driving’ signifies something very different for the American worker at a big-box store who is forced to commute in her car to the mall versus the private equity manager speeding a gleaming Lamborghini around the cliffs of the Italian Riviera. One is the expression of entanglement in an exploitative economic system that makes it impossible not to emit carbon; the other is the expression of the injustice of that very system.”

“The discretionary carbon footprints of the 1% are not only unjust on a symbolic level. They are also quite literally a material cause of the climate crisis,” Genevieve Guenther writes in We Need To Talk About The Carbon Footprints Of The Rich. (Painting: John Orde, His Wife Anne, and His Eldest Son William, between 1754 and 1756, by Arthur Devis; oil on canvas, 94 x 96.2 cm. Collection of the Yale Center for British Art, New Haven)

“Dramatically unequal consumption lies at the heart of the climate crisis. In calling for justice, the climate movement must call for the wealthy to reduce their individual carbon footprints — or have their individual carbon footprints reduced by regulation — to as close to zero as possible,” Guenther writes.

“People who are locked in or stuck should never be made to feel ashamed, frustrated or helpless. And no one should embrace moral absolutism. Movements are built from connections, and connections are made when people approach each other with empathy, embracing their common imperfections and ambivalences, their shared complicity and entanglement. Having the courage to start thinking about the climate crisis at all is hard enough: People entering the climate movement should be welcomed into a community of care.

Yet for us to have any chance to resolve the climate crisis, the climate movement needs to call for climate justice — for new norms and policies targeting the luxury consumption of the super-rich and the upper-middle-class consumption that emulates it.”

“The personal emissions of the top 0.001% ‘can have the same impact as nationwide policy interventions,’” Genevieve Guenther writes in We Need To Talk About The Carbon Footprints Of The Rich. (Painting: John Jennings Esq., his Brother and Sister-in-Law, 1769, by Alexander Roslin; oil on canvas, 121 x 148 cm. Collection of the Nationalmuseum, Stockholm)

“Resolving the climate crisis will require more than innovation. It will require remaking our systems — including our class system, or at least the unequal levels of consumption that our class system justifies. Ultimately, this transformation will be delivered by government policies in the context of international negotiations. But it requires a revolution in values, too,” argues Guenther.

“To change the system is to transform its social norms and ideological assumptions as much as it is to transform its means of production and consumption. This work has historically preceded the passage of policy, which only later codifies new norms in legislation whose goals have been so ‘normalized,’ as it were, that opposition or reversal becomes fringe or even unthinkable.

We have to make it normal not just to use zero-carbon forms of energy, but also to pursue our ambitions and enjoy our pleasures without making global heating worse. The material possibility for that life will be produced only by policy, but its cultural and imaginative possibility will be created only by behavior.

The climate crisis is profoundly unfair. The wealthy are currently destroying the Global South — and, if nothing changes, eventually the whole planet — for their own profit and pleasure. Most of their voracious consumption is entirely voluntary. We need to start talking about the personal carbon footprints of the rich and, as much as we can, walking our talk in order to resolve the climate crisis in time to have a livable future.”

Has neoliberalism really come to an end?

The Nation published a conversation with the historian Gary Gerstle about understanding neoliberalism as a bipartisan worldview and how the political order it ushered in has crumbled.

“The term ‘neoliberalism’ is often used to condemn an array of economic policies associated with such ideas as deregulation, trickle-down economics, austerity, free markets, free trade, and free enterprise. As a political movement, neoliberalism is seen as experiencing its breakthrough 40 years ago with the election into office of Ronald Reagan and Margaret Thatcher. And since the 2007-08 financial crisis, an explosion of academic work and political activism has been devoted to explaining how neoliberalism is fundamentally to blame for the massive growth in inequality.

Yet [in his new book, The Rise and Fall of the Neoliberal Order: America and the World in the Free Market Era, Gerstle argues] that this understanding of neoliberalism struggles to explain why it has exerted such a profound influence on both the left and the right. Gerstle […] thinks neoliberalism should be understood as a worldview that promises liberation by reconciling economic ‘deregulation with personal freedoms, open borders with cosmopolitanism, and globalization with the promise of increased prosperity for all.’

Such a vision, as Gerstle relates, was able to attract such strange bedfellows as Steve Jobs and Barry Goldwater, Ralph Nader and Ronald Reagan, and Bill Clinton and Newt Gingrich. When seen as a worldview, Gerstle contends, neoliberalism can trace its origins just as much to the left, and in particular the New Left, as to the right. People across the political spectrum […] had a common goal: the end of a bureaucratized world,” Daniel Steinmetz-Jenkins writes in the introduction to the conversation.

“The power of — and the fear unleashed by — the communist threat is now largely forgotten. Few accounts of neoliberalism treat the fall of the Soviet Union between 1989 and 1991 or the collapse of communism as capitalism’s chief global antagonist as seminal events. But they were,” says Gary Gerstle. “[I]t removed what had been an imperative in America (and in Europe and elsewhere) for compromise between elites and the working classes.” (Photograph: The opening of the Berlin Wall in Berlin, Germany, in November 1989, by Patrick Piel / Gamma-Rapho via Getty Images)

Daniel Steinmetz-Jenkins: Over the last decade, few topics on the left have received more attention and stirred more debate than the subject of neoliberalism. Unlike some critics, you believe that neoliberalism is still a legitimate term of scholarly analysis in regard to understanding contemporary politics — rather than the pejorative, catch-all term others have deemed it. Why do you believe this is the case, and, in a nutshell, how do you define it?

Gary Gerstle: Neoliberalism is a creed that prizes free trade and the free movement of capital, people, and information. It celebrates deregulation as an economic good that results when governments are removed from interfering with markets. It valorizes cosmopolitanism as a cultural achievement, the product of open borders and the consequent voluntary mixing of large numbers of diverse people. It hails globalization as a win-win proposition that both enriches the West and brings an unprecedented level of prosperity to the rest of the world. It tolerates economic inequality and justifies the weakening of labor movements, welfare policies, and other ‘impediments’ to free market capitalism in the name of economic growth robust enough to lift all boats. These core principles deeply shaped American politics across the last 50 years.

The label ‘conservative’ is often attached to the aforementioned beliefs. But conservatism, in the classical sense of the term, connotes respect for tradition; deference to existing institutions and the hierarchies that structure them; and suspicion of change. Neoliberalism, on the other hand, calls for unleashing capitalism’s power, along with entrepreneurialism and other forms of risk-taking, and eliminating institutions that stand in the way.

Invoking neoliberalism allows us to shift the focus somewhat away from narratives that have dominated so much history writing — white southerners, for example, seeking to maintain racial privilege in the era of civil rights, or evangelicals pushing back against women’s, gay, and sexual liberation movements — and toward equally important stories that focus on venture capitalists, Wall Street ‘modernizers,’ and information technology pioneers. That shift in emphasis, my book suggests, is overdue.

DSJ: Your understanding of neoliberalism goes against many of the dominant interpretations of it. For instance, many argue that what made neoliberalism ‘new’ is that it broke with the ‘old’ classical liberalism of the nineteenth century, which typically is associated with freeing markets from state regulation and interference. On this reading, the early neoliberals, perhaps most notably Friedrich Hayek, realized that — given mass enfranchisement, labor unions, and socialist parties — only strong states could protect and shield free markets from democratic forces. However, you see a strong connection between classical liberalism and neoliberalism. Can you explain this connection, and why, if it is so strong, the term ‘neoliberalism’ is even necessary?

GG: Classical liberalism is thought to be an emancipatory movement seeking to remove the heavy hand of the state, in the form of monarchs and mercantilists, from civil society. Neoliberalism is thought to be a repressive movement that uses the state to enforce capitalist prerogatives on ‘unruly’ democratic populations.

This dichotomy is overdrawn. We now know (from the excellent work of a generation of historians and political scientists) that governments were as necessary to construct and supervise markets in the 19th century (the era of classical liberalism) as they are today. Markets may emerge from what Adam Smith once described as the propensity of people to ‘truck, barter, and exchange,’ but they can only flourish within a context of government-enforced rules. ‘Laissez-faire’ is a political and economic project, not a condition of nature. It has always been thus.

By the same token, it is a mistake to treat neoliberals of the past half century as being exclusively concerned with order and domination, and with constraining (and sometimes undermining) democracy. In many of them a spirit of individualism and freedom reminiscent of classical liberalism still lives. This is especially true in the United States where, as Michel Foucault once observed, liberalism has always been everywhere, sprouting on the left as well as on the right, never confined to one party or school.

My book takes Foucault’s insight as inspiration: It argues that neoliberalism’s career has been marked as much by heterodoxy as orthodoxy, by its capacity to make individuals as different as tech hippies and Ronald Reagan, as dissimilar as Barry Goldwater and long-haired university students who wanted to bring down ‘the system,’ feel as though they held the key to unlocking a future of untrammeled personal freedom.

Why not, then, call this aspiration toward freedom by its original name, ‘liberalism’? Because Roosevelt and his New Dealers stole the name from its free market advocates in the 1930s and imbued it with social democratic meaning. That theft qualifies as one of history’s great terminological heists. Milton Friedman was forever dismayed by what he regarded as the corruption of the term ‘liberalism.’ So was Friedrich Hayek. Both men refused the label ‘conservative’ to describe their beliefs. The term ‘neoliberal’ allowed them to affiliate with the classical liberal tradition they admired while separating themselves from the New Deal liberalism they despised.

“Biden may fail. If he does, would it mean that an argument for the fall of the neoliberal order should be reconsidered? Not at all. A rising political order centering on Trump-style authoritarianism would mark the end of the neoliberal order just as surely as one centered on Biden-style progressivism. The United States may also be in the midst of an extended period of dysfunction that will forestall the establishment of a new political order, left or right. But one thing is clear: the neoliberal heyday has passed,” Gary Gerstle explains. (Photograph: Joe Biden at a campaign event on February 1, 2020, by Andrew Harnik / AP)

DSJ: There is a big debate on the left today regarding the question of whether the neoliberal age is coming to an end. Trump, the rise of Bernie Sanders, Biden’s Build Back Better Act, China’s rise, and the Russian invasion of Ukraine can all be used in various ways to defend this view. You agree with this perspective. What are your essential reasons for doing so?

GG: Donald Trump and Bernie Sanders were inconsequential political figures during the order’s 1990s heyday. That they became the two most dynamic forces in American political life in the 2010s provides the best evidence that the neoliberal order was losing its hold. It was no longer constraining political choice.

Other evidence for the neoliberal order’s fracturing can be gleaned from a brief look at the erosion of support for four key planks of the neoliberal ‘freedom’ agenda: the free movement of goods, people, information, and capital.

On the free movement of goods: During the neoliberal heyday, protectionism was a dirty word, not to be uttered by those pursuing high political office. Now it is favored by many on the right as well as on the left.

On the free movement of people: Thirty-five million people came to America between the 1960s and 2000s. Now the talk is all about walls and borders.

On the free movement of information: The instantaneous transmission of vast amounts of data and opinion to every corner of the world had been crucial to neoliberalism’s globalizing project. Now China, Russia, Turkey, and other countries are seeking to insulate their information systems from international ‘contamination.’

On the free movement of capital: This freedom has been the one most resistant to controls. But the actions recently taken by Western governments against Russia as well as against its oligarchs living abroad — freezing or seizing assets, denying the state and its people the opportunity to move money from one country to another or to convert their funds from one currency to another — constitute a major strike against that freedom.

Day by day, now, a new world is taking shape. Elements of neoliberalism will survive this transition. But neoliberalism as a political order is finished.

Can we make objective judgements of taste?

In Can We Make Objective Judgements of Taste?, Rowan Anderson aims to tackle the question of subjectivity and objectivity in aesthetic judgements.

“Proving that we could ground matters of taste in any kind of objectivity seems a much greater task than perhaps in knowledge or ethics — and I think it is. But in my view, it is hard to see how different these things really are. My argument is that, as people come together and produce art, an intersubjectively grounded notion of it emerges that sets objective constraints on what could be deemed art. It takes the form of a set of rules which implicitly determine good and bad art. So, while art could not exist completely independent of any mind (as a feature of nature), its rules and determinations exist as good or bad art independent of what any one person might say of them. It takes reliable practice in aesthetic critique to become someone who can find and make objective judgments about those emergent truths of taste,” Anderson argues.

“One reason for thinking we are genuinely arguing about something objective when we are discussing matters of taste is that we are responsive to reasons why we ought to think differently. And, more importantly, we can be convinced of such expressions. That is, our subjective experience of art is responsive to objective reasons why we ought to think this is better than that. […] What begins as a mere opinion, becomes common features, reasons and analyses of art that eventually come to be mutually understood among groups of people. They eventually amount to a co-operative exercise among persons, a constant and mutual working-out of what it means to be art — of what it means to be good art. From this working-out emerge some rules of the game or a conception of art, independent of any one mind,” Anderson writes.

“The concept of art itself is subjective in the sense that it exists as the product of thought, of the impressions of objects imposed upon individual human minds. But on the other hand, the intersubjective basis for the concept of art fixes it with objectivity (mind-independence). It is objective in the sense that its reality takes on an independent existence, independent of what you or I may think about art. No amount of my testimony to the contrary will ever make doing the dishes art, for example. On the other hand, it can be said that Caspar David Friedrich’s Wanderer above the Sea of Fog is objectively a piece of art. This means there are finitely malleable objective standards (as determined by the collective human experience) for what is art.”

In pronouncing one’s preference for either ‘this’ or ‘that,’ David Hume tells us what will happen: “No one pays attention to such a taste; and we pronounce, without scruple, the sentiment of these pretended critics to be absurd and ridiculous. The principle of the natural equality of tastes is then totally forgot […] it appears an extravagant paradox, or rather a palpable absurdity, where objects so disproportioned are compared together.” (Painting: Nude, Green Leaves and Bust, 1932, by Picasso; oil on canvas, 162 cm × 130 cm. Private collection, currently on long-term loan to Tate Modern, London)

“David Hume […] in his essay Of the Standard of Taste offers a provisional, non-exhaustive but very eloquent list of characteristics of the critic primed to make sound judgements (that I partially reproduce below). This will give you something of an idea of what I have in mind about the good critic. Something else to keep in mind is that, as previously pointed out, these characteristics are often picking out objective facts of the matter that can substantially contribute to our experience of the piece and can count as reasons for liking it more or less. These qualities might just as easily, with a few modifications, be said of good scientific or ethical practice.

Delicacy: Since many qualities in great works of art are only found in small degrees, the good critic notices the subtle idiosyncrasies and flourishes of the work. For example, I love the song Giant Steps by John Coltrane. Still, I know that a jazz musician with any knowledge of music theory would appreciate the unorthodox chord progression much more than I. They can also make an objectively valid appraisal and give convincing reasons why it may be better or worse (and I would hardly be one to dispute them). The good critic notices the small stuff.

Practice: One needs to survey and experience a domain of art to make a sound judgement of it. Our first experience of art in any one of its great forms is almost always confused and unable to pick out the qualities that make it good or bad. It is hard even to make sense of what to think at all. For the same reason that I really ought not to be listened to about what I might think about the foundations of quantum mechanics, I really ought not to be taken seriously in my judgements about good or bad contemporary dance. I would have no idea where to even begin with either of these — I have no experience in them and therefore no way of intuiting the objective notion constructed among the relevant voices. The good critic is familiar with the domain as a whole.

Comparison: Sound judgements of particular works of art require suitable comparison within the domain. The reader of one novel may love it, but they have no way of appropriately situating it in the whole corpus of literature. Many pieces of art are good only in, or made better in, their relation to adjacent works. For example, one of the greatest Westerns, Unforgiven, is, in part, a masterpiece only in relation to the genre. It signifies the death of the genre not only in its story but also in that it features an old and dying Clint Eastwood, who was something like the poster boy for the darker era of the genre in the 60s and early 70s. The good critic recognises, compares, and situates the work in its particular context in relation to other works.

Prejudice: The critic ought to free himself from prejudice. He should put himself in the shoes of the intended audience — especially among older works. While some works of literature do betray some fatal moral sensibilities from many years prior, many stand the test of time. To reject the validity of any art based merely on the idiosyncrasies of culture, time, or otherwise is undoubtedly a perversion of sound judgement. The strange aversion to black and white or foreign film among blockbuster audiences comes to mind. (Hopefully, Parasite is a boon for an already long-thriving South Korean film industry.) The good critic tries to free themselves of contingent prejudices and therefore opens themselves up to the potential merit of any and all art.

Ends: All art has an end, a purpose for which it is created. Any work is better if it achieves what it is going for. A comedy is good because it is funny, and a tragedy is good if it is tragic (obviously there is overlap, but you understand my meaning). Mike Leigh’s masterpiece Naked is at least partially great because it made me uncomfortable; Cats is emphatically not good because it made me uncomfortable. The good critic has the experience to determine, in comparison to other works, how well a specific work achieves its aims.”

“Comedy, tragedy, satire, odes, have each its partizans, who prefer that particular species of writing to all others. It is plainly an error in a critic, to confine his approbation to one species or style of writing, and condemn all the rest. But it is almost impossible not to feel a predilection for that which suits our particular turn and disposition. Such preferences are innocent and unavoidable, and can never reasonably be the object of dispute, because there is no standard by which they can be decided.” — David Hume (Painting: Nude in a Black Armchair, 1932, by Picasso; oil on canvas, 162 x 130 cm. Private collection)

“There is a delineation between personal preferences and aesthetic judgements,” Anderson writes. “Our subjective experience of something is incorrigible — it cannot be wrong that we enjoyed something. But, when the subjectivist enters the public conversation with claims to judgements about things as good or bad art they are entering the public domain, making claims about things that are fixed by the public, whose rules are not determined by the whim of the individual. I will no doubt enjoy my baby nephew’s early forays into painting but will obviously not hail them as objectively great in the way I have described.

[…]

We do not have to be afraid to make objective judgements in the domains of taste. I honestly believe that Parasite is objectively better than Cats and always will be. I can present objective features of them as reasons why I think that. I can corroborate and validate that proposition among peers. I can convince you. It is a hypothesis, ultimately fallible, but one I can find support for. In posturing to make an objective judgement of taste, we are ultimately saying: ‘I think this is better than that and I will be vindicated in this view by the long term convergence of taste among art admirers of the future.’”

In the margins

In his most recent book, From Strength to Strength, Arthur C. Brooks explores how a healthy awareness of your mortality can influence your personal and professional life for the better.

“Drawing on the work of British psychologist Raymond Cattell, who in 1971 posited that people possess two types of intelligence in a mix that varies with age, Brooks writes: ‘The first is fluid intelligence, which Cattell defined as the ability to reason, think flexibly, and solve novel problems. It is what we commonly think of as raw smarts.’ Innovators tend to have fluid intelligence in abundance. Cattell noted that ‘it was highest relatively early in adulthood and diminished rapidly starting in one’s thirties and forties.’

Cattell’s second type of smarts is crystallized intelligence, or the ability to use one’s increasing store of knowledge. In other words, writes Brooks, ‘when you are young, you have raw smarts; when you are old, you have wisdom.’ Crystallized intelligence seems to grow with age and tends to make older people better historians and teachers. Brooks argues that high achievers sooner or later have to give up roles that are largely analytic or depend on quick thinking in favor of roles that exploit their superior ability to assemble and apply what they know — and that aid the young.

People have long known this; the author quotes the Roman statesman and philosopher Cicero, who believed that while the old should ‘have their physical labors reduced,’ at the same time, ‘their mental activities should be actually increased. They should endeavor, too, by means of their counsel and practical wisdom to be of as much service as possible to their friends and to the young, and above all to the state’ [De Officiis, Book 1, Moral Goodness, XXXIV, 123].

The trick for aging achievers like Brooks is to play to these new strengths while letting go of the powers that depended on youth.”

From: A guide to growing older, by Daniel Akst (strategy+business)

“[T]he emerging science of implicit bias has revealed that what people say about creativity isn’t necessarily how they feel about it,” Matt Richtel writes in We Have a Creativity Problem, an adaptation of his recently published book, Inspired: Understanding Creativity. A Journey Through Art, Science, and the Soul.

“Research has found that we actually harbor an aversion to creators and creativity; subconsciously, we see creativity as noxious and disruptive, and as a recent study demonstrated, this bias can potentially discourage us from undertaking an innovative project or hiring a creative employee. […] The reasons […] can be traced to the fundamentally disruptive nature of novel and original creations. Creativity means change, without the certainty of desirable results.

‘We have an implicit belief the status quo is safe,’ said Jennifer Mueller, a professor of management at the University of San Diego and a lead author on [a] 2012 paper about bias against creativity. Dr. Mueller, an expert in creativity science, said that paper arose partly from watching how company managers professed to want creativity and then reflexively rejected new ideas. ‘Leaders will say, We’re innovative, and employees say, Here’s an idea, and the idea goes nowhere,’ Dr. Mueller said. ‘Then employees are angry.’

But, she said, the people invested in the status quo have plenty of incentive not to change. ‘Novel ideas have almost no upside for a middle manager — almost none,’ she said. ‘The goal of a middle manager is meeting metrics of an existing paradigm.’

That creates another conundrum, the researchers noted, because people in uncertain circumstances may really need a creative solution and yet have trouble accepting it. ‘Our findings imply a deep irony,’ the authors noted in the 2012 paper. ‘Prior research shows that uncertainty spurs the search for and generation of creative ideas, yet our findings reveal that uncertainty also makes us less able to recognize creativity, perhaps when we need it most.’”

From: We Have a Creativity Problem, by Matt Richtel (The New York Times).

“The most reliable cure for confirmation bias is interaction with people who don’t share your beliefs. They confront you with counterevidence and counterargument. John Stuart Mill said, ‘He who knows only his own side of the case, knows little of that,’ and he urged us to seek out conflicting views ‘from persons who actually believe them.’ People who think differently and are willing to speak up if they disagree with you make you smarter, almost as if they are extensions of your own brain. People who try to silence or intimidate their critics make themselves stupider, almost as if they are shooting darts into their own brain.

In his book The Constitution of Knowledge, Jonathan Rauch describes the historical breakthrough in which Western societies developed an ‘epistemic operating system’ — that is, a set of institutions for generating knowledge from the interactions of biased and cognitively flawed individuals. English law developed the adversarial system so that biased advocates could present both sides of a case to an impartial jury. Newspapers full of lies evolved into professional journalistic enterprises, with norms that required seeking out multiple sides of a story, followed by editorial review, followed by fact-checking. Universities evolved from cloistered medieval institutions into research powerhouses, creating a structure in which scholars put forth evidence-backed claims with the knowledge that other scholars around the world would be motivated to gain prestige by finding contrary evidence.

Part of America’s greatness in the 20th century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history, linking together the world’s best universities, private companies that turned scientific advances into life-changing consumer products, and government agencies that supported scientific research and led the collaboration that put people on the moon.

But this arrangement, Rauch notes, ‘is not self-maintaining; it relies on an array of sometimes delicate social settings and understandings, and those need to be understood, affirmed, and protected.’ So what happens when an institution is not well maintained and internal disagreement ceases, either because its people have become ideologically uniform or because they have become afraid to dissent?

This, I believe, is what happened to many of America’s key institutions in the mid-to-late 2010s. They got stupider en masse because social media instilled in their members a chronic fear of getting darted. The shift was most pronounced in universities, scholarly associations, creative industries, and political organizations at every level (national, state, and local), and it was so pervasive that it established new behavioral norms backed by new policies seemingly overnight. The new omnipresence of enhanced-virality social media meant that a single word uttered by a professor, leader, or journalist, even if spoken with positive intent, could lead to a social-media firestorm, triggering an immediate dismissal or a drawn-out investigation by the institution. Participants in our key institutions began self-censoring to an unhealthy degree, holding back critiques of policies and ideas — even those presented in class by their students — that they believed to be ill-supported or wrong.

But when an institution punishes internal dissent, it shoots darts into its own brain.”

From: Why the Past 10 Years of American Life Have Been Uniquely Stupid, by Jonathan Haidt, brilliantly illustrated by Nicolás Ortega (The Atlantic)

“The house was built around 1930 and had been extended and remodeled several times, and there were no drawings or other documents available. When we removed the finishes to investigate, the old renovations revealed themselves one after another in the floors, walls, and ceilings. The earthen floor had been renovated into a kitchen and dining room, the ‘engawa’ — a semi-outdoor deck — had been converted into interior space, the guest room had been enlarged, and the single-story house had been turned into a two-story one by removing the roof and building a nursery and closet. These are not only records of the renovation work, but also memories of heartwarming family life. So we designed the house to reveal various records and memories of the past, so that the residents can coexist with them and add new records of their own lives,” the architects write.

Via: YYAA converts 1930s property into ‘House of Memories’ (designboom).

“The ancient Greeks called it ‘pothos,’ which Plato defined as a yearning desire for something wonderful that we can’t have. Pothos was our thirst for everything good and beautiful. Humans were lowly beings imprisoned in matter, inspired by pothos to reach for a higher reality. The concept was associated with both love and death; in Greek myth, Pothos (Longing) was the brother of Himeros (Desire) and the son of Eros (Love). But because pothos had that quality of aching for the unattainable, the word was also used to describe the flowers placed on Grecian tombs. The state of longing strikes contemporary ears as passive, gloomy, and helpless, but pothos was understood to be an activating force. The young Alexander the Great described himself as ‘seized by pothos’ as he sat on a riverbank and gazed into the distance; it was pothos that set Homer’s Odyssey in motion, with the shipwrecked Odysseus longing for home.” — Susan Cain on the idea of longing, from Bittersweet: How Sorrow and Longing Make Us Whole

Post scriptum will be back next week, if fortune allows, of course.

If you want to know more about my work as an executive coach and leadership facilitator, please visit markstorm.nl. You can also browse through my writings, follow me on Twitter or connect on LinkedIn.
