Post scriptum (2022, week 35) — Rules for standing out, how disruptions happen, and the problem of today’s technologists

Mark Storm
23 min read · Sep 11, 2022
House in Minoh by Motooka Ito Architects — “[The design responds] to its surroundings, incorporating the context while expanding and shrinking throughout.” (Photograph by Yosuke Ohtake)

Post scriptum is a weekly curation of my tweets. It is, in the words of the 16th-century French essayist and philosopher Michel de Montaigne, “a posy of other men’s flowers and nothing but the thread that binds them is mine own.”

In this week’s Post scriptum: Leibniz’s rules for standing out for all the right reasons; what can upheavals of the past tell us about our own future?; what forms of attention do we want to cultivate, as individuals and as a society?; Socrates and the ethics of conversation; how to read philosophy; what made Rilke great?; the secrets lurking inside Matisse’s ‘Red Studio’; and, finally, Max Khan Hayward on how the growth of virtual worlds threatens to undermine public oversight of real spaces.

Rules for standing out

Although Gottfried Wilhelm Leibniz’s Lebensregeln (‘life rules’) bear the mark of the ethical codes that dominated the court culture of late 17th-century Europe, much of his advice remains relevant. “This is largely because Leibniz’s advice focuses on how a decent and thoughtful person can best navigate a world much like our own, one in which success and advancement often depend on the good judgments of others,” Ryan Patrick Hanley writes in Leibniz had rules for standing out for all the right reasons.

“We no longer go to court to try to get the king to notice us. But ours is yet, like Leibniz’s, an age dominated by the competition for attention: we just measure this by Likes on social media rather than smiles from a king. […]

[Leibniz] shares our concern to attract eyeballs, and knows very well that doing this requires carefully curating the images one allows the public to see. But he also knows what we often forget: there is a real danger in such obsessive focus on appearances that conceal or distort reality. For this reason, he cautions that we shouldn’t focus exclusively on appearances, but rather ‘seek two things: to be and to appear.’ His lesson is simple. We need to take care not just to appear deserving of the esteem of others, but actually to be worthy of the esteem of others. Tempted by incentives that encourage us to obsess about how we look, we run the risk of forgetting (or even no longer caring) who and what we truly are — unless, of course, we take care to develop rules for living that resist these temptations to artificiality,” Hanley writes.

“If we want to stand out, what then should we do? Leibniz suggests we in fact need to do the opposite of what we tend to do. Today, we are often tempted to think that if we want to get noticed, we need to turn up the volume — Tweet more, and in ALL CAPS! But Leibniz knows that greater volume usually gets lost in the noise. To really stand out, we shouldn’t do more of what everyone else is doing, but rather do something different from what everyone else is doing. Thus, his counterintuitive advice: ‘Whoever wishes to stand out should display something singular; for example: humility, modesty, patience.’ When everyone else is preening and posturing, harried and hurrying, we earn the attention of others when we do the opposite — and when we do so, we may even be happier individually and more pleasant for others to be around.”

“Speaking broadly, Leibniz’s rules fall into three basic categories: advice on how to communicate with others, advice on how to carry oneself with others, and advice on the sorts of subjects one ought to study,” Ryan Patrick Hanley writes in Leibniz had rules for standing out for all the right reasons. (Painting: Portrait of Gottfried Leibniz (1646–1716), 1729, by Christoph Bernhard Francke (1665–1729); oil on canvas, 81 x 66 cm. Collection of the Herzog Anton Ulrich Museum, Braunschweig)

“Leibniz wrote for a world much like our world, one that valued getting ahead. But he understood very well that self-interested striving can easily degenerate into selfishness if it’s not tempered by concerns for things that go beyond the self. It’s for this reason that his rules for living consistently emphasise that what matters isn’t just the self, but how the self relates to society. In so doing, he helps even the self-interested see that the real art of living requires learning how to balance the claims that our self makes on us with the claims that society makes on us.

All of this comes to a head in Leibniz’s reflections on what he calls ‘presence of mind.’ Presence of mind, he explains, comes down to this: whether ‘one can find in the heat of the moment what one typically maintains in silence.’ Leibniz knows that it’s no easy task to stay calm in a busy world, and even more difficult to remain oneself when out among others. Many of us today, I suspect, will nod our heads in agreement — and all the more so when Leibniz tells us that this, in fact, is ‘the most difficult skill’ of all.”

How disruptions happen

“Lenin’s theory of change was a theory of social disruption, of imposing a shift so radical that a society could not go back to the way it had been. Such disruptions don’t just happen randomly,” David Potter, a professor of Greek and Roman History, writes in How disruptions happen.

According to Potter, the core characteristics of disruption are that it: “1) stems from a loss of faith in a society’s central institutions; 2) establishes a set of ideas from what was once the fringe of the intellectual world, placing them at the centre of a revamped political order; and 3) involves a coherent leadership group committed to the change.

[…]

Disruptions bring a profound shift in people’s understanding of how the world around them works. They contrast in this way with less radical societal changes, based on an existing thought system: for example, the English ‘revolutions’ of the 17th century, which changed the balance of power between king and parliament without altering the basic system of government. Ideological change is crucial for major societal change […] because societies promote ideologies that support their way of doing business — and if the way of viewing the world doesn’t change, the way of doing business isn’t going to change either.

[…]

The outcome of a disruption is often unexpected to contemporaries, and that is precisely because ideas from outside the mainstream were used to shape the solutions to the problems of the time. We can’t know in advance exactly how a disruption will end. What history can teach us is what the circumstances are that lead to a disruption. It can make us realise what we might be facing as a result of the situation we are in today.”

The conversion to Christianity of the Roman emperor Constantine in the 4th century CE still influences the world in which we live. “This early example exhibits the key characteristics of a disruption: a loss of faith in central institutions (the imperial system of government), the establishment of previously fringe ideas (those of Christianity) at the centre of the political order, and a cohesive, committed leadership group that initiated the change. In elevating Christianity’s role in the empire, Constantine altered patterns of thought, replacing old ideas about imperial authority with a fresh, obviously different model of authority that told people they were moving in a new direction,” David Potter writes in How disruptions happen. (Photograph: The colossal head of the Roman Emperor Constantine the Great, ca. 313–324 AD; marble, 260 cm. Collection of the Musei Capitolini, Rome)

One of the examples Potter gives is the rise of Nazism in the first half of the 20th century, a profoundly violent disruption that had its roots in a theory implying that nations must inevitably be in competition (a view, promulgated by Herbert Spencer, known as social Darwinism).

“Hitler’s political success stemmed in large part from the fact that his assertions about Germany’s path — that it could have won the First World War, that it had been stabbed in the back, and that its problems could be solved by undoing the treaty that had ended the war — were familiar to the electorate from other sources. These positions were lies, but the lies were popular. Hitler’s extreme version of racist social Darwinism was initially on the fringe of German thought but, linked to his anticommunism, it was tolerated outside Nazi circles.

Yet an anticommunist message backed by lies would not be sufficient to explain Hitler’s rise to power. Echoing previous disruptions, this required the disintegration of faith in government. In this case, the loss of faith resulted in large part from the policies pursued by Germany’s centrist chancellor: in response to the Great Depression, he followed conventional wisdom by cutting public spending, thereby increasing the impact of the downturn and damaging the centre-Right alliance that had ensured his election. As the depression deepened, Hitler’s Nazi Party attracted ever more attention, something enhanced by Hitler’s ability to make use of new technologies, especially radio, and his vigorous style of campaigning. Still, when Germany’s president, Paul von Hindenburg, made Hitler chancellor in January 1933, Hindenburg had been convinced that Hitler could be controlled.

It would not be two full months before Hitler created the legal conditions for his dictatorship. The Nazi Party was still not yet recognised for the murderous institution it was when, in 1936, the world gathered in Berlin to celebrate the Olympic Games. Hitler sounded enough like other conservatives — the parallel between Jim Crow laws in the US and Germany’s antisemitic legislation was adduced in support of the appearance of the US at the games — and he had strong anticommunist credentials. These factors, along with dread of another war, made European governments unwilling to stand up to Hitler until war became inevitable.

The disruption led by Hitler relied on a collapse of faith in institutions, on the appeal of Hitler’s novel version of German nationalism to a society reeling from economic collapse and violence, and on the high level of discipline in the Nazi movement, within which Hitler had built a core leadership group. And his rise was aided by the blindness of the political establishment.”

“The deft use of media was crucial in […] the Protestant Reformation of the early 16th century. Around 70 years before Martin Luther posted his 95 theses challenging the notion of purgatory — the place where souls had to wait before going to heaven — and the validity of the indulgences one could purchase to hasten the path forward, Johannes Gutenberg had invented the printing press. Luther proved to be a master of the new medium, recognising that successful communication needed to be short, to the point and in his audience’s own language. The Catholic Church still issued its pronouncements in Latin. Luther told people they could receive God’s word in German,” David Potter writes in How disruptions happen. (Painting: Portrait of Martin Luther, 1528, by Lucas Cranach the Elder (1472–1553); oil on panel, 39.5 x 25.5 cm. Collection of the Coburg Fortress, Coburg)

As for what disruptions of the past — with their diverse outcomes — can tell us today, Potter believes that history enables us to detect patterns of behaviour in the present that have had serious consequences in the past.

“Today, there are signs that the US and European liberal democratic systems are under threat. The most obvious of these is a loss of trust in public institutions. Factors such as the willingness of Western governments to allow widespread impoverishment, the weakening of labour organisations, and the failure to provide adequate healthcare and other necessities, feed into powerful movements seeking to undercut the mainstream political system.

So too we see ideas from the intellectual fringe informing these increasingly powerful political movements. Some of these movements use social Darwinist ideas to claim, for instance, that public welfare is undercut by immigration. In Europe, the normalisation of nationalist groups such as the one supporting Éric Zemmour’s bid for the French presidency, or Viktor Orbán’s Fidesz Party in Hungary, is threatening established political norms. In the United Kingdom, some advocates of Brexit have translated traditional English exceptionalism into a form of hypernationalism in terms that, like those of the former US president Donald Trump’s supporters, echo social Darwinist doctrines. The prevalence of belief in lies, such as the lie that Trump won the 2020 election, is evocative of the universe of false assumptions that spread in Germany during Hitler’s rise to power. To combat the fissures that election lies, immigration fantasies or antivaccination movements represent, Western governments should recognise that the prevalence of fringe thinking is a sign that they are failing.

The path to restoring faith — which could lead through the sort of disruption that has preserved societies in the past — will offer real help to those who have been left behind. The underlying principle of liberal democracy is the contract between government and the governed. Government has a responsibility to rein in corporate power that undermines public welfare and spreads falsehood, just as it has a responsibility to ensure that people have access to the goods and services they need. This will require practices very different from ‘politics as normal.’ It is a critical lesson from history that, when normality fails, change will come.

The signs are that we’re in a time that is ripe for disruption. But what sort will it be?”

The problem of today’s technologists

The problem of today’s technologists “is that they do not take technology seriously enough. They refuse to see how it is changing us or even how it is changing them,” Ezra Klein writes in I Didn’t Want It to Be True, but the Medium Really Is the Message.

“It’s been revealing watching Marc Andreessen, a co-founder of the browsers Mosaic and Netscape and of A16Z, a venture capital firm, incessantly tweet memes about how everyone online is obsessed with ‘the current thing.’ Andreessen sits on the board of Meta, and his firm is helping finance Elon Musk’s proposed acquisition of Twitter. He is central to the media platforms that algorithmically obsess the world with the same small collection of topics and have flattened the frictions of place and time that, in past eras, made the news in Omaha markedly different from the news in Ojai. He and his firm have been relentless in hyping crypto, which turns the ‘current thing’ dynamics of the social web into frothing, speculative asset markets.

Behind his argument is a view of human nature and how it does, or doesn’t, interact with technology. In an interview with Tyler Cowen, Andreessen suggests that Twitter is like ‘a giant X-ray machine’:

‘You’ve got this phenomenon, which is just fascinating, where you have all of these public figures, all of these people in positions of authority — in a lot of cases, great authority — the leading legal theorists of our time, leading politicians, all these businesspeople. And they tweet, and all of a sudden, it’s like, «Oh, that’s who you actually are».’

But is it? I don’t even think this is true for Andreessen, who strikes me as very different off Twitter from on. There is no stable, unchanging self. People are capable of cruelty and altruism, farsightedness and myopia. We are who we are, in this moment, in this context, mediated in these ways. It is an abdication of responsibility for technologists to pretend that the technologies they make have no say in who we become. Where he sees an X-ray, I see a mold,” Klein writes.

In a talk delivered in Denver, Colorado, in 1998, Neil Postman shared five ideas about technological change: “First, that we always pay a price for technology; the greater the technology, the greater the price. Second, that there are always winners and losers, and that the winners always try to persuade the losers that they are really winners. Third, that there is embedded in every great technology an epistemological, political or social prejudice. Sometimes that bias is greatly to our advantage. Sometimes it is not. The printing press annihilated the oral tradition; telegraphy annihilated space; television has humiliated the word; the computer, perhaps, will degrade community life. And so on. Fourth, technological change is not additive; it is ecological, which means, it changes everything and is, therefore, too important to be left entirely in the hands of Bill Gates. And fifth, technology tends to become mythic; that is, perceived as part of the natural order of things, and therefore tends to control more of our lives than is good for us.” (Illustration: A composite of photos taken at Government Center, Boston, over about an hour in March 2022, showing only people with their phones. Pelle Cass for The New York Times)

“Over the past decade, the narrative has turned against Silicon Valley,” Klein continues. “Puff pieces have become hit jobs, and the visionaries inventing our future have been recast as the Machiavellians undermining our present.”

His frustration with these narratives, both then and now, is that they focus on people and companies, not technologies. Klein suspects that is because American culture remains deeply uncomfortable with technological critique.

“There is something akin to an immune system against it: You get called a Luddite, an alarmist. ‘In this sense, all Americans are Marxists,’ [Neil Postman wrote in his prophetic 1985 book, Amusing Ourselves to Death], ‘for we believe nothing if not that history is moving us toward some preordained paradise and that technology is the force behind that movement.’

I think that’s true, but it coexists with an opposite truth: Americans are capitalists, and we believe nothing if not that if a choice is freely made, that grants it a presumption against critique. That is one reason it’s so hard to talk about how we are changed by the mediums we use. That conversation, on some level, demands value judgments. This was on my mind recently, when I heard Jonathan Haidt, a social psychologist who’s been collecting data on how social media harms teenagers, say, bluntly, ‘People talk about how to tweak it — oh, let’s hide the like counters. Well, Instagram tried — but let me say this very clearly: There is no way, no tweak, no architectural change that will make it OK for teenage girls to post photos of themselves, while they’re going through puberty, for strangers or others to rate publicly.’

What struck me about Haidt’s comment is how rarely I hear anything structured that way. He’s arguing three things. First, that the way Instagram works is changing how teenagers think. It is supercharging their need for approval of how they look and what they say and what they’re doing, making it both always available and never enough. Second, that it is the fault of the platform — that it is intrinsic to how Instagram is designed, not just to how it is used. And third, that it’s bad. That even if many people use it and enjoy it and make it through the gantlet just fine, it’s still bad. It is a mold we should not want our children to pass through.

Or take Twitter. As a medium, Twitter nudges its users toward ideas that can survive without context, that can travel legibly in under 280 characters. It encourages a constant awareness of what everyone else is discussing. It makes the measure of conversational success not just how others react and respond but how much response there is. It, too, is a mold, and it has acted with particular force on some of our most powerful industries — media and politics and technology. These are industries I know well, and I do not think it has changed them or the people in them (including me) for the better.

But what would? I’ve found myself going back to a wise, indescribable book that Jenny Odell, a visual artist, published in 2019, How to Do Nothing: Resisting the Attention Economy. In it she suggests that any theory of media must start with a theory of attention. ‘One thing I have learned about attention is that certain forms of it are contagious,’ she writes. She continues:

‘When you spend enough time with someone who pays close attention to something (if you were hanging out with me, it would be birds), you inevitably start to pay attention to some of the same things. I’ve also learned that patterns of attention — what we choose to notice and what we do not — are how we render reality for ourselves, and thus have a direct bearing on what we feel is possible at any given time. These aspects, taken together, suggest to me the revolutionary potential of taking back our attention.’

I think Odell frames both the question and the stakes correctly. Attention is contagious. What forms of it, as individuals and as a society, do we want to cultivate? What kinds of mediums would that cultivation require?

This is anything but an argument against technology, were such a thing even coherent. It’s an argument for taking technology as seriously as it deserves to be taken, for recognizing, as […] John M. Culkin put it, ‘we shape our tools, and thereafter, they shape us.’

There is an optimism in that, a reminder of our own agency. And there are questions posed, ones we should spend much more time and energy trying to answer: How do we want to be shaped? Who do we want to become?”

In the margins

“[T]hese are ideal dialogical conditions, no doubt, which the interlocutors — including Socrates himself — may fall short of at various points, but they regulate the discursive practice nonetheless, as do the following features:

  • Dialogue must proceed in an orderly way (463c, 494e, 504c), without irrelevance and with a clarification of key terms before anyone makes assertions about what the topic under discussion is like, or what its properties are.
  • Participants must also respond to one another justly (451a, 504e), which consists in observing the distinction between a debate and a conversation. This means that one should not trip up an opponent for the sake of it, but help the other party and make them aware of slips and fallacies for which they are responsible. If this rule is followed, then discussants will lay the blame for their confusions on themselves and not on the other party.
  • Discussion must also be conducted moderately (505c), which means not losing one’s temper, and making concessions when required to do so (457c).

Attention to the form of Socratic discussion shows that how the discussants talk to each other is often as important as what they say to each other. One must acknowledge the discussant as an equal, share the discussion, take no more than one’s share, reciprocate in the spirit of question and answer, and proceed moderately and fairly.

These dialogical norms foster virtues. Negotiating with hostile others so that difference can be appreciated fosters courage; giving each their due in dialogical exchange fosters a sense of justice; keeping within bounds and not taking more than one’s share in discussion fosters moderation; and collaborating in the business of examination fosters a sense of community and friendship. These virtues are not desired ‘results,’ or gained from a definitional ‘product,’ but cultivated in the very activity of conversation, as Socrates practised it. There is an ethics of conversation built into Socrates’ practice.

Seen as such, the content of the discussion (i.e. definitions of the virtues) is intimated and internalised by discussants in the performance of its form. And this suggests an answer to our question: how can the examined life make us better, even when it doesn’t always lead to results? If we make conversation an art, as well as our constant practice, as Socrates did, then this may be enough to habituate one to the virtues involved in its very operation. So, we should keep the art of conversation going; it may just make us better.”

The parenthetical references are to Gorgias, Plato’s dialogue about practices of speech.

From: Socrates and the Ethics of Conversation, by Frisbee Sheffield (Antigone)

“Some philosophical books really are dialogues of characters conversing with one another. They read like plays, though if your local high school puts on a production of Plato’s Theaetetus — a lengthy dialogue about knowledge and judgment — I’d recommend staying home. Other books are straightforward treatises covering some topic in exhaustive monologue. But all philosophical works, even the monologues, are dialectical. They are conversations between someone making a claim and someone raising objections to that claim, or pressing questions to deeper levels. Indeed, philosophical books can become very complicated conversations, because not only are there multiple voices present in the author’s text, but now you have joined the conversation as well, conversing with those voices and your own voices. You and a book are now having a party.

Many philosophy books record the results of philosophers talking to themselves. The French philosopher René Descartes in his Meditations (1641), for example, argues with himself ceaselessly, paragraph after paragraph:

‘I will attempt to achieve, little by little, a more intimate knowledge of myself … Do I not therefore also know what is required for my being certain about anything? … Yet I previously accepted as wholly certain and evident many things which I afterwards realised were doubtful … But what about when I was considering something very simple?’

Descartes is giving himself the ‘Oh yeah?’ treatment. In a way, it is a rhetorical ploy, since he is trying to make it seem as if anyone who is clear-minded and honest with themselves will drift inexorably into the charmed circle of Cartesian metaphysics. But never mind that; it is still a brilliant work of dialectical reflection. The reader is supposed to be carried along in this dialogue, thinking of the same objections that Descartes gives expression to and then tries to answer. But a clever reader such as yourself will probably ask some questions Descartes doesn’t raise, or you will come up with alternative answers he didn’t think of. So you have a part in the dialogue too. Take notes to keep track of it all.

This dialogue is essential to philosophy. Maybe it is essential to all thinking. We raise ideas, ask questions or pose problems, revise or extend those ideas, face new challenges, propose new ideas, and we keep batting questions and answers back and forth in the tennis court of the mind until we can’t think of anything new to say. But philosophy lives in this energetic back-and-forth, picking up on missed possibilities or raising new questions or going back to insert some distinctions before reaching disastrous conclusions. That’s the philosophical method: keep the conversation going, changing, evolving.

Philosophy, as I would define it, is grappling with ideas at the borders of intelligibility. As soon as the questions become fully intelligible and tractable, then a new discipline emerges, like physics or biology or psychology. But while we are still wrestling back and forth over what seems to be true and what seems impossible in matters we cannot see in high resolution, we are doing philosophy. There really is no way to proceed except in a dialectical, back-and-forth manner.

In reading philosophy, you want to be sure to take up the dialogue. To that end, you should keep a journal to explain to yourself what you think is being said in the book, what you think about it, what you’d like to ask the author, and what you would like to tell them (even if it’s impolite). As you make your thoughts explicit on the page, you will learn more about what you yourself think. Sometimes we don’t know what we think until we hear (or read) ourselves saying it. Taking notes in a philosophical dialogue is a way of learning new things about your own mind and your own experience. When you return to that blasted book for a third or fourth time, you will be armed with new questions and perspectives to bring into the conversation. It will be a new reading of it.

It has been said that you are never done reading a great book because each time you read it you become a different person. That may be an exaggeration, but there’s something to it. It is also sometimes said that as you read a great book, it reads you. That is to say, you may begin to understand your own life anew through the concepts the book has suggested to you. This is the inevitable result of any genuine dialogue. That’s when philosophy is at its best.”

From: How to read philosophy, by Charlie Huenemann (Psyche)

“Classical music offers a fabulous example of what was happening to art in Rilke’s Austria and Germany in the early 20th century. Though audiences protested, already Schoenberg’s atonal music seemed to express the modern technology-driven condition. It was exciting, bewildering, but also repetitive and seemingly forever unfinished. The sentimental human heart suddenly didn’t know where to take refuge — and nowhere was probably the implicit answer. Face up to modernity, that is, to a certain new kind of bleakness and rawness, exposed by the age of the machine. Don’t hide away.

The neo-Romantic style of composition which preceded Schoenberg was quite different. Schoenberg himself caught the tail end of the fashion, which is why many Romantic listeners prefer the richly textured, but still tonal, early work. Personally I love to embed myself in the First String Quartet in D Minor, op. 7. I can find a home there — the kind of ‘spiritual’ home Rilke would often allude to, and meaning a home in the imagination. The neo-Romantics were composers like late Brahms and the searingly emotive Hugo Wolf. Their emotionally laden and discordant harmonies pointed ways out of the 19th century. But they did not compel the abiding Western tradition to reinvent itself, as Schoenberg did, perhaps regrettably, but necessarily, after he left that op. 7 behind.

Early Schoenberg was in-between, and in-between is roughly where I think we should place Rilke too, between these two moments in music, that is, the last notes of romanticism and the first signs of rupture. Rilke’s intensely individually felt lyrics and his so-called ‘thing-poems,’ his elegies and his sonnets were new and unique, and yet they could be absorbed into what went before — even centuries before. And so on their evidence Rilke seems, like the earliest Schoenberg, not yet ‘modern’ enough.

But to call Rilke conservative and exclusively aesthetic-minded diverts attention precisely from what made him new. The world he addressed was losing its spirituality, and just as Schoenberg felt music needed a new language, so Rilke toyed with whether the old language could continue: what it could refer to, and mean, as references like God and the soul lost credibility.”

From: What made Rilke great?, by Lesley Chamberlain (Poetry Foundation); excerpted from Rilke: The Last Inward Man (Pushkin Press, 2022).

“For a while already, Matisse had been struggling to dissolve the boundaries between the ‘fine art’ that had dominated Europe’s elite traditions — the kind of fine art that focused on female nudes — and the decorative objects, and domestic spaces, that seemed to matter in a lot of the world’s other cultures. In a few earlier works, Matisse had depicted his own radically new paintings as props in cozy domestic still lifes, aiming for ‘the productive ambiguity between the artistic and the domestic realms that characterized Matisse’s art throughout his career,’ as the MoMA catalog puts it.

In The Red Studio, Matisse magnifies the effect, depicting a painter’s entire atelier as something closer to a bourgeois living room: He conceals all the practical, almost industrial features he had spec’d out for his new workshop (the MoMA show sets them out) and instead fills its image with the furniture and gewgaws and framed paintings you’d have found in the comfortable Matisse home nearby. That’s where he had posed his kids and wife for that earlier [Sergei Shchukin] commission, The Painter’s Family, which Matisse had imagined hanging right beside The Red Studio once it arrived in his patron’s home. (In the end, Shchukin turned down the later Studio painting, for reasons that aren’t quite clear.)

In The Red Studio, Matisse took the homemaking Marguerite of his family painting, identified there by the ‘marguerites’ on her dress, and translated her into his atelier’s artful nude, still recognizably daisied. A central figure from Matisse’s home life, that is, gets to play double duty as a symbol of the grand European tradition. He’s telling us that domesticity is still at hand in the new picture, however much art and its evolution might also be in play.

For all the sex and riotous style in The Red Studio, Matisse imagined that it might someday become family fare. Judging by the untroubled pleasure his painting gives us at MoMA, he succeeded.”

From: The Secrets Lurking Inside Matisse’s ‘Red Studio,’ by Blake Gopnik (The New York Times)

“‘I like it, but I don’t quite understand it,’ Henri Matisse told a journalist visiting his studio in 1911. Matisse was referring to a painting he’d recently made of his studio space, a room filled with earlier paintings and sculptures, all recreated in miniature,” Jonathon Keats writes in MoMA Shows The Backstory Of A Matisse Masterpiece So Radical It Befuddled The Artist Himself. (Painting: The Red Studio, Issy-les-Moulineaux, fall 1911, by Henri Matisse; oil on canvas, 181 x 219.1 cm. Collection of the Museum of Modern Art, New York)

Matisse’s daughter Marguerite (French for ‘daisy’) wears a dress covered in the same flowers that reappear on paintings and drawings her father made of her in the nude. (Painting: The Painter’s Family, 1911, by Matisse; oil on canvas, 143 x 194 cm. Collection of the Hermitage Museum, Saint Petersburg)

“If anybody had the capacity to understand The Red Studio, it was Gertrude Stein, the avant-garde author who was also an avid collector of Matisse’s art. As Matisse noted in a letter to [Sergei Shchukin] — after understatedly describing The Red Studio as ‘surprising at first sight’ — ‘Mme. Stein finds it the most musical of my paintings.’ Perhaps following this lead, Matisse explained that the Venetian red ‘serves as a harmonic link between the green of a nasturtium branch, […] the warm blacks of a border of a Persian tapestry placed above the chest of drawers, the yellow ocher of a statuette around which the nasturtium has grown, enveloping it, the lemon yellow of a rattan chair placed at the right of the painting between a table and a wooden chair, and the blues, pinks, yellows and other greens representing the paintings and other objects placed in my studio.’”

From: MoMA Shows The Backstory Of A Matisse Masterpiece So Radical It Befuddled The Artist Himself, by Jonathon Keats (Forbes)

“[W]e cannot allow the novelty of virtual worlds to blind us to the risks of relocating our social and economic activity into a realm that is privately owned and controlled by unaccountable corporations. And whatever we decide, it must be based on a simple principle: the interests of citizens must always take precedence over the entrancing visions of the utopian messiahs of the virtual future.” — Max Khan Hayward, in Does the rise of the Metaverse mean the decline of cities?

Post scriptum will be back next week, if fortune allows, of course.

If you want to know more about my work as an executive coach and leadership facilitator, please visit markstorm.nl. You can also browse through my writings, follow me on Twitter or connect on LinkedIn.
