“I have gathered a posy of other men’s flowers, and nothing but the thread that binds them is mine own.” — Michel de Montaigne
Random finds is a weekly curation of my tweets and, as such, a reflection of my fluid and boundless curiosity.
If you want to know more about my work and how I help leaders and their teams find their way through complexity, ambiguity, paradox & doubt, go to Leadership Confidant — A new and more focused way of thriving on my ‘multitudes’ or visit my ‘uncluttered’ website.
This week: Why most companies that try to do well by doing good can’t make it last; do we have the right to believe whatever we like?; an unabridged Oliver Sacks on steam engines, smartphones and fearing the future; the hidden benefits of hiring Jacks and Jills of all trades; how our intuitions may inadvertently fit us with blinkers; Bauhaus, one of the most transcendent and frustrating movements of the Modernist age; John Ruskin and the uses of ornament; a Brutalist house in Bali; and, finally, Grayson Perry’s ‘Default Man.’
The short life of enlightened leadership
“Most companies that try to do well by doing good can’t make it last,” James O’Toole argues in The Short Life of Enlightened Leadership (and How to Extend It), an adaptation of his recent book The Enlightened Capitalists: Cautionary Tales of Business Pioneers Who Tried to Do Well by Doing Good (HarperBusiness, 2019).
For nearly fifty years, O’Toole, a professor emeritus at USC Marshall School of Business, has studied the social performance of some 50 U.S. and British companies that could be deemed ‘enlightened’ — enterprises with unusually admirable organizational practices benefiting both their shareholders and society.
According to O’Toole, “The leaders of those companies attempted to address the world’s most deeply entrenched problems: unemployment, poverty, unsafe and unhealthy working conditions, low-quality goods, and environmental degradation — all while meeting the necessity of making a profit. Significantly, they sought to address social problems through their business practices and not through philanthropy. Their ethical and responsible acts were not add-ons, afterthoughts, or atonement for bad behavior; instead, they were integral to the way these companies did business, incorporated in how they made products and delivered services. To an unusual degree, those leaders consistently practiced what they espoused throughout their careers.”
Unfortunately, most companies that are engaged in “virtuous practices” do not endure in their original “virtuous” form. O’Toole’s research shows that after an enlightened capitalist founder is gone, successors quickly abandon the very practices that made the company both financially successful and publicly admired, particularly in economies characterized by an Anglo-American variety of laissez-faire shareholder capitalism. In fact, he found only a few notable exceptions among companies that are not family-owned: the John Lewis Partnership (a giant U.K. retailer), the American Cast Iron Pipe Company (ACIPCO, in Birmingham, Ala.), Lincoln Electric (a Cleveland-based manufacturer of arc welding machines), and W.L. Gore and Associates (the Maryland-headquartered producer of Gore-Tex). O’Toole calls these the “virtuous four.”
“Three interrelated factors appear to be critical for their long-term success: the carefully articulated business philosophies of their founders, their unusual governance structures, and their nontraditional forms of ownership. Leaders who are trying to build their own enlightened, successful businesses probably need to consider all three of these factors, or their vision seems far less likely to survive after they exit,” O’Toole writes.
Virtuous Business Philosophy
“The founders of these four companies were atypical leaders,” according to O’Toole. “They developed fully fleshed out business philosophies in which they identified higher purposes for their enterprises than simply making a profit. Their primary ethical value was respect for people. Whether that value was rooted in the religious Golden Rule or in the humanistic values of the 18th-century Enlightenment, they tried to use their organizations as vehicles for what Thomas Jefferson called the pursuit of happiness. Most unusually, they maintained a commitment to their values through good times and bad, creating sustainable business models buttressed by a strong corporate culture that institutionalized virtuous behaviors. That meant introducing organizational structures, legal strictures, and ownership bulwarks designed to ‘bake in’ attitudes and practices that would last for generations.”
“The history of socially responsible companies shows that when virtuous programs and policies exist primarily because an individual leader cares about them, his or her successors have no problem removing them. These practices are far more likely to last when they are institutionalized in rules of governance. Thus, a few enlightened capitalists have attempted, in one form or another, to institutionalize their practices in an organizational structure. Sometimes this involves a family business structure; sometimes, as with England’s 174-year-old The Economist, it involves a board of independent trustees charged with safeguarding its corporate and editorial independence. It can also rely on an independent trust or foundation that owns most of the company stock.”
O’Toole’s research shows that a good governance structure is necessary for long-term enlightened management; it is, however, not sufficient. Rather, it should be regarded as a prerequisite for a more potent element: control of company stock.
“After studying the stories of enlightened capitalists for the better part of my career, I believe that ownership is the most significant predictor of virtuous business practices and the key to their sustainability. As history demonstrates, the virtuous practices of such admired leaders as […] came to an end once they lost financial control of the organizations they founded. In contrast, the cultures of stewardship at [the “virtuous four”] have been sustained, in large part, because control of those companies has remained in the hands of the founder’s descendants and employees,” O’Toole writes.
“This was not by chance. All four founders had an aversion to public ownership of their company. James Lincoln, founder of Lincoln Electric, felt that the greatest threat to his people-oriented management practices came from Wall Street and the short-term dictates of the stock market. ‘The usual absentee stockholder contributes nothing to efficiency,’ he wrote. ‘He buys a stock today and sells it tomorrow. He often doesn’t even know what the company makes. Why should he be rewarded by large dividends?’”
But in the end, many founders of privately held businesses lose control. They either sell their company to a big corporation, as the Body Shop’s Anita Roddick did, or sell their shares to investors. “Even entrepreneurs who continue to own and manage their companies often find it necessary to sell equity in them to finance growth. That act often turns out to be a devil’s bargain: when founders’ (or their families’) ownership is diluted, they begin to lose control of how the business is managed, as the family that founded Marks & Spencer discovered. And when shares are bought and sold on financial markets, investors inevitably gain the upper hand; eventually founders and their families lose influence as their companies come to be led by professional managers. When that occurs, investor demand for short-term profit increases, and the sustainability of virtuous practices becomes imperiled. All this is reinforced by concepts related to shareholder primacy, which dictate that managers are simply the agents of stock owners, and no other constituency has any claim. When the purpose of a corporation is seen as only maximizing shareholder profit, enlightened capitalism — even when it is linked to long-term financial success — tends to fall by the wayside.”
“There have been an increasing number of calls for greater corporate social engagement in recent years. But when I studied the actual behavior of publicly traded companies, I was unable to find much real progress beyond executive rhetoric and relatively costless activities such as employee community volunteer programs and swapping out incandescent lightbulbs for LEDs. Some companies come under tremendous shareholder pressure to abandon (or greatly scale back) their social and environmental efforts. For example, Unilever — widely considered the world’s most socially responsible giant company — has struggled to keep its commitment to provide fair salaries and generous benefits to employees in the developing world. Another example is Whole Foods. CEO John Mackey (a cofounder of the conscious capitalism movement) was forced to sell his company to Amazon in a rushed effort to save its enlightened policies from the hands of activist investors, who wanted it sold to a conventionally managed supermarket chain.
Given these precedents, and a multitude of others, it is doubtful that many executives of publicly traded firms will buck Wall Street by introducing enlightened employee- and community-oriented practices that appear to diminish profits. If they do, they will run up against the same kind of investor headwinds that Mackey recently encountered. The silver lining of that cloud is the existence of alternative forms of ownership, such as the trusts, foundations, and employee ownership models used by the four companies described here, along with traditional family ownership and cooperatives.”
O’Toole believes, based on his reading of business history, that a pluralistic economy with a healthy mix of private, public, and nonprofit organizations offers the best prospect for prosperity and a just society. Does this mean that enlightened corporate leadership can be compatible with all the forms of corporate governance, including shareholder capitalism? On that question he agrees with the closing of the novel Pontoon by Garrison Keillor: “You get old and you realize there are no answers, just stories.”
The right to believe whatever you want
“Do we have the right to believe whatever we want to believe? This supposed right is often claimed as the last resort of the wilfully ignorant, the person who is cornered by evidence and mounting opinion: ‘I believe climate change is a hoax whatever anyone else says, and I have a right to believe it!’ But is there such a right?”, Daniel DeNicola, a professor of philosophy and the author of Understanding Ignorance: The Surprising Impact of What We Don’t Know (2017), wonders in You don’t have a right to believe whatever you want to.
“We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments, the grades I achieved at school, the name of my accuser and the nature of the charges, and so on. But belief is not knowledge. Beliefs are factive: to believe is to take to be true. Beliefs aspire to truth — but they do not entail it,” says DeNicola.
“Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant. Among likely candidates: beliefs that are sexist, racist or homophobic; the belief that proper upbringing of a child requires ‘breaking the will’ and severe corporal punishment; the belief that the elderly should routinely be euthanised; the belief that ‘ethnic cleansing’ is a political solution, and so on. If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.
Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions. Some beliefs, such as personal values, are not deliberately chosen; they are ‘inherited’ from parents and ‘acquired’ from peers, acquired inadvertently, inculcated by institutions and authorities, or assumed from hearsay. For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.”
“Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. [According to the American philosopher and psychologist, William James, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances, one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.] Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.
Believing, like willing, seems fundamental to autonomy, the ultimate ground of one’s freedom. But, as [the 19th-century mathematical philosopher William K Clifford] also remarked: ‘No one man’s belief is in any case a private matter which concerns himself alone.’ Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs — and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.”
The Machine Stops
The New Yorker published a wonderful short essay by Oliver Sacks, written shortly before his death in 2015.
Sacks writes about the parallels between what he sees around him and the world described by E.M. Forster in The Machine Stops. In this prescient short story from 1909, Forster imagines a future in which humans live in separate cells, communicating only by audio and visual devices.
Even at the end of his life, Sacks dares to hope that human life and its richness of cultures will survive. “We can pull the world through its present crises and lead the way to a happier time ahead,” he writes. “I have to believe in this — that mankind and our planet will survive, that life will continue, and that this will not be our final hour.”
Here is his unabridged essay, also entitled The Machine Stops.
“My favorite aunt, Auntie Len, when she was in her eighties, told me that she had not had too much difficulty adjusting to all the things that were new in her lifetime — jet planes, space travel, plastics, and so on — but that she could not accustom herself to the disappearance of the old. ‘Where have all the horses gone?’ she would sometimes say. Born in 1892, she had grown up in a London full of carriages and horses.
I have similar feelings myself. A few years ago, I was walking with my niece Liz down Mill Lane, a road near the house in London where I grew up. I stopped at a railway bridge where I had loved leaning over the railings as a child. I watched various electric and diesel trains go by, and after a few minutes Liz, growing impatient, asked, ‘What are you waiting for?’ I said that I was waiting for a steam train. Liz looked at me as if I were crazy.
‘Uncle Oliver,’ she said. ‘There haven’t been steam trains for more than forty years.’
I have not adjusted as well as my aunt did to some aspects of the new — perhaps because the rate of social change associated with technological advances has been so rapid and so profound. I cannot get used to seeing myriads of people in the street peering into little boxes or holding them in front of their faces, walking blithely in the path of moving traffic, totally out of touch with their surroundings. I am most alarmed by such distraction and inattention when I see young parents staring at their cell phones and ignoring their own babies as they walk or wheel them along. Such children, unable to attract their parents’ attention, must feel neglected, and they will surely show the effects of this in the years to come.
In his novel Exit Ghost, from 2007, Philip Roth speaks of how radically changed New York City appears to a reclusive writer who has been away from it for a decade. He is forced to overhear cell-phone conversations all around him, and he wonders, ‘What had happened in these ten years for there suddenly to be so much to say — so much so pressing that it couldn’t wait to be said? . . . I did not see how anyone could believe he was continuing to live a human existence by walking about talking into a phone for half his waking life.’
These gadgets, already ominous in 2007, have now immersed us in a virtual reality far denser, more absorbing, and even more dehumanizing. I am confronted every day with the complete disappearance of the old civilities. Social life, street life, and attention to people and things around one have largely disappeared, at least in big cities, where a majority of the population is now glued almost without pause to phones or other devices — jabbering, texting, playing games, turning more and more to virtual reality of every sort.
Everything is public now, potentially: one’s thoughts, one’s photos, one’s movements, one’s purchases. There is no privacy and apparently little desire for it in a world devoted to non-stop use of social media. Every minute, every second, has to be spent with one’s device clutched in one’s hand. Those trapped in this virtual world are never alone, never able to concentrate and appreciate in their own way, silently. They have given up, to a great extent, the amenities and achievements of civilization: solitude and leisure, the sanction to be oneself, truly absorbed, whether in contemplating a work of art, a scientific theory, a sunset, or the face of one’s beloved.”
“A few years ago, I was invited to join a panel discussion about information and communication in the twenty-first century. One of the panelists, an Internet pioneer, said proudly that his young daughter surfed the Web twelve hours a day and had access to a breadth and range of information that no one from a previous generation could have imagined. I asked whether she had read any of Jane Austen’s novels, or any classic novel. When he said that she hadn’t, I wondered aloud whether she would then have a solid understanding of human nature or of society, and suggested that while she might be stocked with wide-ranging information, that was different from knowledge. Half the audience cheered; the other half booed.
Much of this, remarkably, was envisaged by E. M. Forster in his 1909 story The Machine Stops, in which he imagined a future where people live underground in isolated cells, never seeing one another and communicating only by audio and visual devices. In this world, original thought and direct observation are discouraged — ‘Beware of first-hand ideas!’ people are told. Humanity has been overtaken by ‘the Machine,’ which provides all comforts and meets all needs — except the need for human contact. One young man, Kuno, pleads with his mother via a Skype-like technology, ‘I want to see you not through the Machine. . . . I want to speak to you not through the wearisome Machine.’
He says to his mother, who is absorbed in her hectic, meaningless life, ‘We have lost the sense of space. . . . We have lost a part of ourselves. . . . Cannot you see . . . that it is we that are dying, and that down here the only thing that really lives is the Machine?’
This is how I feel increasingly often about our bewitched, besotted society, too.
As one’s death draws near, one may take comfort in the feeling that life will go on — if not for oneself then for one’s children, or for what one has created. Here, at least, one can invest hope, though there may be no hope for oneself physically and (for those of us who are not believers) no sense of any “spiritual” survival after bodily death.
But it may not be enough to create, to contribute, to have influenced others if one feels, as I do now, that the very culture in which one was nourished, and to which one has given one’s best in return, is itself threatened. Though I am supported and stimulated by my friends, by readers around the world, by memories of my life, and by the joy that writing gives me, I have, as many of us must have, deep fears about the well-being and even survival of our world.
Such fears have been expressed at the highest intellectual and moral levels. Martin Rees, the Astronomer Royal and a former president of the Royal Society, is not a man given to apocalyptic thinking, but in 2003 he published a book called Our Final Hour, subtitled ‘A Scientist’s Warning: How Terror, Error, and Environmental Disaster Threaten Humankind’s Future in This Century — on Earth and Beyond.’ More recently, Pope Francis published his remarkable encyclical ‘Laudato Si’,’ a deep consideration not only of human-induced climate change and widespread ecological disaster but of the desperate state of the poor and the growing threats of consumerism and misuse of technology. Traditional wars have now been joined by extremism, terrorism, genocide, and, in some cases, the deliberate destruction of our human heritage, of history and culture itself.
These threats, of course, concern me, but at a distance — I worry more about the subtle, pervasive draining out of meaning, of intimate contact, from our society and our culture. When I was eighteen, I read Hume for the first time, and I was horrified by the vision he expressed in his eighteenth-century work A Treatise of Human Nature, in which he wrote that mankind is ‘nothing but a bundle or collection of different perceptions, which succeed each other with an inconceivable rapidity, and are in a perpetual flux and movement.’ As a neurologist, I have seen many patients rendered amnesic by destruction of the memory systems in their brains, and I cannot help feeling that these people, having lost any sense of a past or a future and being caught in a flutter of ephemeral, ever-changing sensations, have in some way been reduced from human beings to Humean ones.
I have only to venture into the streets of my own neighborhood, the West Village, to see such Humean casualties by the thousand: younger people, for the most part, who have grown up in our social-media era, have no personal memory of how things were before, and no immunity to the seductions of digital life. What we are seeing — and bringing on ourselves — resembles a neurological catastrophe on a gigantic scale.
Nonetheless, I dare to hope that, despite everything, human life and its richness of cultures will survive, even on a ravaged earth. While some see art as a bulwark of our collective memory, I see science, with its depth of thought, its palpable achievements and potentials, as equally important; and science, good science, is flourishing as never before, though it moves cautiously and slowly, its insights checked by continual self-testing and experimentation. I revere good writing and art and music, but it seems to me that only science, aided by human decency, common sense, farsightedness, and concern for the unfortunate and the poor, offers the world any hope in its present morass. This idea is explicit in Pope Francis’s encyclical and may be practiced not only with vast, centralized technologies but by workers, artisans, and farmers in the villages of the world. Between us, we can surely pull the world through its present crises and lead the way to a happier time ahead. As I face my own impending departure from the world, I have to believe in this — that mankind and our planet will survive, that life will continue, and that this will not be our final hour.”
And also this …
“The danger of over-specialisation, though, is that ‘you become very narrow in your skill set and tribal in your attitude.’ As employees dive deeper into a specific area, they lose perspective and find they qualify only for ever more tightly drawn roles,” writes Andrew Hill in The hidden benefits of hiring Jacks and Jills of all trades.
This concern isn’t new, though. “John Ruskin, the Victorian polymath whose bicentenary is this year, advocated a school curriculum spanning science, art and handicrafts. He took issue with the obsession with arid mathematical prowess, based, he wrote, on ‘the notion that every boy is to become first a banker’s clerk and then a banker.’”
According to Hill, polymaths are more M-shaped than T-shaped. They must demonstrate mastery in at least three different domains, which makes them rarities. For example, only two Nobel Prize winners have won a second prize in a different domain: Linus Pauling for chemistry and peace, and Marie Curie for physics and chemistry.
But polymathic range is less unusual. “We are all born with it, for one thing, and it should be celebrated and cultivated. It may even be a pre-requisite for excellence in a single field. A study by psychologist Bernice Eiduson, cited in The Polymath, found Nobel science laureates were 25 times as likely as the average scientist to sing, dance or act, and 17 times as likely to be a visual artist.”
“Some people evolve into master generalists by necessity. Writer Maya Angelou pulled herself up from her tough childhood, becoming a civil rights activist, foreign correspondent, linguist, historian, dancer, singer, actor and film director. Others choose a multi-faceted career. Nathan Myhrvold quit the plum job of chief technology officer at Microsoft and went on to pursue interests in cookery, inventions, volcanology and wildlife photography. The pressure to specialise usually drives people in the opposite direction: towards neglect of hobbies, withering of skills, stagnation of talent, and wilful ignorance of wider opportunities.
Those who persist with polymathy risk being labelled as dilettantes. Yet you do not need to be an Angelou, a Myhrvold or a Ruskin to reap the advantages of being a master generalist. Studies by Cláudia Custódio of Imperial College Business School [et al] found that generalist chief executives earned more and fostered more innovation than more specialised counterparts. Wide experience gave them more job options — and therefore probably a greater tolerance for risk — and helped them ‘bring more diverse knowledge’ to their companies.
‘Polymath’ is a term often applied in obituaries and biographies of people known for a single speciality. It is as though the value of breadth is clear only when surveying a whole life posthumously. That is a tragedy, because the average narrow-minded specialist usually does not merit an obituary at all.”
Recommended further reading: The Neo-Generalist, by Richard Martin and Kenneth Mikkelsen (LID Publishing, 2016).
“Our intuitions may inadvertently fit us with blinkers,” writes the behavioural economist Koen Smets in Blinkers and intuitions. Statistics and figures give instant credence to “something” but “not necessarily to your intuition.”
“There is an (apocryphal?) WWI story around the introduction of tin Brodie helmets in the British Army, to provide the soldiers with better protection against flying shrapnel. However, when the number of injured soldiers brought in with head wounds was counted, the data revealed that, surprisingly, it had gone up by a large percentage, instead of down. Did this mean those who doubted that the helmets would be any better than the cloth caps at protecting the soldiers’ skulls were right? Of course not. What happened is that fewer soldiers died on the spot from their head injuries, and more of them survived (but with injuries).”
“Another classic (and true) war story illustrates the same kind of intuitive blinker effect. In WWII the allied forces sought to minimize losses of aircraft to enemy fire. Researchers had been studying the damage to the aircraft that managed to return, and recommended that they be reinforced where the damage was worst. That was what their intuition told them, and what the data appeared to confirm: what was the point of strengthening the fuselage and the wings where there were no bullet holes? But Abraham Wald, a statistician, stopped them in their tracks. The holes, he pointed out, were exactly where the aircraft were the strongest: they were clearly capable of flying home even with the damage. It was the locations where the surviving planes were unharmed that needed reinforcement: the planes that got hit there didn’t make it back.
Intuition is a good thing to have, but we must be careful not to allow it to fit us with blinkers. We may believe we know the answer and have the data to support it, but it’s worth checking whether it is the answer to the right question. And for that, we should take off our blinkers.”
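Both stories turn on the same selection effect: the records only capture those who survive long enough to be counted. A minimal Python sketch of the helmet story makes the mechanism explicit — all probabilities here are invented for illustration, not historical figures:

```python
import random

def field_hospital_counts(n_soldiers, p_head_hit, p_fatal, seed=0):
    """Simulate one cohort. A soldier hit in the head either dies on the
    spot (and is never recorded as 'injured') or reaches the field
    hospital and enters the injury statistics.
    Returns (recorded_head_injuries, deaths)."""
    rng = random.Random(seed)
    injured = deaths = 0
    for _ in range(n_soldiers):
        if rng.random() < p_head_hit:      # soldier takes a head hit
            if rng.random() < p_fatal:     # killed outright: invisible to the data
                deaths += 1
            else:                          # survives: shows up as a head wound
                injured += 1
    return injured, deaths

# Cloth caps: most head hits are fatal. Brodie helmets: far fewer are.
# Same seed means the same soldiers are hit in both scenarios.
cap_injured, cap_deaths = field_hospital_counts(100_000, 0.05, 0.80, seed=1)
helmet_injured, helmet_deaths = field_hospital_counts(100_000, 0.05, 0.30, seed=1)

print(f"cloth caps: {cap_injured} recorded head wounds, {cap_deaths} deaths")
print(f"helmets:    {helmet_injured} recorded head wounds, {helmet_deaths} deaths")
```

With the helmets, recorded head wounds go up even though deaths go down: the helmets convert fatalities into counted injuries. The raw injury count answers the question “how many head-hit soldiers survived?”, not the question that matters, “how many soldiers were hit in the head?”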
A century after its founding, the Bauhaus, the German school of art and architecture, remains one of the most transcendent — and frustrating — movements of the Modernist age.
“The history of the Bauhaus is […] also a history of its controversies, false starts and failures,” Nikil Saval writes in How Bauhaus Redefined What Design Could Do for Society.
“Directors failed to maintain order, politics overran the school, women were consistently subordinated. It is also a history in which design as a social concern gave way to design as the styling of consumer goods. But it is also a history of other schools, with which it was contemporary and to which it gave birth. The poet Rabindranath Tagore’s Visva-Bharati University, founded in Santiniketan in rural West Bengal, India, in 1921, bears comparison with the Bauhaus. (Tagore, who visited the school on a trip to Europe that year, also helped organize a 1922 exhibition in Calcutta featuring artists from the Bauhaus and the Indian avant-garde.) So, too, does Black Mountain College in Asheville, N.C., founded in 1933, the year the Bauhaus closed, where Josef and Anni Albers taught. Max Bill, a former Bauhaus student, co-founded the Ulm School of Design in 1953 in West Germany, which collaborated early on with the German manufacturing company Braun, whose Dieter Rams-designed products directly influenced Jony Ive, the chief designer of Apple. Which brings us back via a commodius vicus to design as the styling of consumer goods.
Why did things end up there? After all, the Bauhaus began as a protest against the thoughtless direction of industrialization, the harm it did to mind and spirit. ‘Only work which is the product of an inner compulsion can have spiritual meaning,’ [Walter Gropius] wrote in 1923. ‘Mechanized work is lifeless, proper only to the lifeless machine … The solution depends on a change in the individual’s attitude toward his work.’ But Gropius was also intent on partnering with German industry to market Bauhaus products; under [Hannes Meyer’s] directorship, the Bauhaus actually became profitable through its commercial partnerships.”
“The triumph, after the war, of the Bauhaus as a style and a brand was almost inversely proportional to its failure as a social program. Bauhaus furniture and objects became marketable, Bauhaus architecture a cuboid product available to anyone. Bauhaus became one more form of enabling the growth of consumer society, with its microgradations of taste corresponding to class and status. One of the high (or low) points was the exhibition of a model house by the Bauhaus alumnus Marcel Breuer in the sculpture garden of the Museum of Modern Art in 1949: An early blockbuster exhibition in MoMA’s history, it also betrayed the spirit of the school by showing a house that was far too expensive for most working-class Americans. (John D. Rockefeller Jr. bought the actual house that was exhibited and used it as a guesthouse on his estate in Pocantico Hills, N.Y.) This was a trend observable even to Bauhaus contemporaries. In a 1930 essay titled Ten Years of the Bauhaus, the Hungarian art theorist Erno Kallai, who edited the Bauhaus journal under Meyer, laconically telegraphed the standardization of form at the expense of content: ‘Tubular steel armchair frames: Bauhaus style. Lamp with nickel-coated body and a disk of opaque glass as lampshade: Bauhaus style. Wallpaper patterned in cubes: Bauhaus style. No painting on the wall: Bauhaus style.’”
“In Dessau, the complex known as the Laubenganghäuser, and usually translated with dogged literalness as the Houses With Balcony Access, reflects the humane principles of reproducible workers’ housing. Designed under Hannes Meyer, these, like the ADGB Trade Union School, were true Bauhaus buildings, conceived and executed under the collective imprimatur of the Bauhaus. Three-story brick buildings, with apartments linked by long-running balconies — a precursor to the ‘streets in the sky’ of later British social housing complexes — the Laubenganghäuser cram a number of amenities into small spaces. Kitchen cabinets are hidden behind sliding doors; bright shades of maroon and mauve enliven the otherwise incredibly tight quarters, which give off an impression of openness and space. A current resident recently told Berlin’s Monopol magazine that because there aren’t ‘sterile corridors in the building,’ residents ‘spend a lot of time outdoors, so neighbors often come into contact with each other. It feels more like a community than an apartment building, and many people have become friends.’ Here was a vision of the Bauhaus’s potential beyond consumer society, beyond the rule of markets and private property — one in which collective provision defeats private greed, and in which strangers are made to feel welcome as members of a group. Had history not intervened, there might have been more of them in Dessau: nearly anonymous testaments to the ideals of the old, fractious, continuously fascinating school, present only by implication, as its students and teachers wanted, in everyday life.”
In The Uses of Ornament (from The Seven Lamps of Architecture, 1849), John Ruskin argues that ornament belongs to the essence of architecture rather than being a robe put on from outside; beyond directly revealing humanity’s relationship with God, it reflects the internal beauty of architecture. For Ruskin, ornament is the primary part of architectural construction: the grandeur of a building shows not in its constructive perfection but in the quality of its ornament and painting.
“Must not beauty, then, it will be asked, be sought for in the forms which we associate with our every-day life? Yes, if you do it consistently, and in places where it can be calmly seen; but not if you use the beautiful form only as a mask and covering of the proper conditions and uses of things, nor if you thrust it into the places set apart for toil. Put it in the drawing-room, not into the workshop; put it upon domestic furniture, not upon tools of handicraft. All men have sense of what is right in this matter, if they would only use and apply that sense; every man knows where and how beauty gives him pleasure, if he would only ask for it when it does so, and not allow it to be forced upon him when he does not want it. Ask any one of the passengers over London Bridge at this instant whether he cares about the forms of the bronze leaves on its lamps, and he will tell you, No. Modify these forms of leaves to a less scale, and put them on his milk-jug at breakfast, and ask him whether he likes them, and he will tell you, Yes. People have no need of teaching, if they could only think and speak truth, and ask for what they like and want, and for nothing else; nor can a right disposition of beauty be ever arrived at except by this common-sense, and allowance for the circumstances of the time and place. It does not follow, because bronze leafage is in bad taste on the lamps of London Bridge, that it would be so on those of the Ponte della Trinità; nor because it would be a folly to decorate the house fronts of Gracechurch Street, that it would be equally so to adorn those of some quiet provincial town. The question of greatest external or internal decoration depends entirely on the conditions of probable repose. It was a wise feeling which made the streets of Venice so rich in external ornament; for there is no couch of rest like the gondola. 
So again, there is no subject of street ornament so wisely chosen as the fountain, where it is a fountain of use; for it is just there that perhaps the happiest pause takes place in the labour of the day, when the pitcher is rested on the edge of it, and the breath of the bearer is drawn deeply, and the hair swept from the forehead, and the uprightness of the form declined against the marble ledge, and the sound of the kind word or light laugh mixes with the trickle of the falling water, heard shriller and shriller as the pitcher fills. What pause is so sweet as that — so full of the depth of ancient days, so softened with the calm of pastoral solitude?”
A Brutalist Tropical Home, by architectural studio Patisandhika and designer Dan Mitchell, sits in a small valley nestled within rice fields on the south coast of Bali, Indonesia. Exaggerated structural slabs extend horizontally from its exterior to shade the living room, which is fronted by double-height glazing.
The double-height living room forms the heart of the house, and is flanked by split-levels that Patisandhika and Mitchell modelled on Kappe Residence — a geometric house designed and lived in by Modernist architect Ray Kappe in Los Angeles.
“Ray Kappe is a huge inspiration for us. To be able to see spaces from angles that you could not in a conventional house with walls gives a completely different sense of space and feeling,” Dan Mitchell said.
“Default Man feels he is the reference point from which all other values and cultures are judged. Default Man is the zero longitude of identities.
He has forged a society very much in his own image, to the point where now much of what other groups think and feel is the same. They take on the attitudes of Default Man because they are the attitudes of our elders, our education, our government, our media. If Default Men approve of something it must be good, and if they disapprove it must be bad, so people end up hating themselves, because their internalised Default Man is berating them for being female, gay, black, silly or wild.” — Grayson Perry in The rise and fall of Default Man (The New Statesman, 2014)