Post scriptum is a weekly curation of my tweets. It is, in the words of the 16th-century French essayist and philosopher, Michel de Montaigne, “a posy of other men’s flowers and nothing but the thread that binds them is mine own.”
This is the first of three abbreviated Summer editions of Post scriptum with only a handful of recommended reads. If you have time to spare and want more to read, you can always follow me on Twitter.
The dangerous populist science of Yuval Noah Harari
Yuval Noah Harari is a gifted storyteller and popular speaker. But he sacrifices science for sensationalism, and his work is riddled with errors, Darshana Narayanan writes in The Dangerous Populist Science of Yuval Noah Harari.
“Times are tough, and we are — all of us — looking for answers to literal questions of life and death. Will humans survive the coming waves of pandemics and climate change? Do our genes contain the key to understanding everything about us? Will technology save us, or will it destroy us? The desire for a wise guide — a sort of prophet who boldly leaps across multiple disciplines to provide simple, readable, confident answers, tying it all together in page-turning stories — is understandable. But is it realistic?
It scares me that, to many, this question appears to be irrelevant. Harari’s blockbuster, Sapiens, is a sweeping saga of the human species — from our humble beginnings as apes to a future where we will sire the algorithms that will dethrone and dominate us. Sapiens was published in English in 2014, and by 2019, it had been translated into more than 50 languages, selling over 13 million copies. Recommending the book on CNN in 2016, President Barack Obama said that Sapiens, like the Pyramids of Giza, gave him ‘a sense of perspective’ on our extraordinary civilization. Harari has published two subsequent bestsellers — Homo Deus: A Brief History of Tomorrow (2017), and 21 Lessons for the 21st Century (2018). All told, his books have sold over 23 million copies worldwide. He might have a claim to be the most sought-after intellectual in the world, gracing stages far and wide, earning hundreds of thousands of dollars per speaking appearance.”
“We have been seduced by Harari because of the power not of his truth or scholarship but of his storytelling. As a scientist, I know how difficult it is to spin complex issues into appealing and accurate storytelling. I also know when science is being sacrificed to sensationalism. Yuval Harari is what I call a ‘science populist.’ (Canadian clinical psychologist and YouTube guru Jordan Peterson is another example.) Science populists are gifted storytellers who weave sensationalist yarns around scientific ‘facts’ in simple, emotionally persuasive language. Their narratives are largely scrubbed clean of nuance or doubt, giving them a false air of authority — and making their message even more convincing. Like their political counterparts, science populists are sources of misinformation. They promote false crises, while presenting themselves as having the answers. They understand the seduction of a story well told — relentlessly seeking to expand their audience — never mind that the underlying science is warped in the pursuit of fame and influence.
In this day and age, good storytelling is more necessary, but riskier, than ever before, particularly when it comes to science. Science informs medical, environmental, legal, and many other public decisions, as well as our personal opinions on what to be wary about and how to lead our lives. Important societal and individual actions depend on our best understanding of the world around us — now more than ever, with the plague in all our houses, and the worst yet to come with climate change.
It is time to subject our Populist Prophet, and others like him, to serious scrutiny,” says Narayanan. And that is precisely what she does.
“Harari has seduced us with his storytelling,” Narayanan concludes, “but a close look at his record shows that he sacrifices science to sensationalism, often makes grave factual errors, and portrays what should be speculative as certain. The basis on which he makes his statements is obscure, as he rarely provides adequate footnotes or references and is remarkably stingy with acknowledging thinkers who formulated the ideas he presents as his own. And most dangerous of all, he reinforces the narratives of surveillance capitalists, giving them a free pass to manipulate our behaviors to suit their commercial interests. To save ourselves from this current crisis, and the ones ahead of us, we must forcefully reject the dangerous populist science of Yuval Noah Harari.”
The biggest problem with remote work
Working from anywhere has been successful for veteran employees in defined roles with trusted colleagues, but it is much harder for new workers, new groups, and new ideas to get revved up, Derek Thompson writes in The Biggest Problem With Remote Work. He nevertheless believes the corporate world can solve these problems, because other industries already have.
“[M]id-1800s companies had to invent an entirely new system of organizing work. They needed a new layer of decision makers who could steer local production and distribution businesses. A new species of employee was born: the ‘middle manager.’
‘As late as 1840, there were no middle managers in the United States,’ Alfred Chandler observed in The Visible Hand, his classic history of the rise of America’s managerial revolution. In the early 1800s, all managers were owners, and all owners were managers; it was unheard of for somebody to direct employees without being a partner in the company. But once ownership and management were unbundled, new kinds of American companies were made possible, such as the department store, the mail-order house, and the national oil and steel behemoths.
In the 1800s, new technology allowed U.S. companies to extend their distribution and production tentacles across the continent, necessitating a new class of worker. Today’s hybrid companies, similarly extended across the country and even around the world, need to invent a new role to remain competitive and sane. This role would determine what work was ‘hard work’ that could be done asynchronously and from anywhere, and what necessary ‘soft work’ would require people to be in an office at the same time. Based on a comprehensive understanding of total workflow and team dynamics, this person would develop and constantly update a plan of who needs to be in the office, and on what days, and where they sit, and why they are there in the first place.”
“Operations teams at many companies are already doing some of this work. Often these teams are spread across multiple challenges that preexisted the pandemic — like recruiting, IT, office maintenance, and normal pre-pandemic communications. For these stressed and overstretched workers, coordinating the perfect hybrid cadence is the third priority for five different people. But managing a remote or hybrid workflow is too important to sprinkle onto old positions. It’s a discrete task, with discrete challenges, which deserves a discrete job.
The synchronizer — or, for large companies, a team of synchronizers — would be responsible for solving the new-worker, new-group, and new-idea problems. Synchronizers would help new workers by ensuring that their managers, mentors, and colleagues are with them at the office during an early onboarding period. They would plan in-person time for new teammates to get to know one another as actual people and not just abstracted online personalities. They would coordinate the formation of new groups to tackle new project ideas, the same way that modern teams in science pull together the right researchers from around the world to co-author new papers. They would plan frequent retreats and reunions across the company, even for workers who never have to be together, with the understanding that the best new ideas — whether in science, consulting, or media — often come from the surprising hybridization of disparate expertise.
The remote-work debate has become deeply polarized between people who consider it a moral necessity that is beyond criticism and those who consider it a culture-killer that is beyond fixing. Like the office, remote work will never go away, and like the office, it has important problems that deserve our attention. Solving remote work’s problems is a job worth doing.”
Drawing and thinking
“Learning the truth just by thinking, as one does in philosophy, is analogous to learning to see a face better by drawing it, in at least the sense that they require analogous attitudes of humility,” Michael Thorne writes in Drawing and Thinking.
“The first step towards interpreting this analogy is to explain what, in the practice of philosophy, the observational drawing of an object could be analogous to. An answer that chimes with much of Ludwig Wittgenstein’s later work is that observational drawing is analogous to describing the use of words. Perhaps Wittgenstein is suggesting that this activity requires an attitude of humility in which one relinquishes one’s confidence that one already knows how words are used, so that one’s descriptions of the uses of words can be fully guided by their actual use.
In fact, Wittgenstein explicitly calls for something very like this kind of humility in his discussion, in Part II of the Philosophical Investigations, of the concept of the ‘state of seeing’ something. He exhorts us: ‘Do not think you knew in advance what “state of seeing” means here. Let the use teach you the meaning.’
Why would Wittgenstein ask his reader to adopt this kind of humility? More generally, why might we think that this kind of humility is necessary in philosophy for learning the truth by thinking? An example may help us here.
Suppose we want to understand what it means to be healthy. Since ‘healthy’ is an everyday term, we might confidently believe that we already know how it is used, and we might say something like: to be healthy is to be free from illness or injury. Yet saying this would be like drawing a face without really looking at it first, and placing the eyes three-quarters of the way up the head. As Aristotle noticed (Metaphysics IV.2), not only do we call people healthy; we also apply the term to things that produce health (like certain diets), as well as symptoms of health (like certain complexions). But when we do this, we do not mean that these diets or complexions are free from illness or injury. In our confidence that we already knew how ‘healthy’ was used, we failed to be guided by the actual use of the word.
There is something surprising about the idea that we might need this kind of humility when describing the use of words that we ourselves competently use. How could we fail to know how they are used and yet remain competent users? The analogy with drawing makes this slightly less puzzling: I see faces all the time, yet I still thought the eyes should be drawn three-quarters of the way up the head. In both cases, a kind of familiarity with something is not enough to enable us to make an accurate copy of it, linguistically or pictorially. The question of what exactly is going on in such cases is beyond this discussion, though an answer might begin by noting the aptness of David Searcy’s claim to know ‘in a background sort of way’ what he failed to copy ‡.
Another question beyond this discussion is whether the kind of learning we have been concerned with — one that involves describing the use of words — is the only kind of learning that happens in philosophy. Here I merely claim that, at the very least, this is one kind of learning that happens in philosophy. Moreover, the learning that involves describing the uses of words seems to be a common and important kind of learning in philosophy. As Aristotle’s example is intended to show, it would be wrong to think that it is somehow confined to an era of ‘ordinary language philosophy’ that we have outgrown.
Resistance to the thought that descriptions of the uses of words have an important place in philosophy connects with a broader resistance, widespread today, to Wittgenstein’s views on the nature of philosophy. I would like to close with one remark on this.
The resistance is by no means incomprehensible. Many of Wittgenstein’s statements on the nature of philosophy, such as the famous remark that ‘a philosopher treats a question; like an illness’ (Philosophical Investigations §255), seem to present the idea that the most a philosopher can achieve is a kind of ‘unlearning.’ On this view, philosophy has the purely negative value of ridding us of misleading pictures of mind, language, logic, knowledge, and so on, which the surface features of our language suggest to us, and which lead us into confusion. The resistance is comprehensible in that it is hard not to feel dissatisfied with such a pessimistic view, or even to feel that it constitutes something of an insult to the countless creative minds whose efforts over the centuries have come to define the discipline of philosophy.
The remark I have been discussing suggests that this interpretation of Wittgenstein’s meta-philosophy needs to be nuanced. Here is Wittgenstein, apparently in self-examination, seeking to understand, not how he is ‘unlearning,’ but how he is learning. So perhaps Wittgenstein’s position does not deny the genuine achievements in learning that can happen in philosophy. Perhaps he would simply ask us to keep in mind how unusual this kind of learning is.”
‡ “Oh, but see if you look at the grass, it isn’t really solid green like that. You know? (Perhaps I knew. Perhaps in a background sort of way, as one knows there is death and history and other imponderables. But of course it’s green. We all know grass is green.) But look. You see? […] Look at the colours. Look at all the browns and yellows. See if you can draw that, won’t you? See if you can draw it as it is.” — David Searcy, from Shame and Wonder
Technology transfer across the ages
“There are countless instances throughout history of what could be called intellectual property theft — from the appropriation of the technology of Chinese silk manufacture in Byzantium to beer-brewing or porcelain recipes in Europe,” Daniel T. Potts writes in Technology transfer across the ages.
“When archaeological evidence clearly demonstrates that, chronologically, a certain technological innovation occurred in one place and, a thousand years later, it is found in another, distant locale, we are certainly justified in positing the origin of the technology in one core area and its subsequent diffusion elsewhere. Whether we are talking about bronze, which appeared in the Near East long before it began to be made in China, or cotton, which is attested in India millennia before it appeared in the Near East, most scholars are comfortable with presuming that one area was an original hearth of invention from which a given technology spread to other parts of Eurasia. The question is, however, by what means?
Itinerant craftsmen are well evidenced in the ancient Near East and one can conjure up any number of scenarios that might help us understand how their movement resulted in the concomitant spread of a particular technology. Gold filigree, for instance, was present in Georgia long before examples of it appear in the Royal Cemetery of Ur in southern Mesopotamia, c.2500 BC. Could some craftsmen have found their way from the Caucasus to what is today southern Iraq, introducing a new technology in jewellery manufacture for an élite clientele? The likelihood of this happening over vast distances within the lifetime of a single craftsman seems slim. However, the time-honoured, if somewhat discredited, notion of ‘diffusion,’ once a panacea for all the inexplicable, seemingly coeval appearances of technologies across Eurasia, has something to offer in cases such as these. In the past, prehistorians rejected diffusion because it was invoked as a mechanism of contact in ways that now seem somewhat laughable. Every time the same motif appeared on pottery found thousands of kilometres apart, it was attributed to diffusion, going in the direction from the earlier to the later manifestation of the design. In the case of some technologies, critics of diffusionism argued, it was more logical to assume independent invention in two or more different places, rather than a speculative, chronologically implausible case of diffusion.
In fact, there are other ways in which technologies may diffuse and people, including craftsmen, may move across the landscape. Already in prehistoric times the presence of hugely dissimilar, painted pottery in one and the same small settlement suggests that exogamy may have served as a mechanism for the diffusion of alien motifs and even technologies. If, as has been demonstrated in ethnographic studies conducted all over the world, most pre-modern, pre-industrialised pottery manufacture was done by women, and exogamous marriage combined with patrilocal residence patterns were in place — in other words, women from one village, when they married, moved to the village of their husband — then it is easy to see how decorative motifs, and even shapes and pottery production techniques, could have been brought from one place to another as female potters were transplanted from their home villages to those of their mates.
In [this and other examples] we must reckon with a sliding archaeological timescale, from the earliest attestation of a given technology to the later manifestations of it. Yet such diffusion need not have occurred overnight. In some cases, it was surely an incremental process by which technologies and the advantages they brought with them were carried across short distances, through adjacent polities, until, after a few centuries or a millennium, they were to be found in widely separated areas, geographically speaking. In this sense, diffusion is a perfectly respectable mechanism to invoke in studies of technology transfer.
Yet another issue that is rarely mentioned in studies of technology in the ancient Near East is that of different states of technological sophistication in different areas. Technologies serve us in a variety of ways and can become essential without our even realising it. But not all societies or their members have the same technological needs. The military, for example, as noted in the examples cited above, had an appetite for certain kinds of technology that differed from the members of a village community. Urban societies had needs that rural ones did not. From an archaeological perspective, these differing needs contributed to a very different material culture ‘signature.’ In other words, the sorts of finds made in excavating an urban setting may differ from those found in a rural one. A rural community may appear ‘Neolithic’ compared to an urban one. Two contemporaneous sites — one urban and the other rural — may differ to such an extent that the rural one appears stuck in the Stone Age while a few kilometres away people were living in utterly different conditions that appear more sophisticated and technologically advanced. This is not to cast aspersions on rural members of a society, for Neolithic societies, if by that we mean early agriculturalists and herders, used just as much technology as their Bronze or Iron Age urban-dwelling brethren. They simply used different technologies.
The very terms we employ to describe the broad phases of the human past in Eurasia — Stone Age, Bronze Age, Iron Age — are inherently technology-based. What the so-called three-age system (devised by Christian Jürgensen Thomsen (1786–1865) c.1818 as a means of organising the collection in the Oldnordisk Museum in Copenhagen) failed to capture was the simultaneity of different technological systems. It is now clear, however, thanks to absolute methods of dating like radiocarbon, that hunters, gatherers and farmers, using what might be termed Stone Age technologies, lived side by side with town and city dwellers making use of technologies that might be more commonly ascribed to the Bronze or Iron Age. It is also clear that, in some areas, bronze tools and weapons continued in use for centuries after some neighbouring regions had moved from bronze to iron for most of their needs, while other contemporary communities — hunters, gatherers, fishermen, farmers — may have continued to use flaked stone tools that, taken out of context and without reliable means for their dating, could easily be mistaken for objects of Palaeolithic type from the Stone Age, thousands of years older.
What we see, therefore, is abundant evidence of the contextually dependent adoption of technology in ancient Near Eastern societies. Communities living in proximity to each other may have used very different sorts of technology, giving the appearance that one was still in the Stone Age while another was more ‘advanced’. The broad-brush approach, like Thomsen’s three-age system, led to a layer cake image of unidirectional technology uptake that is far too simplistic. The explosion of archaeological exploration in the Near East since the Second World War has resulted in the acquisition of a wealth of data from what were once considered marginal or peripheral zones. In reality, the nineteenth- and early-twentieth-century focus on high-profile ‘centres’ of civilisation has given way to a much more truthful, accurate picture of life throughout the multiplicity of environmental zones that characterise the region. This picture shows us that humankind did not move in lockstep up an imaginary ladder of technological evolution. Technologies were adopted according to differing needs. Whether it was the proximity of abundant copper sources and the technical efficacy of bronze tools that caused the retention of bronze technology in the Persian Gulf, while contemporary communities in Assyria and Anatolia adopted iron, or the military imperative of Hittite and Egyptian armies that fuelled evolution in chariotry, technology was an independent variable in human existence that reflected the needs of the individuals, groups and communities employing it. What scholars have often labelled ‘conservatism’ with reference to technology — making a moral judgement upon those who seemingly shunned new technology out of a kind of ignorant or perverse unwillingness to change — can be shown to reflect a deep understanding of exactly what a given technology does and why it is implemented. Change and innovation in ancient technology are abundantly attested and so, too, is continuity. 
Technological praxis, learned experience and centuries-long tradition are all entwined in strategies of technological deployment in ancient societies that we, as modern students of those societies, must seek to understand, not to judge.”
Antico Setificio Fiorentino
“You are not just buying a fabric. You are also receiving a part of my heart. This is the real difference between an artisanal textile and one made industrially.”
In A Glimpse Inside a Florentine Silk-Weaving Workshop, Susan Wright writes about the Antico Setificio Fiorentino, one of the last remaining workshops for silk manufacturing in the world.
“Silk was introduced to Italy by Catholic missionaries working in China around the year 1100. The art of silk weaving and sericulture in Tuscany flourished in the 14th century; the main production was in Lucca, though it soon expanded to Florence, Venice and Genoa.
At peak production, there were around 8,000 looms operating in Florence. Today only a handful of those remain, eight of which are in production in the Antico Setificio Fiorentino. (Those eight looms were donated by noble families in the 1700s.) In total, the mill houses 12 looms, including the more recent semi-mechanical machines.
At the heart of the silk mill is a machine called a warper, which prepares warp yarns to be used on a loom. This particular warper, designed to operate vertically, was built in the early 19th century, according to original drawings made by Leonardo da Vinci in 1485.
‘We use it in the way that it was designed — powered by hand,’ said Fabrizio Meucci, the technician and restorer at the workshop. ‘It’s not just there for its beauty,’ Mr. Meucci added, describing the workshop as a ‘living and working mill that looks like a museum.’
It’s mesmerizing to watch Leonardo’s warper machine in motion, spinning and perfectly aligning warp threads from a row of twirling spools onto the creel, which gathers the precious threads. These warp threads are then used to weave trims, ribbons, cords and braiding — used for everything from upholstery, furnishings, and bed and bath linens to fashion clothing and accessories.”
“In ecological economics, we’ve tried to make a distinction between development and growth. When something grows, it gets bigger physically by accretion or assimilation of material. When something develops, it gets better in a qualitative sense. It doesn’t have to get bigger. An example of that is computers. You can do fantastic computations now with a small material base in the computer. That’s real development. And the art of living is not synonymous with ‘more stuff.’ People occasionally glimpse this, and then we fall back into more, more, more.” — Herman Daly in This Pioneering Economist Says Our Obsession With Growth Must End (The New York Times, July 17, 2022)