Working Notes of a Practising Neo-Generalist (#14) — On bubbles and the need for cognitive diversity

Mark Storm
12 min read · Aug 18, 2017


“If anyone can refute me — show me I’m making a mistake or looking at things from the wrong perspective — I’ll gladly change. It’s the truth I’m after, and the truth never harmed anyone. What harms us is to persist in self-deceit and ignorance.” — Marcus Aurelius in Meditations, Book 6:21 (translated by Gregory Hays, The Modern Library, New York)

On bubbles and the need for cognitive diversity

One of the things that struck me the most while watching the first part of the BBC documentary Secrets of Silicon Valley was how Sam Altman, the President of Y Combinator, replied to Jamie Bartlett:

[JB] I think it’s an important job for journalists to try to ask about the negative possibilities of this stuff.

By “this stuff,” Bartlett meant the utopian and seemingly inescapable dream that Altman and many of his Silicon Valley contemporaries have of the future.

[SA] “I think if you continue this thrust of, shouldn’t we stop progress, no-one’s going to take you seriously because people want this stuff, and people don’t think we should still have people in poverty. People don’t think that we should take away our iPhones and take away Facebook. So I think you can add a really important voice, but I worry you’re going in the wrong direction with this, like, anti-progress angle.”

That, and the long and chilling stare he gave Bartlett immediately after.

“Technology magnifies differences, and it’s been replacing or obviating jobs for a long time. But what happens as that case accelerates? I’m not one of these doomsayers who says, ‘There will be no jobs.’” — Sam Altman (Photograph by Steve Jennings/Getty Images for TechCrunch)

Last week, I read Why we fell for clean eating, a longread by Bee Wilson. At the Cheltenham literary festival, she and dietitian Renee McGregor, who works both with Olympic athletes and eating disorder sufferers, were booed off stage by a crowd of around 300 clean-eating fans.

“We were supposedly taking part in a clean-eating debate with ‘nutritionist’ Madeleine Shaw,” Wilson writes. “When we met on stage in Cheltenham, I asked Shaw why she told people to cut out all bread, and was startled when she denied she had said any such thing (rye bread was her favourite, she added). McGregor asked Shaw what she meant when she wrote that people should try to eat only ‘clean proteins’; meat that was ‘not deep-fried’ was her rather baffling reply. McGregor’s main concern […], she added, was that as a professional treating young people with eating disorders, she had seen first-hand how the rules and restrictions of clean eating often segued into debilitating anorexia or orthorexia.

“‘But I only see the positive,’ said Shaw, now wiping away tears. It was at this point that the audience, who were already restless whenever McGregor or I spoke, descended into outright hostility, shouting and hissing for us to get off stage.”

Thinking about the event on the train home, Wilson realised that the crowd were angry with her and McGregor “not because they disagreed with the details (it’s pretty clear that you can’t have sugar in ‘sugar-free’ recipes), but because they disliked the fact that we were arguing at all. To insist on the facts made us come across as cruelly negative. We had punctured the happy belief-bubble of glowiness that they had come to imbibe from Shaw.”

Shaw’s ‘#eatclean’ fans reacted in much the same way as Sam Altman did when his beliefs were questioned by Jamie Bartlett. “It’s striking that in many of the wellness cookbooks, mainstream scientific evidence on diet is seen as more or less irrelevant,” Wilson writes, “not least because the gurus see the complacency of science as part of what made our diets so bad in the first place.”

Madeleine Shaw, who hopes her website “brings you a world of health and happiness,” at a book signing at Harvey Nichols. (Photograph by Laura Sweetingham)

“Living in bubbles is the natural state of affairs for human beings,” says Derek Thompson in Everybody’s in a Bubble, and That’s a Problem. “People seek out similarities in their marriages, workplaces and peer groups.” This is called ‘homophily’ in sociological terms and, as both Altman and the crowd at Cheltenham show, the implications aren’t always positive.

‘Blind spots’ is a good way to describe the cost of all these bubbles, Thompson suggests. As we surround ourselves with like-minded people, read only what ‘everybody’ else is reading and watch what ‘everybody’ else is watching, we not only fortify our belief system, we also close our eyes and ears to alternative views and opinions — to ‘others.’ We persist in our beliefs while shielding ourselves from alternative realities. Asking questions is easily mistaken for being “anti-progress.” Challenging someone on the lack of scientific evidence for his or her claims is met with jeering and shouting. “I was met by a hugely closed audience who really just wanted to evangelise the benefits,” McGregor wrote shortly after her ‘debate’ at Cheltenham.

“The white and male venture capital community bestows its money on white male founders. Little surprise then that this world is yet another bubble. Capital flows through homophilic networks, from white men to white men, sending a social and financial signal that encourages others to join the in-group.” — Derek Thompson in Everybody’s in a Bubble, and That’s a Problem

What should we do, especially now that every debate, whether about the possible future impact of artificial intelligence or about how ethnically diverse Roman Britain actually was, seems to end in a further closing of ranks? Persistence in our beliefs and a lack of self-doubt prematurely stifle all debate and, as a consequence, real political and social progress.

There are no easy answers, especially because most of us are totally unaware of our biases and blind spots, as leading psychologists Mahzarin Banaji and Anthony Greenwald point out in Blindspot: Hidden Biases of Good People. We all carry them from a lifetime of exposure to cultural attitudes about gender, age, ethnicity, religion, social class and much more. But even if we could see and understand our hidden biases, who are Banaji and Greenwald anyway? Just two experts. So, what would they know?

In Why we no longer trust the experts, Gillian Tett writes that “citizens of the cyber world no longer have much faith in anything that experts say, not just in the political sphere but in numerous others too.” The British campaign to leave the EU — “People in this country have had enough of experts,” the pro-Brexit politician Michael Gove warned — and the anti-vaccination movement are just two of many examples. But what is just as interesting, or maybe even more so, are the areas where trust remains high, says Tett.

“In an annual survey conducted by the Edelman public relations firm, people in 20 countries are asked who they trust. They show rising confidence in the ‘a person like me’ category, and surprisingly high trust in digital technology. We live in a world where we increasingly trust our Facebook friends and the Twitter crowd more than we do the IMF or the prime minister.”

“In some senses, this is good news,” says Tett. “Relying on horizontal axes of trust should mean more democracy and empowerment for ordinary citizens. But the problem of this new world is that people can fall prey to social fads and tribalism — or groupthink.”

According to Leave campaigner and Conservative MP Michael Gove, the British people “have had enough of experts.” (Photograph: Getty)

There are, of course, good reasons for not always trusting experts. First of all, experts aren’t always right. Alan Greenspan, an outspoken opponent of increased regulation of the financial markets, is a prime example of a bright mind thinking linearly within a monotype world view. What this example makes perfectly clear is that experts can be right … until they’re wrong.

Besides, not all experts are actually experts. “People who care about truth and facts are up against a lot of challenges these days, from fake news to filter bubbles,” writes Daniel Levitin. But according to the Founding Dean of Arts and Humanities at the Minerva Schools at KGI, there’s something else we can’t ignore: the rise of pseudo-experts who dominate our airwaves and social networks, offering opinions on subjects they know little to nothing about.

“Expertise in a given field is narrow, and isn’t interchangeable with expertise in another realm. Yet because we tend to respect people who are experts in their field, we can be all too willing to believe them when they step outside their area of knowledge,” Levitin writes. He believes scientists like himself are partly to blame. “When one of our own […] starts making false claims, we don’t stand up and denounce them.” But according to Levitin, “pseudo-expertise is a problem that has to become every individual’s responsibility. Nowhere is this more clear than among the climate-change deniers — almost entirely pseudo-experts — who contradict ample scientific evidence and lend support to devastating public policies.”

“Just as we wouldn’t let an optometrist perform open-heart surgery on us, we shouldn’t be influenced by pseudo-experts who weigh in on issues when they have no business doing so. It’s time to raise the bar on who we are willing to listen to.” — Daniel Levitin in It’s time to stop letting so-called ‘experts’ comment on subjects they know nothing about

But with over 97 percent of climate scientists agreeing that climate change is real and man-made, it shouldn’t be difficult to choose sides, unless you have some vested interest in, say, the coal industry. But it’s not always that clear-cut. Take Google’s so-called “anti-diversity” manifesto. Its author, a Google engineer who has since been fired, claimed that women’s biology makes them less able than men to work in technology jobs. That is not something I want to be true (a judgement). But preferences aside, and more importantly, is it in fact true (a fact)?

“We’ve studied gender and STEM for 25 years. The science doesn’t support the Google memo,” write Caryl Rivers and Rosalind C. Barnett in an article for Recode. According to them, the widely held belief that boys are naturally better than girls at math and science is unravelling among serious scientists. “Evidence is mounting that girls are every bit as competent as boys in these areas. […] Also, several large-scale international testing programs find girls closing the gender gap in math, and in some cases outscoring the boys. Clearly, this huge improvement over a fairly short time period argues against biological explanations.”

But then, Debra Soh, who holds a PhD in sexual neuroscience from York University, writes, “Despite how it’s been portrayed, the memo was fair and factually accurate. Scientific studies have confirmed sex differences in the brain that lead to differences in our interests and behaviour.” According to Soh, differences do exist at the individual level, “[b]ut to claim that there are no differences between the sexes when looking at group averages, or that culture has greater influence than biology, simply isn’t true.”

But regardless of whom or what you believe, nobody can deny that Silicon Valley has a diversity issue. Google’s workforce “is currently composed of 31 percent women, with 20 percent working in technical fields. Those numbers are roughly on par with the tech sector as a whole, where about a quarter of workers are women,” according to Ian Bogost. The numbers for racial and ethnic diversity are even worse, “and so invisible that they barely register as a problem for the anonymous Googler.”

So, whatever your judgement, and we have seen plenty of those, the fact is that the tech industry is dominated by men — mostly white, sometimes Asian. This has profound consequences for the technology that is being conceived and developed, and for the impact it has on our lives. After all, Silicon Valley wants to completely transform the way in which things are done. Excluding the majority of people from thinking about what such a future — their future — could, or even should, look like borders on ‘totalitarianism.’

But Soh has a fair point when she says “we should be more concerned about viewpoint diversity than diversity revolving around gender.” Despite all their diversity efforts, organisations still embrace uniformity. Recently, I discussed this issue with the CEO of a consultancy firm. “But we have highly diverse teams,” he said, “men, women, diverse cultures and backgrounds, a wide range of studies.” “Maybe so,” I replied, “but they all think alike. There is hardly any openness towards other ideas and thoughts. Besides, they’re all so cocksure.”

“[James] Damore was fired, basically, for making a well-meant, if amateurish, attempt at institutional design, based on woefully incomplete information he picked from published research studies. But however imperfect his attempt, he was fired, in short, for thinking on his own. And what example does that set?” — Sabine Hossenfelder in Outraged By the Google Diversity Memo? I Want You to Think About It

Research suggests that visible elements of diversity such as race, gender, sexual orientation, age or religion are really proxies for, and often determinants of, the core thing that actually matters: cognitive diversity or viewpoint diversity, says Christopher Haley in Silicon Valley’s other diversity problem. What is important in many circumstances, according to researchers like Scott E. Page, is that members of a team think differently — leading them to bring different perspectives, try a wider variety of approaches, test a wider range of potential solutions and, in some cases, cancel out each other’s biases more effectively.

According to Alison Reynolds and David Lewis, too, teams solve problems faster when they are more cognitively diverse. “If cognitive diversity is what we need to succeed in dealing with new, uncertain, and complex situations, we need to encourage people to reveal and deploy their different modes of thinking. We need to make it safe to try things multiple ways,” they write in Harvard Business Review, adding that this also means “leaders will have to get much better at building their team’s sense of psychological safety.”

True, but it also has many implications for how and whom we hire. We create cultural barriers that restrict the degree of cognitive diversity, even when we don’t mean to. The familiar saying, ‘We recruit in our own image,’ “doesn’t end with demographic distinctions like race or gender, or with the recruiting process, for that matter,” Reynolds and Lewis argue. “Colleagues gravitate toward the people who think and express themselves in a similar way,” and as a result organisations often end up with like-minded people and teams. Psychologists call this ‘functional bias,’ or low cognitive diversity, and it is exactly what we see in many companies and organisations, and of course on social media. The bubbles we work and live in, which already have what Dave Gray, in his book Liminal Thinking, describes as ‘a self-sealing logic,’ are becoming more and more impenetrable, whether it’s the #eatclean bubble, the utopian Silicon Valley bubble or even the one many experts operate in.

Studies over decades have shown that our existing ideas and beliefs act as a filter, often distorting new information to make it more consistent with our current view of things. This hinders our ability to change our minds when presented with new information. Science also shows that the more strongly held the belief, the more negatively we react when confronted with a challenge to that belief, as illustrated by both Sam Altman and the crowd at Cheltenham. And with the continuous flux of information and opinions all fighting for our attention, the growing importance of social media platforms and the spread of fake news across these platforms, the need to think critically and coherently will only become more pressing.

But it also places a heavy responsibility on organisations, and on society as a whole, to understand and include alternative views and ideas. To understand that black and white are just two opposite ends of a spectrum that contains innumerable shades of grey. That there’s no such thing as the future, but “many other possible roads we could be going down,” as David Byrne writes in Eliminating the Human. And that “the one we are on is not inevitable or the only one; it has been (possibly unconsciously) chosen.” But nevertheless, chosen.

Challenging others is, of course, the easy part. Confronting our own ideas and beliefs, and asking ourselves what they are actually based on, is much more difficult. But nevertheless, necessary. As the American journalist Edward R. Murrow once said, “A great many people think they are thinking when they are really rearranging their prejudices.”

David Byrne is an artist, musician and founding member of Talking Heads. His most recent book is called How Music Works.

“When interaction becomes a strange and unfamiliar thing, then we will have changed who and what we are as a species. Often our rational thinking convinces us that much of our interaction can be reduced to a series of logical decisions — but we are not even aware of many of the layers and subtleties of those interactions. As behavioral economists will tell us, we don’t behave rationally, even though we think we do. And Bayesians will tell us that interaction is how we revise our picture of what is going on and what will happen next.” — David Byrne in Eliminating the Human

Secrets of Silicon Valley is available on BBC iPlayer. If you don’t have access to BBC iPlayer, you can read my transcript here (part 1, The Disruptors) and here (part 2, The Persuasion Machine).

--


Written by Mark Storm

Helping people in leadership positions flourish — with wisdom and clarity of thought