Random finds (2017, week 11) — On inequality as a feature (not a bug), why facts alone won’t change your mind, and calling bullshit

“I have gathered a posy of other men’s flowers, and nothing but the thread that binds them is mine own.” — Michel de Montaigne

Random finds is a weekly curation of my tweets and a reflection of my curiosity.

Inequality as a feature (not a bug)

“The Silicon Valley economy has caused massive disruption of traditional business and business models — in the process, making a relatively small cadre of brilliant engineers staggeringly wealthy,” Gregory Ferenstein writes in an article for City Journal, titled The Disrupters. “These dislocations, while profound, have been reasonably manageable. But in the years to come, a vast new range of technological innovation — from self-driving cars to robots — may make the disruptions we have seen so far look tame. In this coming world, driven by innovation and powered by individual brilliance, what role will ‘normal’ employees and small-business owners have?” he wonders.

Illustration by Walter Vasconcelos (City Journal)

Ferenstein sought to learn how the tech elite would answer these and other questions, and how they think more broadly, by polling dozens of start-up founders and conducting interviews with a handful of notable billionaires. By doing so, Ferenstein hoped to discover the long-term economic vision of tech leaders, who are beginning to take on a broader role as public leaders.

“As far as the future of innovation and its impact on ordinary people, the most common answer I received in Silicon Valley was this: over the (very) long run, an increasingly greater share of economic wealth will be generated by a smaller slice of very talented or original people. Everyone else will increasingly subsist on some combination of part-time entrepreneurial ‘gig work’ and government aid. The way the Valley elite see it, everyone can try to be an entrepreneur; some small percentage will achieve wild success and create enough wealth that others can live comfortably. Many tech leaders appear optimistic that this type of economy will provide the vast majority of people with unprecedented prosperity and leisure, though no one quite knows when.”

When Paul Graham, one of Silicon Valley’s living legends, said it’s the job of tech to create inequality, Ferenstein wanted to know whether Graham’s view was representative. “In an economy where income was perfectly allocated by how much wealth each worker contributes, would this world be very equal or unequal?” he asked. The answer he got was that a meritocracy inherently leads to an unequal world. Ferenstein subsequently asked what percentage of wealth would be held by the richest people in a perfect meritocracy. Roughly eight of 12 respondents said that 50 percent or more of all income would go to the top 10 percent. This is an exceedingly common worldview in Silicon Valley, where tech executives often praise so-called ‘10xer engineers’ — an elite class of worker ten times more productive than average workers.

“Understanding these frank beliefs on inequality is an important step in placing Silicon Valley’s common policy solutions in context with its goals,” Ferenstein writes. “For instance, tech founders express broad support for increased ‘equality of opportunity’ for every American. When it comes to giving people of all backgrounds a better shot […], tech founders sound supportive. For instance, in response to a lack of ethnic and gender diversity at tech companies (all of which are about 80 percent male and about 60 percent Asian or white), the industry has directed cash toward programs that teach coding to underprivileged communities.”

One of them is Code.org. But when you ask its donors whether Code.org, or other diversity initiatives, will reverse the overall trend of growing economic inequality, they demur. “It’s not that everyone will have the skills for a high-performing job,” LinkedIn cofounder Reid Hoffman told Ferenstein. Instead, he claimed that mass computer literacy would help many more people make meaningful inventions that would benefit their companies and communities. But to Ferenstein, the upshot was clear: “most of Silicon Valley’s charitable initiatives, especially education, engineer a world of vast inequality. This fact is foundational to the high-skilled world. To paraphrase a common saying in the Valley: inequality is a feature, not a bug.”

Illustration by Walter Vasconcelos (City Journal)

What Ferenstein also discovered through his survey was that “Silicon Valley represents an entirely new political category: not quite liberal and not quite libertarian. They make a fascinating mix of collectivists and avid capitalists.”

“On the capitalistic side,” Ferenstein says, “tech founders were extraordinarily optimistic about the nature of change, especially the kind of unpredictable creative destruction associated with free markets. […] The tech industry’s obsession with innovation is, at its core, a belief that the future gets better. Change is evolutionary. The more things change, the more companies fail — and, alas, the more people get fired — the more we learn how to do things better.” They also believe that the government should be run like a business. Using traditional American political categories, this would land them in the Republican camp.

In Silicon Valley, unearthing the latent talent of each individual is the top priority. For technologists, this means making tools that enable people to create new ideas and distribute them. For the state, this means a role as an investor, rather than as a regulator. Instead of stifling capitalism, the state accelerates the promise of capitalism by heavily funding education, welcoming high-skilled immigrants, and paying for breakthrough scientific research.

“What is government’s role as it concerns those who can’t be entrepreneurs?” Ferenstein wonders. “As the Valley elite see it, it’s to tax the wealthy — and give everyone else lots of cash. […] A no-strings-attached mass cash transfer [or universal basic income] will ensure that no matter what happens in the future, everyone will have a reasonable income.” According to Y Combinator president Sam Altman, we shouldn’t try to regulate our way to stopping the inevitable rise of inequality but instead raise the quality of life for everyone. Basic income will allow many more people to contribute something unique to the world.

“I think there are new novelists who are right now driving for Uber who could contribute more to the sum output of humanity; there are great artists, there are people who just have new ideas about how to build communities that make people happy, that have nothing to do with tech or start-ups at all but are currently not able to do what they want to do.” — Sam Altman, president of Y Combinator

Technology leaders and entrepreneurs have emerged as some of the most prominent figures in American life, and Silicon Valley has become a byword for innovation, brilliance, and futuristic thinking. But the broad philosophy and long-term economic vision of Silicon Valley remain little understood. “As tech leaders move to the forefront of economic and public policy, Americans [or rather, ‘we’] should understand better how they think — especially since their goals and conclusions tend to separate them from earlier generations of business leaders. In many respects, their vision represents something new. Whether it is something that Americans [and again, ‘we’] will embrace and support remains to be seen.”

Why facts alone won’t change your mind

In her highly recommended long read, titled This Article Won’t Change Your Mind, Julie Beck, a senior associate editor at The Atlantic, where she covers health and psychology, explores why facts alone aren’t enough to fight false beliefs. A few quotes …

“Still, all manner of falsehoods — conspiracy theories, hoaxes, propaganda, and plain old mistakes — do pose a threat to truth when they spread like fungus through communities and take root in people’s minds. But the inherent contradiction of false knowledge is that only those on the outside can tell that it’s false. It’s hard for facts to fight it because to the person who holds it, it feels like truth.”

Illustration by John Garrison for The Atlantic.

“At first glance, it’s hard to see why evolution would have let humans stay resistant to facts. ‘You don’t want to be a denialist and say, “Oh, that’s not a tiger, why should I believe that’s a tiger?” because you could get eaten,’ says [Lee] McIntyre, a research fellow at the Center for Philosophy and History of Science at Boston University.

But from an evolutionary perspective, there are more important things than truth. Take the same scenario McIntyre mentioned and flip it on its head — you hear a growl in the bushes that sounds remarkably tiger-like. The safest thing to do is probably high-tail it out of there, even if it turns out it was just your buddy messing with you. Survival is more important than truth.

And of course, truth gets more complicated when it’s a matter of more than just ‘Am I about to be eaten or not?’ As Pascal Boyer, an anthropologist and psychologist at Washington University in St. Louis points out in his forthcoming book The Most Natural Thing: How Evolution Explains Human Societies: ‘The natural environment of human beings, like the sea for dolphins or the ice for polar bears, is information provided by others, without which they could not forage, hunt, choose mates, or build tools. Without communication, no survival for humans.’

In this environment, people with good information are valued. But expertise comes at a cost — it requires time and work. If you can get people to believe you’re a good source without actually being one, you get the benefits without having to put in the work. Liars prosper, in other words, if people believe them. So some researchers have suggested motivated reasoning may have developed as a ‘shield against manipulation.’ A tendency to stick with what they already believe could help protect people from being taken in by every huckster with a convincing tale who comes along.


Part of the problem is that society has advanced to the point that believing what’s true often means accepting things you don’t have any firsthand experience of and that you may not completely understand. Sometimes it means disbelieving your own senses — Earth doesn’t feel like it’s moving, after all, and you can’t see climate change out your window.

In areas where you lack expertise, you have to rely on trust. […] The problem is that who and what people trust to give them reliable information is also tribal. Deferring to experts might seem like a good start, but [Yale’s professor of law and psychology] Dan Kahan has found that people see experts who agree with them as more legitimate than experts who don’t.”

Illustration by John Garrison for The Atlantic.

“So much of how people view the world has nothing to do with facts. That doesn’t mean truth is doomed, or even that people can’t change their minds. But what all this does seem to suggest is that, no matter how strong the evidence is, there’s little chance of it changing someone’s mind if they really don’t want to believe what it says. They have to change their own.” This, Beck writes, is more likely to happen in group interactions, because groups are usually better at coming up with the correct answers to reasoning tasks than individuals are. However, the wisdom of groups is probably diminished if everyone in a group already agrees with each other.

“One real advantage of group reasoning is that you get critical feedback,” McIntyre says. “If you’re in a silo, you don’t get critical feedback, you just get applause.” But if the changes are going to happen at all, it’ll have to be on a person-to-person level. Even then, when someone does change their mind, it will be gradually and slowly — “sort of like death by a thousand cuts.”

A bit more …

Tired of alternative facts, fake news, and breathless hyperbole, Jevin West and Carl Bergstrom, two professors at the University of Washington, are trying to strike a blow for science with a new course: Calling Bullshit in the Age of Big Data.

The class website and colorful syllabus went online last month and almost instantly went viral. “We just struck a nerve,” said West, an assistant professor in UW’s Information School.

“The world is awash in bullshit,” West and Bergstrom write. “Politicians are unconstrained by facts. Science is conducted by press release. Higher education rewards bullshit over analytic thought. Startup culture elevates bullshit to high art. Advertisers wink conspiratorially and invite us to join them in seeing through all the bullshit — and take advantage of our lowered guard to bombard us with bullshit of the second order. The majority of administrative activity, whether in private business or the public sphere, seems to be little more than a sophisticated exercise in the combinatorial reassembly of bullshit.

We’re sick of it. It’s time to do something, and as educators, one constructive thing we know how to do is to teach people. So, the aim of this course is to help students navigate the bullshit-rich modern environment by identifying bullshit, seeing through it, and combating it with effective analysis and argument.”

What exactly is bullshit anyway?

Surprising as it may seem, there has been considerable scholarly discussion about this exact question. Unsurprisingly given that scholars like to discuss it, opinions differ.

As a first approximation, we subscribe to the following definition:

‘Bullshit’ is language, statistical figures, data graphics, and other forms of presentation intended to persuade by impressing and overwhelming a reader or listener, with a blatant disregard for truth and logical coherence.

It’s an open question whether the term bullshit also refers to false claims that arise from innocent mistakes. Whether or not that usage is appropriate, we feel that the verb phrase calling bullshit definitely applies to falsehoods irrespective of the intentions of the author or speaker. Some of the examples treated in our case studies fall into this domain. Even if not bullshit sensu stricto, we can nonetheless call bullshit on them.

In this course, we focus on bullshit as it often appears in the natural and social sciences: in the form of misleading models and data that drive erroneous conclusions.

In The body is the missing link for truly intelligent machines, Ben Medlock, co-founder of SwiftKey, a mobile app that uses predictive technology to adapt to the way that users type, writes, “it’s a bit of a leap to go from smart, self-organising cells to the brainy sort of intelligence that concerns us here. But the point is that long before we were conscious, thinking beings, our cells were reading data from the environment and working together to mould us into robust, self-sustaining agents. What we take as intelligence, then, is not simply about using symbols to represent the world as it objectively is. Rather, we only have the world as it is revealed to us, which is rooted in our evolved, embodied needs as an organism. Nature ‘has built the apparatus of rationality not just on top of the apparatus of biological regulation, but also from it and with it’, wrote the neuroscientist Antonio Damasio in Descartes’ Error (1994), his seminal book on cognition. In other words, we think with our whole body, not just with the brain.”

Detail from Patroclus by Jacques-Louis David, 1780.

“I suspect that this basic imperative of bodily survival in an uncertain world is the basis of the flexibility and power of human intelligence. But few AI researchers have really embraced the implications of these insights. The motivating drive of most AI algorithms is to infer patterns from vast sets of training data — so it might require millions or even billions of individual cat photos to gain a high degree of accuracy in recognising cats. By contrast, thanks to our needs as an organism, human beings carry with them extraordinarily rich models of the body in its broader environment. We draw on experiences and expectations to predict likely outcomes from a relatively small number of observed samples. So when a human thinks about a cat, she can probably picture the way it moves, hear the sound of purring, feel the impending scratch from an unsheathed claw. She has a rich store of sensory information at her disposal to understand the idea of a ‘cat’, and other related concepts that might help her interact with such a creature.

This means that when a human approaches a new problem, most of the hard work has already been done. In ways that we’re only just beginning to understand, our body and brain, from the cellular level upwards, have already built a model of the world that we can apply almost instantly to a wide array of challenges. But for an AI algorithm, the process begins from scratch each time. There is an active and important line of research, known as ‘inductive transfer’, focused on using prior machine-learned knowledge to inform new solutions. However, as things stand, it’s questionable whether this approach will be able to capture anything like the richness of our own bodily models.”

“According to this conception, high intelligence is essential to creative genius, but only insofar as it collaborates with cognitive disinhibition. Exceptional intelligence alone yields useful but unoriginal and unsurprising ideas. Marilyn vos Savant made it into the Guinness Book of Records for the world’s highest recorded IQ, and yet has not managed to find a cure for cancer or even build a better mousetrap.” — Dean Keith Simonton in If You Think You’re a Genius, You’re Crazy

Human beings are most creative when we get time by ourselves and then time with one another, Judah Pollack and Olivia Fox Cabane write in an article for Fast Company.

“The way to maximize creative potential is to flow between being alone and being in a group, and back again. When you’re alone, you’re essentially building a woodpile in your brain. Then, when you join a group, you’re igniting a shower of sparks that might light it up. Of course, you sometimes need to go be alone again in order to let the sparks you’ve started generating get close enough to the wood.”

“Paul Paulus at the University of Texas at Arlington ran an experiment to test how this process works. First, he had a group spend 10 minutes writing down ideas. Then they went and built off those ideas individually. Then Paulus reversed the conditions: People wrote ideas down alone and then brought them to the group. The group that worked under the first condition produced 37% more ideas than the second.

But then Paulus tried it a third way: He had another group spend eight minutes writing down ideas individually, then come together to share their ideas for three minutes, then go back to being alone, only to rejoin the group a second time — four steps: alone, together, alone, together. This group produced 71% more ideas per person. Alternating between solo time and collaboration seemed to encourage more creativity than either approach exclusively — very likely because that’s how our brains are built.

How can you put this into practice? Try brainstorming like this:

  1. Grab some large sticky notes and have everyone write down their ideas, one per note, for 10 minutes.
  2. Have them put their ideas on the wall, and everyone gets three minutes to look them over.
  3. When time’s up, everyone goes back and writes new ideas for five more minutes.
  4. The stickies go up, and everyone looks at them for two minutes.
  5. Then everyone goes back to being alone and writes out new ideas for just 90 seconds.
  6. The stickies go up one last time, and everyone looks at them for a final five minutes.
  7. Discuss.

You’ll be done in half an hour.”
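For anyone who wants to run this format against the clock, the timed steps above can be encoded as a simple schedule. This is a minimal Python sketch; the phase labels and variable names are mine, not from the article:

```python
# The alternating alone/together brainstorming schedule, as
# (phase description, minutes) pairs.
schedule = [
    ("alone: write ideas on sticky notes", 10),
    ("together: review the wall", 3),
    ("alone: write new ideas", 5),
    ("together: review the wall", 2),
    ("alone: write new ideas", 1.5),
    ("together: final review, then discuss", 5),
]

# Total structured time; the remaining minutes of the half hour
# are left for open discussion.
total = sum(minutes for _, minutes in schedule)
print(f"Structured time: {total} minutes")  # Structured time: 26.5 minutes
```

Note how the timed phases sum to 26.5 minutes, consistent with the article’s claim that the whole exercise fits in half an hour with time left to discuss.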

In Making Athens Great Again, Rebecca Newberger Goldstein explores what happens when a society, once a model for enlightened progress, threatens to backslide into intolerance and irrationality — with the complicity of many of its own citizens. How should that society’s stunned and disoriented members respond? Do they engage in kind, resist, withdraw, even depart? It’s a dilemma as old as democracy itself.

“Twenty-four centuries ago, Athens was upended by the outcome of a vote that is worth revisiting today. A war-weary citizenry, raised on democratic exceptionalism but disillusioned by its leaders, wanted to feel great again — a recipe for unease and raw vindictiveness, then as now. The populace had no strongman to turn to, ready with promises that the polis would soon be winning, winning like never before. But hanging around the agora, volubly engaging residents of every rank, was someone to turn on: Socrates, whose provocative questioning of the city-state’s sense of moral superiority no longer seemed as entertaining as it had in more secure times. Athenians were in no mood to have their views shaken up. They had lost patience with the lively, discomfiting debates sparked by the old man. In 399 b.c., accused of impiety and corrupting the young, Socrates stood trial before a jury of his peers — one of the great pillars of Athenian democracy. That spring day, the 501 citizen-jurors did not do the institution proud. More of them voted that Socrates should die than voted him guilty in the first place.”

Illustration by Owen Davey for Aeon Magazine.

“It’s all too easy to imagine, at this moment in American history, the degree of revulsion and despair Plato must have felt at the verdict rendered by his fellow Athenians on his beloved mentor. How could Plato, grieving over the loss of the ‘best man of his time,’ continue to live among the people who had betrayed reason, justice, open-mindedness, goodwill — indeed, every value he upheld? From his perspective, that was the enormity Athenians had committed when they let themselves be swayed by the outrageous lies of Socrates’s enemies. Did truth count for nothing?”

Canadian studio Omar Gandhi Architect has created a remote vacation home in Nova Scotia for an urban couple, with floor-to-ceiling windows that provide sweeping views of the sea. The residence, called Lookout at Broad Cove Marsh, is located on Cape Breton Island, on the outskirts of the coastal village of Inverness. (source: dezeen)

“Knowledge consists in the search for truth. It is not the search for certainty.” — Karl Popper (In Search of a Better World: Karl Popper on Truth vs. Certainty and the Dangers of Relativism, by Maria Popova, Brainpickings)
