Random finds (2016, week 37) — On making peace with complexity, ethics for AI, and the superiority of print

Mark Storm
12 min read · Sep 16, 2016


Abandoned, by Christian Richter.

Every Friday, I run through my tweets to select a few observations and insights that have kept me thinking.

On making peace with complexity

In Like It Or Not, Complexity Is Something We Can No Longer Ignore, Greg Satell writes about Sam Arbesman’s new book, Overcomplicated.

As Sam Arbesman points out, complexity in our modern world is all but unavoidable. It is not just that we don’t fully understand the devices we use every day, the markets that drive our commerce and the body of laws that govern our activity; the experts don’t fully understand them either. We need to find a way to make peace with complexity.

Overcomplicated, by Sam Arbesman.

“If you want to rail against any facet of the modern world, simply point to its complexity,” Satell writes. “Politicians are fond of holding up pieces of legislation and pointing to the thousands of pages they contain, because that kind of complexity is widely seen as a fatal flaw. After all, if it was thought through clearly, why couldn’t it have been devised more simply? Yet while we yearn for simple rules, those rules often lead us astray. As Ludwig Wittgenstein put it in his famous rule-following paradox, ‘no course of action could be determined by a rule because every course of action can be made out to accord with the rule.’ Simple rules tend to be ineffective because they are necessarily vague.”

Something similar happens when we try to tame complexity by summarizing it through identifying patterns. The problem with these rules and patterns, as Arbesman makes clear, is “that there will always be ‘edge cases’ that don’t fit. Sometimes these are evidence of a false pattern, but other times they are merely odd ducks that point the way to a more expansive rule. Much as Kurt Gödel pointed out long ago, we can make our systems consistent or complete, but not both.”

Complexity is, to a large extent, unavoidable, and Arbesman gives us two reasons why. “The first is accretion. We build initial systems […]. Yet to get those systems to scale, we need to build on top of them to expand their initial capabilities. As the system gets larger, it gets more complex. The second force that leads to complexity is interaction. We may love the simplicity of our iPhones, but we don’t want to be restricted to its capabilities alone. So we increase its functionality by connecting it to millions of apps. Those apps, in turn, connect to each other as well as to other systems.”

Instead of yearning for a simpler, tamer world, Arbesman suggests “we should take proud pleasure in the complexity we uncover with our creations, much as we would with a precocious child. Like Camus’ Sisyphus, we need to take pleasure in the struggle, knowing that it is within that struggle that we find true purpose.”

“Complexity brings the unexpected, but we realize it only when something goes wrong.” — Samuel Arbesman

Adrienne LaFrance, a staff writer at The Atlantic who covers technology, also wrote about Overcomplicated.

“Here we are in an era in which prevailing cultural attitudes toward technology are deeply at odds with how that technology actually behaves,” she writes in The Age of Entanglement. “While people marvel or sigh at computing systems with a mix of reverence and fear, they fail to appreciate that technology’s messy imperfections are both inevitable and, to some extent, comprehensible. At the same time, we’re being forced to confront a kind of radical novelty in technology, a seemingly inexorable push toward complexity that the theoretical physicist and computer scientist Edsger Dijkstra once described as ‘conceptual hierarchies that are much deeper than a single mind ever needed to face before.’ That was in 1988. Three decades later, the technological world is far more intricate still. As a result, almost everything humans do in the technological realm, Arbesman writes, ‘seems to lead us away from elegance and understandability, and toward impenetrable complexity and unexpectedness.’ We’re living, he says, in an age of Entanglement.”

Entanglement, by Randy Walker (Scottsdale Public Art, picture: Dayvid LeMmon).

“Most people think about understanding as a binary condition,” Arbesman told LaFrance in an interview. “Either you understand things completely or not at all.” That viewpoint is dangerous when it’s applied to technology today, because there’s simply no way to understand everything. (Or, as Arbesman puts it in his book: “The vast majority of computer programs will never be thoroughly comprehended by any human being.”) Instead, he argues, people should be acting as technological naturalists, approaching complex digital systems the way a biologist would examine living systems. Doing so will require people to rethink what it means to understand technology, and at what scale:

“When attempting to understand a complex system,” says Arbesman, “we must determine the proper resolution, or level of detail, at which to look at it. How fine-grained a level of detail are we focusing on? Do we focus on the individual enzyme molecules in a cell of a large organism, or do we focus on the organs and blood vessels? Do we focus on the binary signals winging their way through circuitry, or do we examine the overall shape and function of a computer program? At a larger scale, do we look at the general properties of a computer network, and ignore the individual machines and decisions that make up this structure?”

None of these questions has a straightforward answer, and yet the tendency not to pose them at all has left humans in a perilous and vulnerable place.

LaFrance: “Arbesman is not saying you need to dismantle your iPhone and build it from scratch, or only use apps that you created yourself. (Although, hey, if that’s your thing, great.) But he is saying that active curiosity — and a certain degree of futzing with the technological systems we encounter — is culturally overdue. ‘The need to have a more calm, tinkering approach to technologies is going to be very, very important,’ he said. ‘I think also the idea that we need to build in the understanding from the outside that our systems are going to be buggy. Especially now, if people realize more explicitly that we’re in this new age of incomprehensibility. That abdication of responsibility is too easy, to say, “Oh, I don’t know what I’m doing. Everything is magical and I don’t understand it.”’”

Further reading: The Enlightenment is Dead, Long Live the Entanglement, by Danny Hillis for the Journal of Design and Science.

On ethics for AI

In last week’s Random finds, I wrote about Sheila Jasanoff, Pforzheimer Professor of Science and Technology Studies at the Harvard Kennedy School, and the author of The Ethics of Invention, a book that delves into how we should approach a world increasingly governed by tech. In an interview with Mother Jones’ Kanyakrit Vongkiatkajorn, Jasanoff explains why she believes we don’t sufficiently acknowledge how much power we’ve handed over to technology, which, she writes, “rules us as much as laws do.” What we need, she says, is far more reflection on the role that tech plays in our lives now and what role we want it to play in the future. One of her hopes in writing this book was to explore the “need to strengthen our deliberative institutions to rethink things,” so that “people will recognize that this is a democratic obligation every bit as much as elections.” She sees that putting “a technology into place [is] like putting a political leader into place, and we should take political responsibility for the consequences of technology in the same way that we at least try to take political responsibility for who we elect.”

How much power we’ve handed over became apparent last week, when Facebook U-turned on its decision to remove the iconic Vietnam war photo featuring a naked girl — the nine-year-old Kim Phúc running away from a napalm attack — after global outcry and accusations of ‘abusing power.’ Facebook initially defended its decision to remove the image, saying: “While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.” The controversy erupted at a time of increasing tensions between media organizations and Facebook, the site where 44% of US adults get their news.

1972 Napalm Girl Photo, Photographer Nick Ut (AP Images).

Phúc, who now lives in Canada with her husband and two children, piled on further pressure with her own powerful statement, saying: “I’m saddened by those who would focus on the nudity in the historic picture rather than the powerful message it conveys. I fully support the documentary image taken by Nick Ut as a moment of truth that captures the horror of war and its effects on innocent victims.”

(source: Facebook backs down from ‘napalm girl’ censorship and reinstates photo, by Sam Levin, Julia Carrie Wong and Luke Harding, The Guardian)

In Artificial intelligence is hard to see, Kate Crawford, who researches the social impacts of machine learning, AI, and large scale data, writes:

“The core issue here isn’t that AI is worse than the existing human-led processes that serve to make predictions and assign rankings. Indeed, there’s much hope that AI can be used to provide more objective assessments than humans, reducing bias and leading to better outcomes. The key concern is that AI systems are being integrated into key social institutions, even though their accuracy, and their social and economic effects, have not been rigorously studied or validated.

There needs to be a strong research field that measures and assesses the social and economic effects of current AI systems, in order to strengthen AI’s positive impacts and mitigate its risks. By measuring the impacts of these technologies, we can strengthen the design and development of AI, assist public and private actors in ensuring their systems are reliable and accountable, and reduce the possibility of errors. By building an empirical understanding of how AI functions on the ground, we can establish evidence-led models for responsible and ethical deployment, and ensure the healthy growth of the AI field.

If the social impacts of artificial intelligence are hard to see, it is critical to find rigorous ways to make them more visible and accountable. We need new tools to allow us to know how and when automated decisions are materially affecting our lives — and, if necessary, to contest them.”

Further reading: How Tech Giants Are Devising Real Ethics for Artificial Intelligence, by John Markoff, The New York Times.

A bit more …

“Each time my newspaper delivery runs late, as it did last Saturday morning, and I’m forced to the Web for my early dose of news, I’m reminded how reading the news online pales compared to reading it in newsprint,” Jack Shafer, an online journalist for 20 years, writes in Why Print News Still Rules on Politico Magazine.

“[…] when it comes to immersion — when I really want the four winds of news to blow me deeper comprehension — my devotion to newsprint is almost cultistic. My eyes feel about news the way my ears feel about music driven from a broken pair of speakers — distorted, grating, and insufferable. Reading online, I comprehend less and I finish fewer articles than I do when I have a newspaper in hand. Online, I often forget why I clicked a page in the first place and start clicking on outside links until I’m tumbling through cyberspace like a marooned astronaut.”

Polish newspaper designer Jacek Utko asks whether good design can save the newspaper. It just might.

“What accounts for print’s superiority? Print — particularly the newspaper — is an amazingly sophisticated technology for showing you what’s important, and showing you a lot of it. The newspaper has refined its user interface for more than two centuries. Incorporated into your daily newspaper’s architecture are the findings from field research conducted in thousands of newspapers over hundreds of millions of editions. Newspaper designers have created a universal grammar of headline size, typeface, place, letter spacing, white space, sections, photography, and illustration that gives readers subtle clues on what and how to read to satisfy their news needs.”

“Thinking about play points us instead towards our inner resources that are equally as important — those exhibiting cognitive flexibility and surplus, rather than limits. Play is the source of that ‘influx of mind into matter’ (as Johan Huizinga put it in Homo Ludens), creating a world we should expect to be able to engage with, rather than simply be bamboozled by,” says Pat Kane, who is curating the Play theme of this year’s FutureFest, in The future’s in play on The Long + Short.

Play will be to the 21st century what work was to the industrial era.

“We live in an era where explanations about human nature framed in terms of evolution have often encouraged a downbeat, self-doubting sense of agency. Play science restores to us a basic confidence that we can pursue and excel at the arts of living, in a complex world.”

In The Experience of Resonance: How To Create Vibrant Conversations, Sébastien Paquet explores what it is that makes us deem a conversation we have with someone to learn about something successful and worthwhile. “Having learned something is certainly part of the equation,” he writes, “but another aspect intuitively stands out: the feeling of connection, of there being a flow of current between ourselves and the other person. Call it resonance or vibrancy or flow, everyone has experienced this feeling.”

“Beyond what each conversation partner brings, there are elements of the experience they create together that impact whether resonance is felt or not. First, the setting and atmosphere impact resonance. Just compare the feel of a party to that of an office. An informal setting, with a playful and accepting atmosphere, without time pressure or anxiety, helps us relax into the now.

Second, resonance happens more easily if intentions are clear and some agreements are found. These things are not necessarily discussed explicitly, but they are felt. For instance, partners may sense from each other a permission to ‘tell the whole story,’ by tone, encouraging glances, or trigger words. A feeling of safety helps with all the factors described in the previous section.

Third, flexibility in the form of the conversation, including the ability to switch roles around between giver and taker, which makes people feel like peers, and the latitude to experience the conversation more as a dance than a tennis match, helps create good flow.

Fourth, specific conversational events seem to have a particular importance. Laughing together helps: humor has a way of revealing a map of who a person is and how they place themselves in the world. It creates a protective bubble around the conversation. And one should not neglect the effect of Eureka moments. There’s a kind of delight that we experience when we actually learn something. I call it the ‘learning high,’ and, like laughter, it has a kind of relaxing and opening effect.”

Are we turning our backs on the humanities? The declining enrollment in disciplines including history, literature, language, philosophy and the arts at colleges and universities across the country signals a significant cultural shift. In an episode of Aspen Ideas to Go, Leon Wieseltier, contributing editor for The Atlantic, and Harvard President Drew Gilpin Faust unpack why the diminished appeal of the humanities has huge cultural implications. Can this trend be reversed in a challenging age, when technology and quantification are highly revered?

In Why Startups Need Philosophers, Mike Sturm argues that philosophical thinking rests on two core principles. First, there are no boundaries or limits to what can be explored, questioned, and theorized about — nor are there limits to how. Second, each new idea is addressed on its own terms and held up to rigorous questioning.

It seems that those who do philosophy well have a leg up on innovation.

“Does this ring a bell to those in the business environment?” Sturm asks. “It should; it’s practically a description of the ideal brainstorming practice. Brainstorming is embraced by so many businesses because it leads to innovation (when done correctly). Innovation is more highly valued now in the business world (or at least more talked about) than ever before, so if philosophy breeds the kind of thinking that leads to innovation, it seems that those who do philosophy well have a leg up on innovation.”

German photographer Christian Richter has been breaking into more than 1,000 abandoned buildings across Europe, aided by a network of friends who suggest new places for him to visit, to capture their ‘swan song’ for his Abandoned series. The series began in 2011 and continues to evolve. In an exclusive photo essay for Dezeen, Richter explains the process behind his photographs.

Picture by Christian Richter.

He began taking amateur pictures of these structures after a friend gave him a small digital camera as a gift, which eventually led to a career as a fine art photographer. “Abandoned photography is my ongoing project and now I travel around Europe looking for abandoned buildings,” he told Dezeen. “I adore old decaying architecture, the patterns and textures — they remind me that everything is impermanent.”

Picture by Christian Richter.
Picture by Christian Richter.
Picture by Christian Richter.

“These are such good questions, perhaps we shouldn’t spoil them with answers.” — John Cage


Written by Mark Storm

Helping people in leadership positions flourish — with wisdom and clarity of thought
