Can a Society Ruled by Complex Computer Algorithms Let New Ideas In?

by Jim Daly

The brilliant filament that so often lights up intellectual debate online seems to have dimmed a bit lately — have you noticed?

OK, I’m kidding. Intelligent point/counterpoint sparring on social media platforms has always been rare, and more often than not it is non-existent. Most people go online looking for affirmation of their old ideas, not new perspectives. Alternative views are met with insult, not inquiry. Righteous indignation seems to be the coin of the online realm, meted out in angry posts, poop emojis, and tweeted zingers.

For author Tim O’Reilly, that online echo chamber is not a bug, it’s a feature, one that’s baked into the code that runs the platform economy. “We are in a world that is ruled by algorithms,” said O’Reilly at a recent Reinvent event marking the publication of his new book WTF? What’s the Future and Why It’s Up to Us. “The fundamental algorithm of Facebook is not to give people fake news but give people what they want [to see and hear].” Each time you “like” a post or perspective, the algorithm is designed to serve up similar items. “The algorithm is a mirror of our own desires,” O’Reilly said.
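To make that concrete, here is a deliberately simplified sketch of how an engagement-first feed can behave. This is not Facebook’s actual ranking code; the topics, posts, and scoring rule are invented for illustration. Every “like” tilts the user’s profile toward what they just saw, and the feed then favors whatever most resembles that profile.

```python
from collections import Counter

def like(profile: Counter, item_topics: list[str]) -> None:
    """Record a like by weighting the user's profile toward the item's topics."""
    profile.update(item_topics)

def rank_feed(profile: Counter, candidates: list[dict]) -> list[dict]:
    """Order candidate posts by topical overlap with everything liked so far."""
    def affinity(item: dict) -> int:
        return sum(profile[t] for t in item["topics"])
    return sorted(candidates, key=affinity, reverse=True)

profile = Counter()
like(profile, ["politics", "outrage"])            # one angry click...
feed = rank_feed(profile, [
    {"id": 1, "topics": ["politics", "outrage"]},  # confirms the profile
    {"id": 2, "topics": ["science", "novelty"]},   # the unfamiliar "new idea"
])
print([post["id"] for post in feed])  # -> [1, 2]: more of the same comes first
```

Even this toy version never volunteers anything unfamiliar; the mirror only reflects.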

The algorithm reinforces a natural human tendency toward confirmation bias, the urge to believe only things that confirm what you already believe to be true, which in the Internet age has a decidedly dark side. In the past decade we’ve seen the rapid rise of the platform economy, wherein companies such as Twitter, Amazon, Etsy, Facebook, and Google have created online structures and communities that enable a wide range of human activities. Platform-based business strategies are driving arguably the most profound global macroeconomic change and value creation since the industrial revolution. Platforms are becoming central to the way our economy works, even as society grows more bifurcated and contentious.

O’Reilly challenges companies to do a better job of ensuring that their architectures let new ideas in. “We get incredible benefits from these platforms, no question,” he said. “But some of the problems they’re creating are going to bite us hard if we don’t get ahead of these things….The old way of thinking about things isn’t working, so [we] have to make it anew. It’s time for us to step up.”

A path to division or unification?

O’Reilly asks a simple question, not just about the platform economy, but about any new or powerful technology: “Will it help us build the kind of society we want to live in?”

On social media platforms, the answer to that question is mixed. Platforms like Twitter and Facebook have done an astonishing job of connecting hearts and minds throughout the world, but they’re also filled with escalating sophistry, falsehoods, and vicious personal attacks that frequently displace intelligent conversation. Concurrently, Twitter and Facebook have also become major players in the political news sphere. In recent weeks, for instance, Facebook acknowledged that before and after last year’s American election, more than 146 million of its users may have seen Russian misinformation on its platform. Google’s YouTube acknowledged hosting more than 1,100 Russian-linked videos, and Twitter said more than 36,000 of its accounts could be tied to Russian operatives.

According to a post-election BuzzFeed analysis by Craig Silverman, “In the final three months of the US presidential campaign, the top-performing fake election news stories on Facebook generated more engagement [such as shares, reactions and comments] than the top stories from major news outlets such as The New York Times, Washington Post, Huffington Post, NBC News and others.”

Instead of bringing intelligence to critical discussions, social media has spread falsehoods and dumbed down debate. A recent cover story in The Economist posed the not-too-hyperbolic question: “Do Social Media Threaten Democracy?” Far from bringing enlightenment, social media is spreading poison, and the code is to blame.

Made to work this way

To understand the root cause of some of these issues, you have to go back almost half a century, to 1970, when the economist Milton Friedman famously argued in the New York Times that the only social responsibility of a corporation is to increase profits. That myopic view, though widely debated even at the time, was taken as scripture by some business titans and ushered in a generation of zero-sum business ethics. It also sparked what O’Reilly calls “the runaway objective function”: maximizing a single value (in this case, corporate profits) no matter the consequences or collateral damage.
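A toy example, again entirely hypothetical, shows how a runaway objective function works: the optimizer is handed one number to maximize, and any cost that never enters that number is invisible to it.

```python
# Invented options; only "profit" is ever measured by the objective.
options = [
    {"name": "outrage-optimized feed", "profit": 9.0, "social_cost": 8.0},
    {"name": "balanced feed",          "profit": 6.0, "social_cost": 1.0},
]

def objective(option: dict) -> float:
    return option["profit"]   # collateral damage never enters the function

best = max(options, key=objective)
print(best["name"])           # -> "outrage-optimized feed"
```

The fix is not a smarter optimizer but a better objective: until the externalities are part of what gets measured, the system will cheerfully trade them away.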

Even today, that rationale lingers. Facebook’s chief response to complaints about its power is often a shrug. Its leaders say they’re simply creating a platform, not a media company, and that their service is designed to draw us together, not pull us apart. Social scientists claim otherwise. A study published in the Proceedings of the National Academy of Sciences notes that social media actually isolates us, creating and facilitating confirmation biases and echo chambers where old, and sometimes erroneous, information is spread, often to toxic effect. Typically, this comes in the form of bogus conspiracy theories and scientific misinformation.

“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest,” the paper reads. “In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters.”
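That homophily finding can be illustrated with a minimal simulation. The population, the narratives, and the sharing rule below are invented for the sketch, not the authors’ model: users reshare a story only when it confirms their own narrative, so each story circulates within one cluster and never reaches the other.

```python
import random

random.seed(0)
NARRATIVES = ["A", "B"]
users = [random.choice(NARRATIVES) for _ in range(1000)]  # each user's narrative

def diffusion(story: str) -> dict:
    """Tally, per cluster, how many users reshare the story."""
    shares = dict.fromkeys(NARRATIVES, 0)
    for narrative in users:
        if narrative == story:       # confirming content gets reshared...
            shares[narrative] += 1
        # ...disconfirming content is simply ignored, never spread
    return shares

print(diffusion("A"))  # roughly {'A': 500, 'B': 0}: a homogeneous, polarized cluster
```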

We are their product

O’Reilly is not the only one concerned about the effect that these narrowing algorithms have on our lives. “Everyone is rightfully focused on Russian manipulation of social media, but as lawmakers it is incumbent on us to ask the broader questions: How did big tech come to control so many aspects of our lives?” asked Senator Al Franken (D-MN) in a speech to a Washington think tank this week. A handful of companies decide what Americans “see, read, and buy,” he said, easing the spread of disinformation. “Accumulating massive troves of information isn’t just a side project for them. It’s their whole business model. We are not their customers, we are their product.”

And the algorithms that drive that product are only as smart as the engineers who create them. In testimony before the Senate Judiciary Committee last week, a flustered Senator Dianne Feinstein (D-CA) asked Google’s attorney Richard Salgado to explain why the Kremlin-backed propaganda network RT (formerly Russia Today) was included in YouTube’s “Preferred” viewing program. “Russia Today qualified, really because of algorithms,” Salgado noted. Inclusion in the program is based simply on popularity, not content, he added.

Sean Parker, the founding president of Facebook, said the algorithms were designed to increase the addictive quality of the platforms. They have, he said, created “a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. That means that we needed to sort of give you a little dopamine hit every once in a while because someone liked or commented on a photo or a post or whatever. You’re exploiting a vulnerability in human psychology … [The inventors] understood this, consciously, and we did it anyway.”
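What Parker describes is, in behavioral terms, a variable-reward schedule. The sketch below is speculative and reflects no platform’s real code; the queue, the release probability, and the message format are all invented. Likes are held back and released at unpredictable moments, which is exactly the pattern that keeps people checking.

```python
import random

class NotificationQueue:
    """Speculative sketch of a variable-reward notification scheme."""

    def __init__(self) -> None:
        self.pending: list[str] = []

    def record_like(self, who: str) -> None:
        self.pending.append(f"{who} liked your post")

    def maybe_notify(self) -> list[str]:
        """On each app open, release the backlog only some of the time."""
        if self.pending and random.random() < 0.4:   # unpredictable payoff
            burst, self.pending = self.pending, []
            return burst                             # the "little dopamine hit"
        return []                                    # nothing yet; check again later

q = NotificationQueue()
q.record_like("alice")
q.record_like("bob")
print(q.maybe_notify())  # sometimes both at once, sometimes an empty list
```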

Creating a better society

At What’s Now: San Francisco, O’Reilly recalled the famous scene in Fantasia in which Mickey Mouse, as the Sorcerer’s Apprentice, is almost destroyed by the wooden brooms he brings to life to do his chores, a metaphor for the current dangers of power outstripping wisdom. Our algorithms should be designed to make us collectively smarter and more understanding, not just wealthier. “This is not just about Google and Facebook and Twitter,” he said. “If we make it just about them we’re really missing the opportunity to see what’s really going on and change the future. We need to rethink what we would do if we were trying to make a better and fairer world and what our policies would be.”

But O’Reilly is a pragmatist, and he understands that there is a tangle of contingencies between conceiving a noble idea and actually making it happen. Even the sunniest tech optimist must admit there is a lot of uncoiled rope on this deck, ready to trip up even the most well-meaning intention. But a human-designed system, he notes, can be fixed by humans. “Markets are not natural phenomena,” he says. “They are the outcomes of a designed system. If we don’t get the expected results from the system we have, let’s rewrite the rules.”

Maybe it’s time to consider more dramatic and comprehensive measures. Some social scientists outside the company would like Facebook to be more open about its goals and guidelines for research on user behavior, so that outsiders can understand how its algorithms work. Others, like Columbia law professor Tim Wu, author of The Attention Merchants: The Epic Scramble to Get Inside Our Heads, advocate converting Facebook into a public benefit or nonprofit company. It’s a far-fetched idea, but one that shows the seriousness of the problem.

O’Reilly didn’t go that far, but did offer this concluding thought: “I think we need to impose serious limitations on platforms and what their obligation is to the rest of society.…We’re going through a lot of trouble and a lot of pain but we have just begun the process of rethinking and rebuilding the world. It’s up to us to make good choices so that the world we build for tomorrow is better than the world we have today.”

This post originally appeared in our Medium publication.