November 25, 2020 Reading Time: 6 minutes

I have spent some time with Tristan Harris lately. Online, of course, and as a passive consumer of his words, delivered to me through popular podcasts and social media recommendations (for example here, here, and here). I’m not the only one, as the new tech documentary The Social Dilemma in which he’s a leading character has been viewed by some tens of millions of people.

An entrepreneur, cofounder of a tech start-up, and deeply worried about the impact of social media technology on our society, Harris worked as “design ethicist” at Google for years. Now he’s the president of the nonprofit Center for Humane Technology. Most of the bios I keep seeing about him include the following particularly apt description: “He’s the closest thing Silicon Valley has to a conscience.”

The film is a scary thing to watch, and it messes with your head quite a bit. It explains the many ways in which social media like Twitter, Snapchat, TikTok, Instagram, YouTube, and particularly Facebook vie for our attention ‒ and the countless ways that they are causing harm and tearing societies apart. Supercomputers and algorithms optimized for attention-grabbing are targeted at my brain to predict exactly what content will most resonate with me ‒ positively or negatively ‒ so that I stay on the platform just a bit longer and keep generating ad revenue for their host companies.

The way Harris described the film in an interview with Alex Kantrowitz was hair-raising and only a little bit hyperbolic: 

The major point of the film is that a business model that is infused in the social communications infrastructure that 3 billion people live by, and are dependent on, is misaligned with the fabric of society and specifically poses a kind of existential threat to democracy and a functioning society.

What’s most frightening to me is the way that users of these platforms are literally shown different worlds. What is going to keep you scrolling is different from what keeps me scrolling, so the hyper-attentive algorithms give us different stuff to engage with, a different impression of “what’s going on.” 

When this happened with personalized ads, I was never bothered (would you rather have good and relevant ads than bad and useless ones?), but when it’s information and news more broadly, I can appreciate Harris’ stark conclusions about the threats to democracy. 

For many years I’ve wondered why it is that other people’s opinions so often stem from a worldview at odds with established, scientific, empirical reality (yes, there are myriad ways in which this process can be biased and led astray as well). To quote the title of a wholly relevant new book by Alison Dagnes: we seem “Super Mad at Everything All the Time,” even about things that are contrary to verifiable reality.

Even the late Hans Rosling wasn’t above blaming this on media producers for so frequently sensationalizing odd-one-out events that we start to think they are an accurate representation of the world. 

Harris and his colleagues featured in the film take it one step further: social media, as opposed to traditional media, makes societal fault lines deeper, feeding us different information in desperate longing for our attention ‒ eyeballs that they can sell to advertisers who, not infrequently, themselves game the system, from nefarious foreign leaders to Cambridge Analytica-style manipulation of our domestic affairs.

It’s a scary new world they present, but I wonder how bad it really is. 

What’s the harm?

We can all point to examples where social media has spread misinformation or distorted lived reality in ways most people accept are unambiguously bad. Those are not examples that those of us skeptical of the film’s stark conclusion must defend; everyone admits that this is bad.

I confess that after a dozen or so hours hearing Tristan (and others) talk seriously about these topics, I’m conflicted. I share many of his concerns, as do most people who have experienced some of the ills raised (attention distraction, misinformation, control, the spread of hatred). Still, grand doom-and-gloom stories require a little bit more evidence.

The movie is called The Social Dilemma precisely because social media brings hard-to-detect harms along with the unequivocal good that we observe and appreciate every day. If how we use this technology were entirely bad, we wouldn’t be conflicted about it.

I’m much more concerned by the power that good art has over me ‒ and even then I’m still not persuaded that it’s altogether damaging. Like many others, I recently fell down a classic YouTube hole, spending a few hours on the platform when I wasn’t planning to. A few days before, I had tentatively begun watching the Netflix show The Queen’s Gambit, starring Anya Taylor-Joy as a phenomenal but troubled chess player in the 1950s and 1960s. Like many of us captivated by universally relatable characters and wonderful shows (now available by the hundreds), I couldn’t stop watching. At five a.m. the next morning, having finally finished the series in one sitting, I snapped out of my addiction.

We might argue that Netflix should make shows that are less interesting or addictive, but that hardly seems reasonable. There’s also the question of my actual opportunity cost: I slept a little less that night, and didn’t read the fiction book I had intended. Big deal. One self-imposed choice of spare time activities replaced with another.

A few days later, through the wonders of algorithms and Big Tech spying on me, YouTube started recommending videos about The Queen’s Gambit ‒ professional chess players who analyzed the various games played in the show, showing how they were renditions of famous games of 20th century chess grandmasters. Amazing! For a few hours in that limbo between dinner and bedtime, I consumed one Queen’s Gambit-analysis after another. Fantastic! A better understanding of the moves and the real-world connections of the fictional games I had just watched. What’s the big harm?

This innocent example actually strengthens Harris’ case: the same technological mechanism ‒ its unstoppable psychological power over me ‒ that spreads good and harmless information has the capacity to spread bad information. Harris points out that these technologies are utopia and dystopia simultaneously. And that’s the point. If YouTube and Netflix could capture hours of my life on a topic ‒ chess ‒ in which my interest is at best lukewarm, how much more could they do if the topic were something that really fueled my priors? Or the us-against-them and superficially appealing arguments of, say, flat earthers, anti-vaxxers, or those nefariously spewing hatred for this or that group?

It’s not the technology itself presenting an existential threat to society; it’s that the technology enables the worst ideas in society to proliferate in a way that we’re wholly unequipped to handle.

A way out?

A number of the scary ills presented in the documentary come from Greg Lukianoff and Jonathan Haidt’s work in The Coddling of the American Mind, and from some more recent work by Haidt and co-authors on the impact of social media (especially on young girls, in the form of depression, anxiety, and self-harm). This is persuasive. People who haven’t (yet) developed the mental and emotional tools to critically evaluate information, put it in context, or overcome negativity bias are in much greater danger, emotionally and intellectually.

But for the rest of us, I’m less convinced. Take these simple countermeasures, for instance: 

  • Almost everyone I know uses ad blockers.
  • Many people dim the colors of their screens to make their phones seem less appealing.
  • We turn off notifications, depriving these social media companies of their prime mechanism for being in your face all the time (at the end of the documentary, nearly all of those interviewed recommend this action).
  • Those most concerned get off Facebook entirely, delete most of their social media accounts, and use only IMs and group chats to communicate with their friends.

If these companies truly had irresistible power over our psyches, none of these actions would be possible. But they are, and lots of people use them to regulate their social media use.

Especially in the interview on the Making Sense podcast, Harris repeatedly argued that this time is different from past informationally disruptive technologies (radio, television, newspapers, 24/7 media coverage). The scale on which social media operates is different, and the potential impacts are much larger. I can even buy his argument that the supercomputer at the other end of the line is always going to win out over me and my paltry humanity.

But really? I deleted Facebook years ago and don’t miss it ‒ what is the supercomputer going to do about that? I long ago turned off notifications for basically everything but banking transactions ‒ what is the supercomputer going to do now?

What critics refer to when they say “This time is the same” isn’t that television, newspapers, or cars revolutionized our societies in the exact same way that social media are doing, but that we found ways to deal with them. Live with them. Constrain them.

If that’s the core message of The Social Dilemma, I’m all for it. And that careful warning makes a whole lot more sense than the almost dystopian catastrophism of the “I see no way out of this” mentality featured in many of the conversations.

Even if Harris doesn’t seem to think so, we can learn to live with these new technologies.

Joakim Book

Joakim Book is a writer, researcher, and editor on all things money, finance, and financial history. He holds a master’s degree from the University of Oxford and was a visiting scholar at the American Institute for Economic Research in 2018 and 2019.

His work has been featured in the Financial Times, FT Alphaville, Neue Zürcher Zeitung, Svenska Dagbladet, Zero Hedge, The Property Chronicle, and many other outlets. He is a regular contributor to and co-founder of the Swedish liberty site, and a frequent writer at CapX and Notes On Liberty.