By Vanessa Bates Ramirez

Dramatic political polarization. Rising anxiety and depression. An uptick in teen suicide rates. Misinformation that spreads like wildfire…
The common denominator of all these phenomena is that they’re fueled in part by our seemingly innocuous participation in digital social networking. But how can simple acts like sharing photos and articles, reading the news, and connecting with friends have such destructive consequences?
These are the questions explored in the new Netflix docu-drama The Social Dilemma. Directed by Jeff Orlowski, it features several former Big Tech employees speaking out against the products they once helped build. Their reflections are interspersed with scenes from a family whose two youngest children are struggling with social media addiction and its side effects. There are also news clips from the last several years in which reporters decry the technology and report on some of its nefarious impacts.
Tristan Harris, a former Google design ethicist who co-founded the Center for Humane Technology (CHT) and has become a crusader for ethical tech, is a central figure in the movie. “When you look around you it feels like the world is going crazy,” he says near the beginning. “You have to ask yourself, is this normal? Or have we all fallen under some kind of spell?”
Also featured are Aza Raskin, who co-founded CHT with Harris, Justin Rosenstein, who co-founded Asana and is credited with having created Facebook’s “like” button, former Pinterest president Tim Kendall, and writer and virtual reality pioneer Jaron Lanier. They and other experts talk about the way social media gets people “hooked” by exploiting the brain’s dopamine response and using machine learning algorithms to serve up the customized content most likely to keep each person scrolling/watching/clicking.
The movie veers into territory explored by its 2019 predecessor The Great Hack—which dove into the Cambridge Analytica scandal and detailed how psychometric profiles of Facebook users helped manipulate their political leanings—by having its experts talk about the billions of data points that tech companies are constantly collecting about us. “Every single action you take is carefully monitored and recorded,” says Jeff Seibert, a former executive at Twitter. The intelligence gleaned from those actions is then used in conjunction with our own psychological weaknesses to get us to watch more videos, share more content, see more ads, and continue driving Big Tech’s money-making engine.
“It’s the gradual, slight, imperceptible change in your own behavior and perception that is the product,” says Lanier. “That’s the only thing there is for them to make money from: changing what you do, how you think, who you are.” The elusive “they” that Lanier and other ex-techies refer to is personified in the film by three t-shirt-clad engineers working tirelessly in a control room to keep people’s attention on their phones at all costs.
Computer processing power, a former Nvidia product manager points out, has increased exponentially in just the last 20 years; the human brain, meanwhile, hasn’t evolved at all, retaining the same capacity it’s had for hundreds of years. The point of the comparison seems to be that if we’re in a humans vs. computers showdown, we humans haven’t got a fighting chance.
But are we in a humans vs. computers showdown? Are the companies behind our screens really as insidious as the evil control room engineers imply, aiming to turn us all into mindless robots who are slaves to our lizard-brain impulses? Even if our brain chemistry is being exploited by the design of tools like Facebook and YouTube, doesn’t personal responsibility kick in at some point?
The Social Dilemma is a powerful, well-made film that exposes social media’s ills in a raw and immediate way. It’s a much-needed call for government regulation and for an actionable ethical reckoning within the tech industry itself.
But it overdramatizes Big Tech’s intent—these are, after all, for-profit companies that have created demand-driven products—and under-credits social media users. Yes, we fall prey to our innate need for connection and approval, and we’ll always have a propensity to become addicted to things that make us feel good. But we’re still responsible for, and in control of, our own choices.
What we’re seeing with social media right now is a cycle that’s common with new technologies. For the first few years of social media’s existence, we thought it was the best thing since sliced bread. Now it’s nosediving toward the other end of the spectrum—we’re condemning it and focusing on its ills and unintended consequences. The next phase is to find some kind of balance, most likely through adjustments in design and, possibly, regulation.
“The way the technology works is not a law of physics. It’s not set in stone. These are choices that human beings like myself have been making, and human beings can change those technologies,” says Rosenstein.
The issue with social media is that it’s going to be a lot trickier to fix than, say, adding seatbelts and air bags to cars. The sheer size and reach of these tools, the way they overlap with issues of free speech and privacy—not to mention how they’ve changed the way humans interact—means it will likely take a lot of trial and error to arrive at tools that feel good to use without being addictive, that deliver accurate, unbiased information in an engaging way without preying on our emotions, and that let us share content and experiences while preventing misinformation and hate speech.
In the most recent episode of his podcast Making Sense, Sam Harris talks to Tristan Harris about the movie and its implications. Tristan says, “While we’ve all been looking out for the moment when AI would overwhelm human strengths—when would we get the Singularity, when would AI take our jobs, when would it be smarter than humans—we missed this much, much earlier point when technology didn’t overwhelm human strengths, but it undermined human weaknesses.”
It’s up to tech companies to redesign their products in more ethical ways and stop exploiting our weaknesses. But it’s up to us to demand that they do so, to be aware of those weaknesses, and to resist becoming cogs in the machine.