The Social Dilemma
Film writer Vickie Curtis '07 explains how the technology that connects us might destroy us.
The tech industry insiders who invented infinite scrolling, the Facebook "Like" button and so many of the other elements that help make social media so addictive didn't set out to radically alter the fabric of society. But they did.
Now, Vickie Curtis '07 is helping to warn the world. She is one of three writers of The Social Dilemma, a wildly popular Netflix docudrama that explores the dangerous human impact of social networking, as told by the very people who created the platforms.
"What they were realizing at the time we were filming wasn't that there were quirks to the thing they made that needed fixing; it was that they had helped create these three-billion-tentacled monsters that are actually shifting the course of human history," she said.
Curtis joined Austin Jenkins '95, an Olympia, Washington-based political reporter with the Northwest News Network, for a virtual conversation in November about the process of crafting the film, which premiered at the 2020 Sundance Film Festival and was released on Netflix in September. More than 400 alumni, parents, faculty, staff and students took part in the live event, which was hosted by Colorado College's Office of Alumni and Parent Engagement.
CC Magazine has edited the conversation for clarity and length.
Austin Jenkins: Can you tell us a little bit about the origin of this film?
Vickie Curtis: In January 2018, the director, Jeff Orlowski, organized a group of folks to meet with [former Google employee] Tristan Harris, who features as sort of a protagonist in the film. A lot of Jeff's friends were working for these companies in Silicon Valley, and more and more people were coming to him having left Twitter, Google and Facebook, saying, "I have regrets," or "Things have taken a turn for the worse and I want out." The more stories he heard, the more he was wondering, "Could this be a film?"
At first the thought was, "Is this a big enough issue? Does this just mean everyone's addicted to their phone, which we all already know?" There's no big reveal there. But the more experts we talked to, the more we realized it is so much bigger than [someone] being advertised to, or being addicted to the phone or looking at Facebook too much. It is really an existential threat that is tearing apart the fabric of society.
AJ: When I watched the documentary, I immediately thought of the 1999 film The Insider, which was about a former tobacco industry scientist who was ultimately convinced by 60 Minutes to tell the story of what was really going on inside Big Tobacco. I'm curious whether you think there is a comparison to be made between Big Tobacco of yesterday and Big Tech of today.
VC: There are ways in which I would say yes. It's an industry where the product is not aligned with the incentives of the user of that product. For Big Tobacco, they're making cigarettes that we now know, and they did know, cause cancer. I haven't met any cigarette smokers who smoke cigarettes to get cancer. It's not an outcome they're looking for. With tech, there's a similar thing, where the product is these platforms that are addictive and that are misinforming us, making us more narcissistic, more anxious, more depressed, and that are tearing apart some of our institutions. For that reason, I would say that it's probably more dangerous than tobacco, because it isn't just having effects on individuals, it's having effects on larger institutions as well.
AJ: There are all these light bulb or "aha" moments when you're watching the film, and one is that social media, and tech in general, is the only industry outside of the drug industry that talks about its customers as "users." I was also struck by the line, "If you're not paying for the product, you are the product."
VC: A great question for us at the beginning of the filmmaking process was, "If these companies are worth hundreds of billions of dollars, why? We're not paying them, so who is paying them?" Advertisers. Advertisers are paying them to target their advertisements to people who will be the most susceptible to that advertisement.
To figure out who's most susceptible, Facebook and Google create an avatar of you, which is a collection of up to 29,000 data points about what kind of person you are. They're monitoring things like how fast you scroll, or how you move your mouse, or how long it takes you to absorb an article, or which things you've clicked on in the past or what time of day you click on certain kinds of information. Google Maps and Google phones are sending back to Google headquarters all of your real-life habits of where you physically go.
They have a ton of data on who you are as a person, and then they can group you and say, "Okay, there are 29 other people just like Austin in his neighborhood, and they all are doing this, so why don't we advertise that to Austin, too? We know he's likely to be susceptible to that thing."
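The grouping Curtis describes is essentially a similarity search over behavioral profiles. A minimal toy sketch of the idea follows; the signal names, numbers and user labels are invented for illustration and have nothing to do with any real platform's internals:

```python
import math

# Toy behavioral profiles: each user is a vector of tracked signals
# (e.g., scroll speed, article dwell time, late-night click rate).
# Real platforms reportedly track thousands of such data points;
# these three are made up for the example.
profiles = {
    "austin":    [0.8, 0.3, 0.6],
    "neighbor1": [0.7, 0.4, 0.5],
    "neighbor2": [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two behavior vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Rank everyone else by how closely their behavior resembles Austin's;
# ads that worked on his closest matches get shown to him too.
target = profiles["austin"]
lookalikes = sorted(
    (name for name in profiles if name != "austin"),
    key=lambda name: cosine(target, profiles[name]),
    reverse=True,
)
print(lookalikes[0])
```

The point of the sketch is only the mechanism: once behavior is encoded as numbers, "29 other people just like Austin" is a nearest-neighbor lookup.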
And on one level, people are like, "Oh, well, this just means the advertisements I see are relevant to me, so, great, this is a pair of shoes I want to buy." But the algorithm isn't perfect, nor is it trying to figure out what shoes you want to buy. It's trying to figure out how to keep you on the platform longer, what gets you hooked. It preys on our fears, doubts and insecurities, so it's going to show us more outrageous, salacious, fearmongering information in order to keep us there so that we will see more ads, click on more ads. That makes data this really powerful tool for shifting and manipulating people's beliefs and behaviors.
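The engagement objective Curtis describes can be made concrete with a toy ranking sketch. The posts, scores and field names below are invented; the only point is that sorting a feed by predicted engagement, rather than by accuracy or relevance, mechanically pushes the most provocative item to the top:

```python
# Hypothetical feed items with a model's predicted minutes of
# engagement attached (numbers invented for illustration).
posts = [
    {"title": "Local bake sale raises funds", "predicted_minutes": 0.4},
    {"title": "Outrageous claim you won't believe", "predicted_minutes": 2.1},
    {"title": "Calm, nuanced policy analysis", "predicted_minutes": 0.7},
]

# The objective is time-on-platform, so the feed is sorted purely by
# predicted engagement; nothing in this objective rewards being true.
feed = sorted(posts, key=lambda p: p["predicted_minutes"], reverse=True)

for post in feed:
    print(post["title"])
```

Under this objective the outrage-bait item ranks first and the bake sale last, which is the dynamic the film argues plays out at scale.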
AJ: I read during the Cambridge Analytica scandal that they could get enough data points to eventually know more about you than you know about yourself.
VC: Absolutely, because a lot of it is subconscious. Like, I don't know how fast I scroll; I don't know what my mouse-click patterns are; I don't know what my personality profile is based on those habits of mine. They have a whole understanding of your personality profile based on the information that they've collected on you, and that determines which particular conspiracy theory to show you next.