THE EVENTS OF THIS WEEK HAVE SHOWN THAT SOCIAL MEDIA NEEDS A WARNING LABEL. NO, REALLY.

Genevieve Thiers
17 min read · Jan 8, 2021


You know those FDA warning labels they put on cigarettes?

We need them on social media sites too, and here’s the argument to get it done.


It’s been a terrible few days in our house. Like a lot of us, I entered the New Year with plans. I was going to lose the quarantine weight, stop snapping at my twins and somehow manage to keep the house clean despite huge odds. I kicked into gear on Jan 4. But by Jan 6, I was crying over ice cream with a Manhattan in hand, staring in disbelief at my news feed across from my husband who gave up on work for the day because watching an attempted coup appeared to be more important.

The scariest thing? Most of us know that we’re watching what could be the end of America and the end of democracy, but we can’t figure out how to stop it.

Sure, it’s not going to happen this month. Pelosi is calling for ousting Trump, and Biden and Harris are close on the horizon. But we all have the pervasive feeling that Trump is setting up for something in the media space, and that it’s going to be worse than Fox News. He’s rebranding himself as a media CEO, and he’s not afraid to blow up the country in the process. A CEO cannot be seen as a loser. Trump will burn it all down to make sure he enters his new gig as a winner, and he’s taught a legion of followers like him how to do the same. And if he will burn it all down this dramatically just leaving office, imagine what he will do once he’s out and in the saddle of something he can run without the eyeballs of a nation on his every move.

I refuse to believe that there is not an answer to fix democracy in this country before it collapses. I hate the doom-spreading and the doom-scrolling we are all doing. I long to find a real set of solutions in this area. I simply cannot fathom that we cannot form a set of warnings and regulations (and even a Constitutional amendment?) that might be able to stop companies from creating environments where trust drops and propaganda runs wild — but to make that argument, I’ve had to detour through philosophy, epistemic bubbles, skeuomorphism, dopamine reward systems and more. I have some experience in tech that shapes the lens I am using to write this. I founded a startup called Sittercity.com 20 years ago with the goal of making a mash-up of a nanny agency, an online dating site and an online jobs board, to allow moms to instantly see and hire safe caregivers in their area. Fast forward 20 years and two twins of my own later: it did work, and we brought ease to our customers via an internet model that improved things in a very real way.

But…things in the tech world can also go the opposite way, as all of us have just seen. Thanks to the tech world, the Capitol was just invaded by an army of tens of thousands of Trump supporters demanding justice for what they think was a botched election. If it appears that they are living in an alternate universe, it’s because they are. In their universe, Trump won. In their universe, Democrats are pedophiles, vaccines are bad, climate change is not real, and it’s fine to take this out on Nancy Pelosi’s office by putting your feet up on her desk or stealing a famous podium.

How did things get this bad? And what can we do about it to save this country that has been a beacon of hope for so long?

Let’s split these questions up.

First, how did things get this bad?

Most will point immediately to echo chambers as the cause of our plight. But echo chambers are a secondary effect of what is happening online, and the real problem is a lot more chilling. To set the stage here, let me explain to you three things first, beginning with the rise of “Don’t Make Me Think” thinking.

It’s important to know that when tech builders first saw the rise of the internet and flooded onto it to optimize industries that had never been optimized, they all thought that people would gain more exposure to dissenting opinions. They thought it would be the ultimate way to find problems, collaborate on them and tackle them together. And so the pursuit of making things easier really was built around the consumer and what they wanted, down to the minutest degree. I remember that in those heyday years the Valley lionized a book called “Don’t Make Me Think” by Steve Krug, which argued that websites needed to be so easy to use that they were intuitive to users and required no thought to engage with. Not only were we supposed to not make users think when they hit our sites; we were not supposed to expect them to read. Ferocious UX teams set up tools that captured users’ eyeball flicks as they scanned a page, producing heat maps that showed us where to place each element of a website so that actual words were not a barrier to entry. Everything — everything — pointed to getting that user to enter information and click “buy.”

The second thing to understand is that alongside this, another strain of thought arose among cellphone designers, one that had to do with capturing emotion inside of intuitive processes by mirroring the real world. Called skeuomorphism, this thinking can be seen in the fact that your iPhone probably took hours less time to learn than your PC. That shutter sound after a picture is taken? It mimics a camera. Swiping in Kindle to turn a page? It mimics the layout of a physical book. Apple Pay even gives you an audible “cha-ching” when a transaction happens. All of this was meant to shortcut the learning curve of a new product and use real-world brain cues to form usage patterns instantly for a new user.

Finally, when you combine “Don’t Make Me Think” with skeuomorphism, or the capturing of emotion in online experiences, you begin to see how far builders could (and would) go to sell something. And while these principles generally worked when it came to devices, they crashed when it came to media. Because in media, we can’t run on pure emotion, or we get mob rule. We have to think rationally and in a balanced way in order to make decisions, to compromise, and to lead. These tech tendencies crashed into a media industry that was not prepared for any of them. And the impact was cataclysmic. Hit by a rise of citizen journalists and sudden, huge competition for attention and eyeballs, news media struggled to monetize itself when the internet appeared. Ad revenue worked for a while, but as web usage spread across households and the supply of content exploded, rates flattened and news media was left floundering, grappling with how to get eyeballs and paring stories down to the bare minimum to try to appeal to users. Dying, media outlets turned to clickbait, frantically presenting anything that would make a splashy headline, whether it was true or not. Some went increasingly niche in an attempt to survive. Publicists leaped to take advantage of the decline. Ryan Holiday, author of “Trust Me, I’m Lying,” wrote that he would create a trail of posts tied to an outrage piece to help a story ascend: “Trading up the chain is a strategy that I developed that manipulates the media… I can turn nothing into something by placing a story with a small blog that has very low standards, which then becomes the source for a story by a larger blog, and that, in turn, for a story by larger media outlets. I create… a ‘self-reinforcing news wave.’”

As newspapers and outlets shuttered all over America in the late 2000s and beyond, it opened the door for the unthinkable. A country that was used to the reliable, steady reporting of Katie Couric and Dan Rather found itself confronting a growing chant that truth itself was malleable. Not only did Trump immediately begin to attack the mainstream press; on January 22, 2017, Kellyanne Conway presented an “alternative fact” on network news. This was a glaring alarm that not only was this President going to lie like a rug; he was going to redefine truth itself for anyone who would believe him or his team.

But no one realized this until too late.

Now let’s talk echo chambers. Because while everyone points fingers at them, they are not the real culprit here. The culprit is what social media does before we reach the echo chambers. According to the research of C. Thi Nguyen, a professor of philosophy at Utah Valley University, echo chambers are a secondary effect of what are called “epistemic bubbles.” In an article in The Conversation, he explains that “an epistemic bubble is what happens when insiders aren’t exposed to people from the opposite side. An echo chamber is what happens when insiders come to distrust everybody on the outside.”

So, in other words, once you find yourself inside an epistemic bubble, you are only seeing views and opinions that support your own. This is the first thing that happens on social media, and it happens without our knowledge or consent. The echo chamber comes second, and we enter it with the illusion that we had a choice in the matter, but did we? None of us were given an FDA-style warning label when we entered these sites saying “Warning: Use of this site might be damaging to your mental health and might lead to alternative facts and the downfall of democracy.”

Maybe we should have gotten that warning.

This is where the smoking analogy comes in. The original smokers were groomed by big tobacco’s marketing to think what they were doing was cool, healthy and smart. They only realized later, to their horror, that it was expensive, addictive and would kill them. Sound familiar? Social media is the same. Putting us in an epistemic bubble is like giving us 40 free cigarettes and then shoving us back out into the world and saying, “OK, now have no more of those.” Most of us walked into all this thinking it was a good thing that could bring people together, and accepting blame for our own behavior once inside, but once you see it all in two parts, it all begins to look suspect.

Let me explain how thorough the creation of an epistemic bubble is on Facebook, so that you understand how cemented your echo chamber is before you even walk in its door. First of all, before you even join Facebook, it knows things about you. It might already have your image tagged in a photo someone else took. Your name might already have been mentioned in a friend’s post. You don’t even need to be on it to be a part of it. Then, when you do decide to join, it can make innumerable inferences the second you arrive, simply from who you know. We tend to flock with like minds, so that group often comes with you into the social experience; and even before it knows your friends, it will have an idea of who you might know just from your area, your school and your spouse. By the time you start to post anything, which is where people traditionally think the info-gathering begins, you are already ten steps down the pike. The algorithm doesn’t need a single post to categorize you into one of the several psychological-political groups that Facebook uses to sort its users. Once categorized, you are going to see, for example, all-liberal news, media sources and opinions if you landed in that group, or ultra-conservative news, media sources and opinions on the opposite side. This is what we do not choose. This is the epistemic bubble, and it’s scary that so much of it happens without our knowledge or consent.
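To make the mechanism concrete, here is a deliberately tiny sketch of how a feed ranked purely for agreement produces a bubble. Everything here is hypothetical illustration — the function names, the -1 to +1 “leaning” scale and the scoring rule are invented for this example and are not Facebook’s actual code or categories:

```python
# Hypothetical sketch: an engagement-driven feed filter forming an
# epistemic bubble. All names and numbers are illustrative inventions.

def infer_leaning(friend_leanings):
    """Guess a user's leaning (-1 to +1) from friends alone,
    before the user has posted a single thing."""
    return sum(friend_leanings) / len(friend_leanings)

def rank_feed(posts, user_leaning):
    """Rank posts so agreeable content (closest to the inferred
    leaning) floats to the top -- the bubble-forming step."""
    return sorted(posts, key=lambda p: abs(p["leaning"] - user_leaning))

friends = [0.8, 0.6, 0.9]       # a mostly like-minded social graph
user = infer_leaning(friends)   # inferred before any post exists

posts = [
    {"headline": "Agreeable take", "leaning": 0.7},
    {"headline": "Neutral report", "leaning": 0.0},
    {"headline": "Opposing view",  "leaning": -0.8},
]
feed = rank_feed(posts, user)
# The opposing view sinks to the bottom of the feed.
```

Note that nothing requires the user to post anything: the inferred leaning comes entirely from the social graph, and the ranking step quietly buries the dissenting view.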

When first hearing this, it seems like there should be an easy solution. Can’t Facebook simply remove the filters we’re grouped into and tell its investors to take the hit? Or, if it won’t, why don’t we mandate that it take the filters off?

This will not work. For starters, the fallout from removing the filters would be catastrophic to Facebook’s business: the lost ad revenue alone would be staggering. And there is an army of lobbyists, researchers, board members, VCs, LPs and C-levels making sure that no one ever screws with the company’s primary revenue source. There is a window in which a company can decide to do good and make a real pivot. Facebook is light years past that window.

Second, Facebook argues that the filters are critical to keep from breaking our brains. In fact, the company blatantly experimented on nearly 700,000 users like lab rats in 2014 to see if changing their news feeds would affect their moods (spoiler alert: it did). The users had no idea. Who reads a Terms of Use agreement thinking that they will be openly experimented on? Researchers condemned the experiment, stating that it harmed participants. Facebook, of course, argues that the filtering is critical because there would be too many stories and opinions to wade through in an unfiltered feed, not to mention the quiet categorizations going on underneath it all, based on psychology, consumer behavior, location and political leaning. Want to see an example? Go to facebook.com/ads/preferences in your browser. (You may have to log in to Facebook first.) That will bring you to a page with your ad preferences. Under the “Interests” header, click the “Lifestyle and Culture” tab. Then look for a box titled “US Politics.” (Or click “see more” if it is missing.) In parentheses, it will describe how Facebook has categorized you, such as liberal, moderate or conservative. Facebook talks a great game about why this happens. It backs up its categorizations of us with numerous studies, data points and impassioned articles from its researchers and developers indicating that this is all important to our health. But that’s what tobacco did too, at first. Before 1950, cigarette companies proudly placed doctors front and center in their ads, cigarettes in hand.

But finally, Facebook shutting down the epistemic bubbles and echo chambers we are used to would be a huge mental blow now that we know they exist and are hooked on them. Millions of users would be jerked out of the warm bath we’ve been marinating in for over a decade.

You see, echo chambers mimic the highs of a drug. The social media world is built not only on gathering users into its fold but on hooking their eyeballs and wallets with a drug-like need for more. The Netflix documentary The Social Dilemma explains how social media uses the pleasure/reward centers of the brain to hook people, and especially kids, on its experience. This can mimic the effects of addiction, as any of us with kids who have experienced smartphones or Nintendo already know. Echo chambers are the same. A dopamine hit occurs with validation of one’s popularity, and there is no difference between getting 1,000 likes on your recent vacation photo and getting huge likes on a Facebook post calling for this or that politician to be thrown out.

Assistant Professor of Philosophy Bert Baumgaertner at the University of Idaho has been researching echo chambers since at least 2014, and his 2016 paper “Opinion Strength Influences the Spatial Dynamics of Opinion Formation” in The Journal of Mathematical Sociology breaks down just how far echo chambers can go in skewing opinion. Baumgaertner points out that the amplification of personal opinion occurs naturally when other people agree with you, and the validation that offers — or, in other words, the dopamine hit — can begin to erode the need for real facts. But his most interesting finding was that polarization can occur with even the slightest bit of amplification. Individuals with the most extreme opinions and the least tolerance for compromise become the most influential, even if there are very few of them. No one moves toward the center, in other words, inside an echo chamber. Like online reviews, where almost no one writes a review after a middling experience, only after an extreme one, the most opinionated people resonate the most inside an echo-friendly space.
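The dynamic above, where a few intransigent extremists drag a tolerant majority toward them, can be illustrated with a toy simulation. To be clear, this is my own minimal sketch in the spirit of bounded-confidence opinion models, not the model from Baumgaertner’s paper; the tolerance and pull parameters are invented for illustration:

```python
# Toy bounded-confidence opinion model (illustrative, not Baumgaertner's model).
# Each agent drifts toward the average of opinions within its tolerance;
# "stubborn" extremists never update their own opinion.

def step(opinions, stubborn, tolerance=0.3, pull=0.1):
    """One round of updating. Agents only 'hear' opinions within
    `tolerance` of their own -- the echo-chamber filter."""
    new = list(opinions)
    for i, o in enumerate(opinions):
        if i in stubborn:
            continue  # extremists refuse to move
        neighbors = [p for p in opinions if abs(p - o) <= tolerance]
        new[i] = o + pull * (sum(neighbors) / len(neighbors) - o)
    return new

# Three flexible moderates and one stubborn extremist pinned at 1.0.
opinions = [0.4, 0.6, 0.8, 1.0]
stubborn = {3}
for _ in range(500):
    opinions = step(opinions, stubborn)

# After many rounds the extremist has not moved at all, while the
# moderates' average has drifted toward the extreme, even though the
# extremist is outnumbered three to one.
```

The design choice doing the work is asymmetry: the moderates listen and compromise, the extremist does neither, so the only fixed point the coupled group can settle into sits at the extremist’s position.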

And this can lead to ruin. The downfall of polarized societies, Baumgaertner said, is that they are “less receptive to the truth. Collective action depends on us being able to reach some kind of consensus (sic)…If we’re highly polarized and the thing creating this polarization has to do with our population structure, it will be more difficult to get that consensus.”

We are radicalized now, and we don’t even know it. And it feels good. Pulling us out of it all of a sudden would be like yanking a user out of a cocaine high. Nguyen points out in The Conversation that the real danger of echo chambers is that they create cults. They not only create a constant repetition of an idea; they give members of the chamber full arguments for refuting the most standard tenets of what used to be believed. For example, vaccine deniers are not only telling people not to get vaccines; they are presenting full, standardized counterarguments, refuting them with group-think, and being supported by news that is at best skewed and at worst totally false. In their book “Echo Chamber,” Kathleen Hall Jamieson and Joseph Cappella describe how Fox News was the first to realize that if the world was presented as a simple binary between good and evil, it was both easier to understand and easier to manipulate. Anyone outside the presented thinking, and the reasoning behind it, was not only wrong; they were evil. You could not trust anyone on the outside. Only those on the inside were good. (Nguyen also points out that echo chambers can form around any area — for example, parenting. As a mother, I can personally tell you that this could not be more true. Head onto any online forum, ask a question about breastfeeding, and watch the explosion.)

Misinformation is scary. But mistrust? That’s deadly. And it’s happening all around us, in droves.

The hit is clear. A 2019 Pew Research poll showed that 68% of Americans say it is very important to repair the public’s level of confidence in the federal government, and 58% say the same about improving confidence in fellow Americans. About half of Americans (49%) link the decline in interpersonal trust to a belief that people are not as reliable as they used to be. We’ve lost confidence in each other. And without confidence and trust, we’re never going to be able to face the challenges ahead in 2030 and beyond — especially not automation, which is looming somewhere in the next 10–40 years, depending on which expert you talk to.

The argument above lays out what is happening, but not how to fix it. To be clear, the solutions on the table at the moment are paltry. Sure, we can throw out Trump, and Pelosi is right to call for it, but that will only radicalize his followers further by giving them a martyr. Sure, Facebook can block Trump from the Facebook and Instagram platforms for 13 measly days while he remains in power and claim that’s enough, despite its platform and model being largely responsible for this gigantic mess. Sure, we can hope that with the Biden-Harris inauguration on January 20, things will go back to normal. But none of it will shake the feeling a lot of us have — rightly — that things are going to get worse in ways we can’t see. Why? Because the underlying problem, laid out above, is so vast and so unaddressed.

So how do we see past these band-aids and aim at the underlying problem?

Well, for starters, let’s imagine for a moment that each American has the right not to be placed in an epistemic bubble/echo chamber combination against their will or without their knowledge. Let’s go further and say that these two things — let’s call them “propaganda centers” — are actively harmful to our health. Let’s say that they increase paranoia, cause addiction and act as grooming processes for echo chambers with long-lasting, negative effects on the user. And let’s point out that since their rise, tens of thousands of deaths, at the very least, might have been caused by them on American soil (and a myriad more internationally).

That’s a start.

When the cigarette industry was finally called out for the immense harm it caused, it grudgingly became regulated and made some paltry recompenses. The FDA slapped warnings all over cartons and ads, and settlements were reached with the biggest fighters. It took half a century to take the industry down, and it fought the whole way. But it fell. Various things have since been labeled the “new tobacco,” with good reason, among them processed foods and soda, both of which have caused untold numbers of deaths from obesity-related conditions like diabetes and heart failure. The argument that social media belongs in this category might at first seem weak, but can we not hold the effects of epistemic bubbles and echo chambers accountable for COVID-19 deaths? How many anti-maskers — some at the very invasion of the Capitol this week — are about to infect thousands more in their hometowns after leaving DC? How many have already infected others by refusing to wear a mask, get vaccinated or distance? In this light, it’s harder to see a difference. And that’s not even to mention the harm caused by every other kind of propaganda spread or absorbed on these platforms.

So — while it’s hard to believe that anything Facebook could offer as a settlement could make up for the last four years, we could at least start with a warning. Depending on the product, there are now 11 to 13 required warnings placed on cigarette labels and ads. So we have a clear template to work from.

At the very least, we can begin with a label explaining that social media consumption inside of propaganda centers is aimed at the brain’s dopamine centers and can promote addiction to such content and its amplification within echo chambers. And it might work. People managed to wean themselves off cigarettes after watching loved ones die hacking their lungs out in hospitals, and after the highly successful campaigns of the past few decades showing smokers’ lungs in great, gory detail. The only issue here is that we can’t wait for a total societal collapse to point to the dangers of propaganda centers. The storming of the Capitol by an unruly mob needs to be enough. Trump’s gaslighting and outrageous social media usage have conditioned us to believe that this is just the next thing in a series of insane events. It’s not. It’s a definable collapse, and naming it as such can inform swift and decisive action.

In the boldest version of this — creating a right to form opinions outside of propaganda centers — a constitutional amendment could even be in order. I would leave the drafting to someone more knowledgeable, but even the First Amendment is not immune to changing times and changing methods of communication, consumption and commerce.

Think it’s impossible to get propaganda centers labeled the same way as cigarettes? Remember, there are a few times in history when things get so insane — like the last few days — that any and all solutions on the table are welcome. We’re so mad right now that we’re talking about more than impeachment; we’re talking about removing a President. Is a warning label really that far off?

I welcome any and all opinions to this article. My personal email is gftdiva@gmail.com.

Genevieve Thiers is the former Director and Founder of the NewFounders group in Chicago, which gathered over 2000 leaders in the Midwest from 2016–2019 including Senators, activists, builders and more. She is the tech trainer on the political reality show RUN the Series, found at www.runtheseries.com and now on Amazon Prime. She is the master editor and instructor of the TechYourself Guide — a playbook on how to use tech to win. Diverse and female candidates can access the online courses, book and resource sheets by contacting Thiers or finding more at www.techyourself.org. She is a successful entrepreneur, winner of over 20 major awards in entrepreneurship, and has sung opera on major stages around the world. See more at www.genevievethiers.com.

SOURCES

https://www.pewresearch.org/politics/2019/07/22/trust-and-distrust-in-america/

https://edu.gcfglobal.org/en/digital-media-literacy/what-is-an-echo-chamber/1/

https://theconversation.com/the-problem-of-living-inside-echo-chambers-110486

https://www.psychologytoday.com/us/blog/brain-wise/201802/the-dopamine-seeking-reward-loop

https://www.uidaho.edu/class/politics-and-philosophy/news-and-events/echo-chamber

https://medium.com/macoclock/skeuomorphism-the-secret-behind-apples-success-7b7e06348e4c

https://time.com/5926883/trump-supporters-storm-capitol/

https://www.fda.gov/tobacco-products/labeling-and-warning-statements-tobacco-products/cigarette-labeling-and-health-warning-requirements

https://www.fda.gov/tobacco-products/health-information/nicotine-addictive-chemical-tobacco-products

https://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say


https://www.nytimes.com/2016/08/24/us/politics/facebook-ads-politics.html
