Don’t look at Facebook, look at the face in the mirror
In December of 2010, struggling to find work and support his family, Mohamed Bouazizi started selling fruit at a modest Tunisian roadside stand. When his wares were seized by a municipal inspector, Bouazizi set himself on fire in protest. Reports of his self-immolation then spread like a firestorm on social media, which helped launch the Arab Spring revolts that raged for years against authoritarian Middle Eastern governments.
Starting in 2009 with the Iranian election protests, Facebook and social media in general have helped spread democratic ideals around the globe. Few would question that they’ve fulfilled Mark Zuckerberg’s vision of bringing the world closer together.
Yet one glimpse at the trolling, flaming and outright deception committed online is proof that these platforms, like any tools, can just as easily be used to destroy as to build.
Recent Russian tampering in the American presidential election is just one of many examples. Whether or not voters were flipped, the meddling sparked a huge public outcry, and caused executives from Facebook, Twitter and Google to be grilled by the U.S. Senate over a lack of self-policing.
But should social media really shoulder all the blame? Or could the real culprit be staring back at us in the mirror?
From Russia with love
The 2016 American presidential election has definitely shone a light on social media’s dark, dingy basement. Investigations by the U.S. intelligence community revealed that the Internet Research Agency, an obscure Russian troll farm known for spreading Kremlin-linked propaganda and fake news, had:
- Produced some 3,000 sponsored Facebook ads and 80,000 free posts from users posing as U.S. citizens
- Created politically polarizing Facebook pages like Defend the 2nd, LGBT Rights or BlackMattersUS
- Uploaded more than 1,000 incendiary videos on Google’s YouTube platform
- Launched tens of thousands of bots (automated, non-human text generators) from thousands of bogus accounts
While no one knows for sure whether these efforts impacted poll numbers, and Russian ad spending comprised a fraction of the election total, they did much to further divide an already polarized population, achieving a reach that would make even the most skeptical marketer rejoice:
- Some 1.4 million tweets from spurious sources were viewed 288 million times, accounting for one out of five election-based tweets
- Sponsored Facebook ads linked to Russian troll pages reached some 11.4 million people
- More astounding yet, unsponsored Russian free posts were seen and shared by more than 126 million people—almost 40% of the U.S. population
Many of these posts were provocative enough to spur people on opposite sides of the same issue to actually get up, grab their pitchforks, and face off at local political rallies.
Hate them if you must, but it’s hard not to admire the sheer scope, determination and effectiveness of the Russian initiative. You also cannot help but be impressed with their faith in social media’s power to rouse our animal spirits.
If you think that’s outrageous—
Social media has also helped spread other insane rumors such as the pope backing Donald Trump’s candidacy, or Hillary Clinton selling weapons to ISIS, stories whose headlines garnered thousands of shares.
But it gets better. In October of 2016, a rumor retweeted more than 6,000 times (often by bots) claimed a popular Washington D.C. pizza parlor was a front for a child sex ring linked to Hillary Clinton. “I just can’t hold back the truth anymore,” chimed propagandist Alex Jones on a YouTube clip viewed more than 400,000 times. “I think about all the children [she] has personally murdered and chopped up and raped.” All this hubbub compelled one thin-crusted man to drive all the way from North Carolina to investigate the rumors, and later fire his gun inside the D.C. restaurant.
Mea culpa, Congress
Responding to public pressure and death stares by U.S. senators, Facebook and company say they are taking steps to prevent this kind of misuse, and to help users know where their content is coming from. But many say much more should be done, like hiring legions of fact-checkers, or introducing algorithms that identify and de-emphasize corrupt material.
The problem is, these properties are making lots of money serving up stuff we say we like—content that aligns with our preferences and viewing habits. This has created a situation where much of what we consume online is judged less on its veracity, more on its ability to stop our scrolling, amuse or enrage us, then get us to stick and click.
Welcome to the new attention economy and the battle for our eyeballs.
How the heck did we get here?
There was a time when getting people to read your words wasn’t easy. News was published or broadcast periodically. Space and time were precious. Media was costly. So, five white guys would sit around a table and decide which stories were important, engaging or credible enough to grace that limited space. Through this careful though often biased vetting process, lesser or questionable stories seldom saw the light of day.
That all changed with the advent of the 24-hour news cycle, the internet, electronic publishing and online social communities. Demand for content went way up; barriers to becoming a publisher went way down—to practically zero. As a result, there was an explosion of media.
Now, with free, widely available social tools, and absent expert curation, practically anyone is about 12 minutes away from putting crazy thoughts in front of potentially millions of people. That includes insufferable soccer moms, deranged psychopaths, even Russian trolls.
This is your brain on social
Social media is great for posting pictures of babies and dogs, bragging about your vacation, or slinging opinions about this or that. But it was never intended to be an unimpeachable news source. Tell that to the more than 40% of U.S. adults who get their news on Facebook, according to a report recently published by the Pew Research Center and Knight Foundation.
Science also has something to say about why the Russian effort was so insidiously effective: our brains might not be at their most discerning when we’re on a social site. In fact, a recent study by Yale researchers found that “social media platforms help to incubate blatantly false news stories” through a phenomenon known as the mere-exposure effect. Basically, this and other research suggests that if we view something often enough, or in a familiar venue like Facebook, we have a tendency to believe it outright.
When people shun more established, reputable news sources, and suspend their critical thinking skills, perhaps it’s easier to believe that the pope is stumping for Trump, or that microwaving our iPhone can recharge its battery.
Time to grow up
The very things that make social media so powerful—the democratization of its tools; a general absence of censorship; its ability to connect us to the people, things and ideas we cherish—are also what make it so susceptible to abuse. Yet, in our rush to condemn Facebook and company, it would be wrong to significantly change any of these.
It’s also important to remember that social is still in its infancy—a kid compared to more established media. Of course, we should continue pressing these platforms to make improvements. But as users, maybe we also have some growing up to do.
Pennycook, Gordon, Tyrone D. Cannon, and David G. Rand. Prior Exposure Increases Perceived Accuracy of Fake News (August 26, 2017).