While many lament social media as the source of any and all of society’s ills, few are aware of the mechanisms companies like Facebook have at their disposal to curate content and even make editorial decisions. With 62 percent of adults in the United States getting their news from social media, it is time to consider seemingly non-editorial social media networks—particularly Facebook—as legitimate media outlets that have a responsibility to the public to disseminate information that is at least somewhat rooted in fact, if not to create better and more nuanced editorial standards than traditional media outlets.
As previously discussed in an Argus article on the post-2016 media landscape, much of the media reporting in the aftermath of the election has focused on the rise of fake news, normally defined as an objectively false headline, supported by no corroborated sources, meant to garner as much web traffic and as many social media shares as possible. This is sometimes conflated with what has become known as “click-bait” journalism, yet there is a clear distinction. Click-bait is often reserved exclusively for the headline of an article. Conceptions of what makes a click-bait headline range from simply attention-grabbing and engaging headlines—also known as effective headlines—to headlines that have little to nothing to do with the story and the body of the article itself. The latter is a more accurate definition of click-bait journalism, while the former is most certainly convenient when one does not agree with an article or when one’s self-interest is at stake in the piece. The rise of fake news, however, began on Facebook after a June decision to change its news feed algorithm to prioritize whatever a user’s friends were reading over news from “traditional” sources.
In a New York Times piece from late June, media reporters Mike Isaac and Sydney Ember reported that Facebook had made the decision to move away from its coordination with media publications in its news feed in favor of promoting what users’ friends and family were sharing. Although the article mostly focused on the loss in web traffic many publishers would see as a result, the warning signs for the rise of conspiracy articles that could affect the outcome of the most consequential election in most Americans’ lifetimes were already there.
“The side effect of those changes, the company said, is that content posted by publishers will show up less prominently in news feeds, resulting in significantly less traffic to the hundreds of news media sites that have come to rely on Facebook,” Isaac and Ember write. “The move underscores the never-ending algorithm-tweaking that Facebook undertakes to maintain interest in its news feed, the company’s marquee feature that is seen by more than 1.65 billion users every month.”
This resulted in scores of fake news stories outperforming factually accurate journalism in the final weeks of the campaign, with headlines claiming that Pope Francis had endorsed Donald Trump for President (he didn’t) and that an FBI agent investigating Hillary Clinton’s emails had been found dead (there is no evidence of this). Several outlets have examined the evolution of these stories. Chris Hayes of MSNBC recently tracked Donald Trump’s erroneous tweet about “millions” of voters voting illegally back to Info Wars, and its source: a random guy on Twitter with fewer than 20 followers.
Yet, as President Barack Obama pointed out in a recent interview with New Yorker Editor David Remnick, in the modern media landscape, a conspiracy theory from a Twitter egg carries the same weight as a piece from The New York Times. And in many ways, the public has more confidence in their uncle on Facebook or a Twitter egg than they do in the nation’s most venerable publications. 2016 was the first time in 15 years that public trust in the media fell below 40 percent among Americans 50 and older.
As for millennials, who rely on social media for news at rates even higher than the general population’s 62 percent, a greater problem than fake news is the self-selection and fracturing of political beliefs that social media exacerbates. Because of how integral social media has become to millennials, social media companies need to take more responsibility for their role as media outlets when an entire generation is being informed primarily on their platforms.
The logistics of this are complicated, but given the nuance of and constant updates to the algorithms behind the Facebook news feed alone, companies with this level of capital have the ability to make editorial decisions with code. If they don’t, the cost will be borne not only by them, but by civil society as we know it.
Jake Lahut can be reached at jlahut@wesleyan.edu and on Twitter @JakeLahut.