Facebook Can No Longer Avoid Being a News Organization
A few months ago, Facebook came under a lot of criticism—including from me—for revelations about ideological bias in its "trending" news section, where recent Ivy League graduates would routinely suppress conservative news sources they didn't like.
Part of the reason this was a scandal is that Facebook had previously implied there was no human bias because it was all done with algorithms, which turned out not to be true. But in my comments, I pointed out that human involvement was really unavoidable.
For all of the swelling talk about algorithms and Big Data and artificial intelligence, covering the news still requires judgment, which means that it requires people who are making those judgments. Consider how easily white nationalists manipulated an “artificial intelligence” chatbot to pay tribute to Adolf Hitler. Without people involved in the decision-making process, a Trending Topics feature could easily spit out obviously fake news stories and a lot of irrelevant junk.
Yet taking people out of the process is exactly what Facebook tried to do. It responded to the controversy first by holding one of those internal inquiries in which you investigate yourself and naturally find yourself totally blameless, then by scrapping the whole system entirely, a tacit admission that it was biased. Supposedly this would return to the original promise of running everything with algorithms: "Facebook has laid off the entire editorial staff on the Trending team—15-18 workers contracted through a third party. The Trending team will now be staffed entirely by engineers, who will work to check that topics and articles surfaced by the algorithms are newsworthy."
And the result, in 3, 2, 1...
"Facebook Fires Human Editors, Algorithm Immediately Posts Fake News"
The Washington Post traces the origins of the fake story, a bogus report about Fox News host Megyn Kelly:
The trending “news” article about Kelly is an Ending the Fed article that is basically just a block quote of a blog post from National Insider Politics, which itself was actually aggregated from a third conservative site, Conservative 101. All three sites use the same “BREAKING” headline. The Conservative 101 post is three paragraphs long, and basically reads as anti-Kelly fan fiction accusing her of being a “closet liberal” who is about to be removed from the network by a Trump-supporting O’Reilly. It cites exactly one news source as a basis for its speculation: the Vanity Fair piece.
So what went wrong? Well, notice that in the description of how Facebook fired its editorial staff, it says that it kept an engineering staff to "check that topics and articles surfaced by the algorithms are newsworthy"—a job that actually requires, you guessed it, an editorial staff.
Here's what's going on. Facebook is becoming, for better or worse, an important and prominent conduit for news and political commentary, the equivalent in the Internet's "new media" of what the Big Three networks or the newspaper syndication services used to be. In addition to becoming a major transmission belt for news stories from across the Web, Facebook has even spawned its own bizarre internal ecosystem of political clickbait pages.
I call this the Zuckerberg News Bureau. And if Facebook is hosting a de facto news bureau, Mark Zuckerberg and the muckity-mucks at Facebook ought to be taking seriously the job of building a good news bureau, one that will be balanced and objective and trustworthy.
The problem is that this is hard. Here is how I described it at the time:
But getting people involved creates its own problems. I’ve worked in just about every facet of the new Internet media, including a lot of what we call “news aggregation,” which is what Trending Topics really is: somebody making judgments about what is legitimate news and interesting to readers, versus what’s noise. It requires some experience and judgment to sort the real stories from the fake, and doing it objectively also requires a scrupulous dedication to linking to articles and people that you hate. That means resisting the temptation to use some snobbish conception of what is a “respectable” news source as an excuse to exclude views you don’t like.
Facebook didn't want to do this work before, so it fobbed the job off onto a bunch of smug leftist 20-something Ivy League graduates who seemed to operate without much adult supervision. When that didn't work, it still didn't want to run a news bureau, so it tried to do everything with algorithms supervised by software engineers who lack the knowledge or experience to sort fake news from real news.
Facebook needs to realize that it is a media company and that, yes, it needs an editorial staff. It also needs to do the hard work of figuring out how to hire, train, and supervise that staff so that they don't use Facebook as a sandbox for their own pet political obsessions. That's difficult, and a lot of media organizations—all right, most of them—fail at it. But there are a few sites out there that actually do this well. I've worked for one of them, and I'm sure we could offer Mark Zuckerberg a lot of advice about what to do and what not to do.
But first he has to give up the fantasy that all of this can be avoided by picking the right algorithms. Facebook needs to start taking seriously its status as a media company—a very big and influential one—and to take just as seriously the need to be fair, balanced, and impartial in running that news operation.