Facebook is the largest social media platform ever created, boasting about 1.6 billion active users — more than the population of China — and a core tenet of its mission statement is to act as a democratizing force, giving users equal access to a platform to share content.
However, the social network’s commitment to being a truly open and equal platform was challenged earlier this month. An anonymous source claiming to be an ex-Facebook employee told Gizmodo that the company routinely suppresses conservative political issues in its Trending Topics section.
The suggestion that one of the largest drivers of traffic to news outlets could be effectively censoring certain political views drew criticism from conservatives and media commentators. Politico’s Tony Romm and Hadas Gold wrote, for instance, that Facebook "has worked feverishly to contain the fallout."
Robert Fronk, managing director of reputation strategy at Purple Strategies, contends Facebook’s initial response was not comprehensive. "They kind of let it happen [with] a little bit of news each day," he explains.
Ryan Brack, SVP at Mercury Public Affairs, agrees, saying, "Facebook should be commended for ending up in the right place, although it has taken several media cycles to get there."
The Menlo Park, California-based company initially said only that it had procedures in place to prevent bias, then issued a statement denying the allegations. Later, portions of internal guidelines were leaked to The Guardian. Facebook followed up with a statement detailing how the Trending Topics section is populated, including links to the RSS feeds it monitors, a list of news outlets it uses to verify breaking news and trends, and its 28-page Trending review guidelines. It also arranged a meeting with a number of conservative pundits, including The Blaze founder Glenn Beck and former White House press secretary Dana Perino.
Fronk says these moves were a good start, but due to the nature of the allegations, Facebook must continue to work to regain trust and convince users it is a platform where all voices can be heard equally. Other experts agree.
"This can’t be a one-and-done. [Facebook] can’t just have this conversation and be through," says Corey Ealons, a former Obama administration spokesperson and partner at Washington, D.C.-based public affairs firm Vox Global. "That’s PR 101."
Facebook is not a news outlet in a traditional sense; it doesn’t employ reporters or create original news content. But that doesn’t mean it has the luxury of ignoring allegations of political bias, says Dan Gillmor, who teaches media literacy at Arizona State University’s Walter Cronkite School of Journalism and Mass Communication.
"[Facebook is] a news aggregator. There are editorial decisions going on," Gillmor says, noting that Facebook employs a team of curators that chooses what stories are displayed in the Trending Topics section, and it creates algorithms that influence what content users see. "That’s an editorial function whether they want to acknowledge it or not...They had to respond to this because they were getting beat up by some politically powerful people."
Brack agrees. "Digital platforms that reach more than a billion users are a quasi news source because it’s where the eyeballs are," he says. "They are a media company, even though it’s social media."
The experts also generally rejected the notion that Facebook is so large it could afford to ignore the controversy and wait for it to blow over.
"CNN and Fox are two very real examples of this happening. There was a space in the market where folks felt their views weren’t being reflected, the topics they cared about weren’t being discussed, and Fox has taken advantage of that," Brack says. "Just because [Facebook] is a social media tool doesn’t mean it isn’t at risk of the same thing happening."
Gillmor adds there may be another motivation for the platform’s response beyond frustrated conservatives and the potential loss of users and revenue: the threat of government regulation.
"I think it’s much more likely they’re worried about [antitrust] regulation, which I firmly believe is becoming necessary," Gillmor says. "Facebook is becoming dominant in every way when it comes to what people see, hear, and watch relating to the news, and that’s very worrisome."
Several other cases illustrate why manipulation of content on a platform as large as Facebook could be of interest to regulators. In 2014, The New York Times reported that Facebook had conducted an unannounced psychological experiment involving more than half a million users, altering the amount of positive or negative content that appeared in their feeds to gauge the emotional impact. (Facebook’s terms of service permit it to run such experiments on users.) Some commentators called the study manipulative, creepy, and unethical. In a 2012 experiment, Facebook reportedly influenced voter turnout among groups it targeted with certain messages; the researchers behind the study concluded the effect could have decided close down-ticket races.
"Social media has sparked revolutions in countries in the Middle East. We’ve seen what Donald Trump has been able to do tweeting in the middle of the night versus Jeb Bush’s $130 million super PAC and TV ads," says Ealons. "It’s having a real impact on our political discourse. That’s why this issue matters, and whoever has the lever to control what content is shown to particular communities has a lot of power."
Behind closed doors
Facebook’s private meeting with conservative commentators has drawn mostly positive media coverage.
"Mark Zuckerberg really impressed me with his manner, his ability to manage the room, his thoughtfulness, his directness, and what seemed to be his earnest desire to ‘connect the world,’" wrote Glenn Beck after the meeting.
Ealons says the decision to include mostly conservative pundits, rather than more traditional leaders of the Republican Party, was "interesting."
"From a communications standpoint, from a PR standpoint, it appears [Facebook is] talking to people who have a bullhorn," Ealons says. "The type of people who have a platform, and they will take the message from the meeting back to those platforms and push it out to their broader community."
But Gillmor, who says he does not believe there was any anti-conservative bias at Facebook, argues that at least one invitee could open the company up to criticism.
"I retweeted somebody who pointed out that Glenn Beck is a conspiracy theorist, among other things," he says. "If it were me, I would not want my brand associated with him."
Anti-establishment and alt-right conservatives also criticized Beck and other media figures for attending the meeting.
"Facebook believes that a feel-good PR-controlled photo op with Beck, a couple of think-tank directors, and a few talking heads is going to dispel perceptions that the company is biased against conservatives," wrote Breitbart columnist Milo Yiannopoulos.
Brack, Ealons, Fronk, and Gillmor say Facebook will be able to avoid long-term brand or financial damage if it is transparent and open to discussion. Brack wants to see the discussion about Facebook’s size and reach turn into a larger debate about journalism in the age of social media with involvement from institutions such as the Poynter Institute, the Knight Foundation, and Columbia University’s journalism school.
Ealons and Fronk say there is an opportunity for Facebook to emerge with an even stronger brand image than before if it is open and educates its users and the broader public on its inner workings.
"Not only will it not lose customers and consumers, I think they may actually gain some," says Ealons.
A spokesperson for Facebook declined to comment for this article. The Republican National Committee had not replied to a request for comment as of press time.