Facebook whistleblower accuses company of putting profit before safety

Frances Haugen, who left the company earlier this year, was interviewed on CBS, claiming 'incentives are misaligned' at Facebook.

The whistleblower, whose leaks formed the basis of The Wall Street Journal report series The Facebook Files, has said the social media giant prioritised "growth over safety" in an interview last night on the CBS news show 60 Minutes.

Before leaving the company this year, Frances Haugen, a former product manager, copied tens of thousands of pages of Facebook internal research. 

The WSJ reports revealed that Facebook’s own research had found:

  • The company had different procedures in place for high-profile users who violated its rules

  • Instagram was having a damaging effect on the mental health of many teenage users, although Facebook pointed out in a response that, on most measures, the app was more likely to have a positive than a negative impact

  • Facebook founder Mark Zuckerberg resisted changes suggested by colleagues, fearing that they would harm engagement

In the interview, Haugen said: “Facebook, over and over again, has shown it chooses profit over safety. It is subsidising, it is paying for its profits with our safety. I'm hoping that [these revelations] will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. That's my hope.”

Responding to this claim, Lena Pietsch, Facebook's director of policy communications, said: “The growth of people or advertisers using Facebook means nothing if our services aren't being used in ways that bring people closer together; that's why we are investing so much in security that it impacts our bottom line.

“Protecting our community is more important than maximising our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016.”

Haugen said that when she joined Facebook in 2019, having previously worked at Google, Yelp and Pinterest, she took the job on the condition that she could work on tackling misinformation, saying that she had lost a friend to online conspiracy theories. 

She was assigned to Facebook’s Civic Integrity unit, which worked to protect elections. But following the 2020 U.S. presidential election, it was dissolved, although Facebook says its work was distributed to other units.

Haugen’s lawyers have filed a series of complaints to the Securities and Exchange Commission, on the basis that Facebook has withheld information that could negatively affect its investors. 

John Tye, founder of legal group Whistleblower Aid, said: “As a publicly traded company, Facebook is required to not lie to its investors or even withhold material information. So, the SEC regularly brings enforcement actions, alleging that companies like Facebook and others are making material misstatements and omissions that affect investors adversely.”

In response to this, Pietsch added: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

Despite her decision to go public, Haugen spoke in defence of Zuckerberg.

“I have a lot of empathy for Mark, and Mark has never set out to make a hateful platform,” she said. “But he has allowed choices to be made where the side effects of those choices are that hateful, polarising content gets more distribution and more reach.”

She added: “It's one of these unfortunate consequences, right? No-one at Facebook is malevolent, but the incentives are misaligned, right? Like, Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”

In a statement provided to 60 Minutes, a Facebook spokesperson said: “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”

The spokesperson added: “If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”

This story first appeared on campaignlive.co.uk. 
