The whistleblower who shared a trove of Facebook documents alleging the social media giant knew its products were fueling hate and harming children’s mental health revealed her identity on Sunday in a televised interview, and accused the company of choosing “profit over safety”.
Frances Haugen, a 37-year-old data scientist from Iowa, has worked for companies including Google and Pinterest – but said in an interview with the CBS news show 60 Minutes that Facebook was “substantially worse” than anything she had seen before.
She called for the company to be regulated. “Facebook over and over again has shown it chooses profit over safety. It is subsidising, it is paying for its profits with our safety,” Haugen said.
“The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world,” she said.
The world’s largest social media platform has been embroiled in a firestorm brought about by Haugen, who as an unnamed whistleblower shared documents with US lawmakers and The Wall Street Journal that detailed how Facebook knew its products, including Instagram, were harming young girls, especially around body image.
US Senator Richard Blumenthal responded to the interview ahead of Haugen’s appearance to testify in Congress next week, saying in a statement: “Facebook’s actions make clear that we cannot trust it to police itself. We must consider stronger oversight.”
In the 60 Minutes interview Haugen explained how the company’s News Feed algorithm is optimised for content that gets a reaction.
The company’s own research shows that it is “easier to inspire people to anger than it is to other emotions,” Haugen said.
“Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads, they’ll make less money.”
During the 2020 US presidential election, she said, the company realised the danger that such content presented and turned on safety systems to reduce it.
But “as soon as the election was over they turn them back off, or they change the settings back to what they were before, to prioritise growth over safety, and that really feels like a betrayal of democracy to me,” she said.
“No one at Facebook is malevolent,” she said, adding that co-founder and CEO Mark Zuckerberg did not set out to make a “hateful” platform. But, Haugen said, the incentives are “misaligned”.
Facebook’s vice president of policy and global affairs, Nick Clegg, vehemently pushed back against the assertion that its platforms are “toxic” for teens, days after a tense congressional hearing in which US lawmakers grilled the company over its impact on the mental health of young users.
While Haugen did not draw a straight line between the decision to roll back safety systems and the US Capitol riot on January 6, 60 Minutes noted that the social network was used by some of the organisers of that violence.
During an appearance on CNN, Clegg dismissed the link.
“I think the assertion (that) January 6th can be explained because of social media, I just think that’s ludicrous,” Clegg told the broadcaster, saying it was “false comfort” to believe technology was driving America’s deepening political polarisation.
The New York Times reported on Saturday that Clegg sought to pre-empt Haugen’s interview by penning a 1,500-word memo to staff warning them of the “misleading” accusations to come.
Facebook has faced criticism that it fuels societal problems, attacks that Clegg said should not be laid at the company’s feet. But he acknowledged that people with pre-existing issues may not benefit from social media use.
He also disputed reporting in an explosive Wall Street Journal series that Facebook’s own research warned of the harm that photo-sharing app Instagram can do to teen girls’ well-being.
“It’s simply not borne out by our research or anybody else’s that Instagram is bad or toxic for all teens,” Clegg told CNN, but added Facebook’s research would continue.
Facing pressure, the company had previously announced it would suspend but not abandon the development of a version of Instagram meant for users younger than 13.