Facebook still has a huge problem with spreading misinformation throughout the world, said Frances Haugen, the former Facebook product manager turned whistleblower.

“The problem at Facebook is not about bad people or bad ideas; it’s about giving the most reach to the people with the most extreme ideas,” Haugen said.

Haugen spoke at South by Southwest on Tuesday morning. Facebook CEO and founder Mark Zuckerberg appeared virtually in an afternoon session but didn’t address any of the issues Haugen raised. Instead, he focused on the company’s rebranding to Meta and its push to build tools for the emerging augmented reality and virtual reality 3-D world known as the Metaverse.

In her talk, Haugen said she joined Facebook in 2019 after a Facebook recruiter reached out to her. She worked on a team focused on civic misinformation. The issue was deeply personal, she said: she had lost touch with a close friend who went down a misinformation rabbit hole.

Before joining Facebook, Haugen worked as a product manager at Google, Pinterest, and Yelp. Her job largely focused on algorithmic products and recommendation systems like the one that powers the Facebook news feed.

In her job, Haugen began to see Facebook prioritize profits over safety. She realized that the company was also misrepresenting to the public, investors, and the government how it was dealing with hate speech, violence, teenage mental health, human trafficking, and misinformation on its platform.

So, in 2021, Haugen became a whistleblower and gave tens of thousands of Facebook’s internal documents to the Securities and Exchange Commission.

“It’s really easy to be dismissive of the severity of misinformation,” Haugen said.

Without shared facts, she said, people lose the ability to discern fact from fiction.

“In the case of the truth, there is only one truth,” Haugen said.

Zuckerberg believed AI could solve the problem of online misinformation, but it only made the problem worse, Haugen said. All artificial intelligence and machine learning systems are biased, she said, calling them “hill climbers”: they consume information and optimize for the content that gets the most reactions.
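To make the “hill climber” idea concrete, here is a minimal, hypothetical sketch in Python; it is not Facebook’s actual code, and every name and number below is invented for illustration. It shows a feed ranker that sorts posts purely by a predicted-engagement score, the single-metric optimization Haugen describes.

    # Hypothetical sketch of an engagement-only "hill climber" feed ranker.
    # Nothing in this objective weighs accuracy, safety, or polarization.
    from dataclasses import dataclass

    @dataclass
    class Post:
        topic: str
        predicted_reactions: float  # a model's engagement estimate (invented)

    def rank_feed(posts):
        # Greedy hill climbing toward one metric: sort by predicted reactions.
        return sorted(posts, key=lambda p: p.predicted_reactions, reverse=True)

    feed = rank_feed([
        Post("measured local news", 1.2),
        Post("outrage-bait rumor", 9.7),
        Post("friend's vacation photos", 3.1),
    ])
    for p in feed:
        print(f"{p.predicted_reactions:5.1f}  {p.topic}")
    # The rumor tops the feed: optimizing a single reaction metric
    # hands the most reach to the most provocative content.

Under this toy objective, the most provocative item always wins, which is the dynamic Haugen says pushes extreme content to the top of the feed.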

“The scary thing is that fake news is way more compelling than real news,” Haugen said.

That’s why third-party fact-checking and AI-driven moderation do not work, Haugen said. The systems, as written, are not designed for scale, she said.

Google and Apple are transparent about how they operate, but Facebook is not, Haugen said.

“There is much less incentive for Apple to lie to us,” she said.

But Facebook is different, Haugen said. It is a closed system, and each of us has a different experience on the platform, she said.

Facebook had been warned by an activist five years before 2020 that human trafficking was happening on its platform, Haugen said. When she brought it up with Facebook executives, they told her over and over that it wasn’t real and wasn’t a big deal.

“Because no one could see inside of Facebook they got away with it,” Haugen said. “When my disclosures came out – we had proof. The public had proof.”

Facebook gives the most reach to the most extreme ideas, Haugen said.

“It does that because it makes them money,” she said.

Facebook has not been about family and friends for a long time, Haugen said; that started to change around 2008. Facebook is a company with a fiduciary duty to its shareholders, she said.

“Every quarter they need you to look at more and more content,” Haugen said.

The solution for getting people to spend more time and engage more on Facebook was groups, Haugen said. Groups are designed to be addictive and to create a rabbit hole for users to fall into, she said.

“Groups are the gasoline that is ignited by the algorithm,” Haugen said.

Facebook also had a huge influence on the election, Haugen said. By the time voters got to the ballot box in December, Facebook had already voted first, she said.

QAnon, neo-Nazi, and other hate and misinformation groups flourished on Facebook, Haugen said.

Facebook’s algorithm rewards polarizing content, she said, and government oversight and regulation are needed to ensure the public’s safety is prioritized.

“The main thing I always advocate for is transparency,” Haugen said. “We have to have legislative-supported transparency.”

“Facebook knows how to keep us safe without censoring us,” Haugen said. “But the reason why it doesn’t? It makes more money running the system the way it does today.”