Algorithms shouldn’t be protected by Section 230, Facebook whistleblower tells Senate

Former Facebook employee Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing on October 5, 2021. Haugen left Facebook in May and provided internal company documents about Facebook to journalists and others, alleging that Facebook consistently chooses profit over safety. (credit: Drew Angerer/Getty Images)

Facebook whistleblower Frances Haugen testified before a Senate panel yesterday, recommending a slate of changes to rein in the company, including a Section 230 overhaul that would hold the social media giant responsible for its algorithms that promote content based on the engagement it receives in users’ news feeds.

“If we had appropriate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” Haugen said. “Because it is causing teenagers to be exposed to more anorexia content, it is pulling families apart, and in places like Ethiopia, it’s literally fanning ethnic violence.”

Haugen made sure to distinguish between user-generated content and Facebook’s algorithms, which prioritize the content in news feeds and drive engagement. She suggested that Facebook should not be responsible for content that users post on its platforms but that it should be held liable once its algorithms begin making decisions about which content people see.