Facebook is making online hate worse, Haugen tells Parliament
Facebook's internal culture "prioritises profitability over its impact on the wider world"
Frances Haugen, a former Facebook employee turned whistle-blower, told MPs that the social network is exacerbating online hate worldwide because its algorithms are designed to promote divisive content.
Ms Haugen, who worked as a product manager on Facebook's civic misinformation team, arrived in London to appear before a parliamentary select committee that is examining plans to regulate social media platforms and make them responsible for the content published on their sites.
"The events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters, because engagement-based ranking does two things: one, it prioritises and amplifies divisive and polarising extreme content, and two, it concentrates it," Haugen told the committee, according to Reuters.
She said Facebook's internal culture prioritises profitability over its impact on the wider world, and that "there is no will at the top to make sure these systems are run in an adequately safe way."
"Until we bring in a counterweight, these things will be operated for the shareholders' interest and not the public interest."
Haugen said Facebook Groups intensify online hate, as its algorithms take people with mainstream interests and push them to the extremes.
"Unquestionably, it's making hate worse," she said.
Haugen added that Facebook's moderation systems are far less effective at catching harmful content in languages other than English. That is a significant problem even in the UK, a linguistically diverse country.
"UK English is sufficiently different that I would be unsurprised if the safety systems that they developed, primarily for American English, would be underenforced in the UK."
Dangerous misinformation in other languages can also reach and radicalise people in the UK who encounter it online.
Haugen said Facebook has a culture that "lionises a start-up ethic," which she believes is irresponsible.
She said she had "no idea" whom within the firm she could flag her concerns to, because acting on them could have posed a risk to growth.
Haugen's testimony was her second appearance before lawmakers this month. She earlier spoke to the US Congress about the dangers Facebook poses, from fuelling political violence to spreading misinformation and harming children.
In her latest testimony, she said that Instagram, which is used by millions of children across the world, may never be safe for pre-teens.
Since quitting Facebook in May, Haugen has provided thousands of Facebook's internal documents to lawmakers and regulators. Some of these, she says, show that the company has misled the public about making "significant" progress against misinformation, hate speech and violence. One document indicates that the social media platform takes action on only 3-5 per cent of hate speech and 0.6 per cent of violence and incitement content.
Ms Haugen's appearance came one day after Monika Bickert, Facebook's VP of content policy, said the technology industry needs stronger regulation to address issues like misinformation, privacy and harmful content.
Bickert, who has worked at Facebook since 2012, believes government regulation can help to establish standards that all companies would be required to meet, enabling people to judge how companies enforce rules on their platforms.
She said Facebook is advocating for democratic governments to set new rules for the internet on areas like elections, data, hate content and privacy, because, it says, digital platforms should not be allowed to make those decisions on their own.
Meanwhile, Facebook founder Mark Zuckerberg has insisted that recent revelations are an attempt to "paint a false picture" of the company.
"It makes a good soundbite to say that we don't solve these impossible tradeoffs because we're just focused on making money, but the reality is these questions are not primarily about our business, but about balancing difficult social values."