Whistleblower claims Facebook used by authoritarian regimes like China

Frances Haugen, a former Facebook employee turned whistleblower, testified before the US Senate on Tuesday, claiming that the social media platform is “definitely” being used by “authoritarian or terrorist-based leaders” around the world.

“My team directly worked on tracking Chinese participation on the platform, surveilling, say Uighur populations, in places around the world. You could actually find the Chinese based on them doing these kinds of things,” she said. “We also saw active participation of, say, the Iran government doing espionage on other state actors.”

Despite the national security threat, Haugen said she did not believe Facebook was adequately prepared to monitor and combat this behavior.

“Facebook’s consistent understaffing of the counterespionage information operations and counter terrorism teams is a national security issue, and I’m speaking to other parts of Congress about that … I have strong national security concerns about how Facebook operates today.”

A former data scientist for Facebook, Haugen is pushing the US Congress for new rules that address the concerns she’s raised.

In her testimony before Congress, she said Facebook has long known about the misinformation and hate speech on its platform, as well as its negative impact on young users.

At the hearing, Haugen explained to a Senate Commerce Committee panel how she believes Facebook’s Instagram platform negatively affects children.

“I am here today because I believe that Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said during her opening remarks.

“The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary change because they have put their astronomical profits before people,” she said. “Congressional action is needed. They won’t solve this crisis without your help.”

Haugen said the social media site’s algorithms can quickly steer children away from safe content, such as healthy recipes, toward content about eating disorders. She called on lawmakers to demand more transparency into the company’s algorithms and internal metrics to guide regulation of the company.

“Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or a re-share,” Haugen said. “Those clicks and comments and re-shares aren’t necessarily for your benefit, but because they know other people will produce more content if they get the likes and comments and re-shares.”

Monika Bickert, vice president of content policy at Facebook, said it was “not true” that the platform’s algorithms are designed to push inflammatory content.

“We do the opposite, in fact, and if you look in our transparency center, you can actually see that we demote, meaning we reduce the visibility of, engagement bait and clickbait. And why would we do that? One big reason is the long-term health of our services; we want people to have a good experience,” Bickert said in a television interview.
