Former Facebook product manager Frances Haugen has been revealed as the source behind tens of thousands of pages of leaked internal company research, which she says show that the company has been negligent in eliminating violence, misinformation and other harmful content from its services, and that it has misled investors about these efforts.
In an interview with The Washington Post, Haugen said that while working at Facebook in the company’s civic integrity division, she realized it was not disclosing important information about the harms of its products to the public and the policymakers tasked with regulation, creating a situation she said posed a threat to democracy.
“Facebook in its current form is dangerous,” she said. “It became necessary to get the public involved.”
For Facebook, the document leak — and the public reveal of the source — represents perhaps the most significant crisis in the company’s history, further deteriorating relationships between the company and Washington politicians. The company is the target of a historic federal antitrust case and is fielding document requests as members of Congress investigate its role in the Jan. 6 riot at the U.S. Capitol.
Widely referred to as a “Facebook whistleblower” responsible for leaking documents behind a Wall Street Journal series, Haugen spoke publicly about her complaint to federal authorities, disclosing her identity for the first time in an interview airing Sunday night on “60 Minutes.”
“There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,” Haugen said in the interview.
A veteran of tech companies including Pinterest, Yelp and Google, Haugen, 37, left Facebook in May after developing serious reservations about the company’s policies, particularly surrounding the events of Jan. 6. Before the 2020 election, Haugen said, Facebook implemented measures to prevent the spread of misinformation, but the company decided to dissolve many of these protections after the election. She said she stopped trusting that her employer was willing to limit growth to improve public safety.
https://www.washingtonpost.com/technology/2021/10/03/facebook-whi