The parent company of Facebook and Instagram, Meta, is looking into whether its platforms treat users differently based on race, after years of criticism particularly from Black users and its own employees about racial bias.
"There are a lot of members of systemically and historically marginalized communities who feel that their experience on our platforms is different," said Roy Austin Jr., vice president of civil rights at Meta, formerly known as Facebook. https://www.npr.org/2021/11/18/1056916140/facebook-to-study-black-users-experience
That includes Black users who say their posts about racism have been taken down for violating the company's hate speech rules. https://www.usatoday.com/story/news/2019/04/24/facebook-while-black-zucked-users-say-they-get-blocked-racism-discussion/2859593002/. Facebook also apologized in September after a flaw in its artificial intelligence software led to a video of Black men being labeled as "primates." https://www.npr.org/2021/09/04/1034368231/facebook-apologizes-ai-labels-black-men-primates-racial-bias.
Meta is starting by tracking the race of its platforms' users, which Austin described as "a huge step to moving from the anecdotal to the data driven." He said the work would allow the company to understand how people's experiences on Facebook may differ by race, a first step toward addressing any problems.
Facebook’s race-blind policies around hate speech came at the expense of Black users, new documents obtained by The Washington Post show. Researchers proposed a fix to the biased algorithm, but one internal document predicted pushback from “conservative partners.”
Last year, researchers at Facebook showed executives an example of the kind of hate speech circulating on the social network: an actual post featuring an image of four female Democratic lawmakers known collectively as “The Squad.” The poster, whose name was scrubbed out for privacy, referred to the women, two of whom are Muslim, as “swami rag heads.” A comment from another person used even more vulgar language, referring to the four women of color as “black c---s,” according to internal company documents exclusively obtained by The Washington Post.
The trove of documents shows how Facebook CEO Mark Zuckerberg has, at times, contradicted, downplayed or failed to disclose company findings on the impact of its products and platforms.
READ MORE: https://www.washingtonpost.com/technology/2021/11/21/facebook-algorithm-biased-race/