Background
In mid-September 2021, ''The Wall Street Journal'' began publishing articles on Facebook based on internal documents of unknown provenance. Revelations included reporting of special allowances on posts from high-profile users ("XCheck"), subdued responses to flagged information on human traffickers and drug cartels, and a shareholder lawsuit concerning the cost of Facebook (now Meta) CEO Mark Zuckerberg's personal liability protection in settling the Cambridge Analytica data scandal.
The reports
Beginning on October 22, 2021, a group of news outlets began publishing articles based on documents provided by whistleblower Frances Haugen's lawyers, collectively referred to as ''The Facebook Papers''.
Instagram's harmful effects on teenagers
The Files show that Facebook (now Meta) had been conducting internal research into how Instagram affects its young users for the three years preceding the leak. The findings pointed to Instagram being harmful to a large portion of young users, with teenage girls among the most harmed. Researchers within the company reported that "we make body image issues worse for one in three teenage girls". Internal research also found that teen boys were affected by negative social comparison, reaching 14% of boys in the US in 2019. The research concluded that Instagram contributes to problems specific to its use, such as social comparison among teens.
Violence in developing countries
An internal memo seen by ''The Washington Post'' revealed that Facebook had been aware that its platform was being used to incite violence in developing countries.
Controlling falsehoods about the U.S. elections
''The New York Times'' points to internal discussions in which employees raised concerns that Facebook was spreading content about the QAnon conspiracy theory more than a year before the 2020 US presidential election.
Promoting anger-provoking posts
In 2015, in addition to the Like button on posts, Facebook introduced a set of other emotional reaction options: love, haha, yay, wow, sad and angry. ''The Washington Post'' reported that for three years, Facebook's algorithms promoted posts that received the 'angry' reaction from its users, based on internal analysis showing that such posts led to five times more engagement than posts with regular likes. Years later, Facebook's researchers pointed out that posts with 'angry' reactions were much more likely to be toxic, polarizing, fake or low quality. In 2018, Facebook overhauled its News Feed, implementing a new algorithm which favored "Meaningful Social Interactions" or "MSI". The new algorithm increased the weight of reshared material, a move which aimed to "reverse the decline in comments and encourage more original posting". While the algorithm succeeded in that aim, it had side effects: users reported that feed quality decreased, and anger on the site increased. Leaked documents reveal that employees proposed several changes to fix some of these issues. However, the documents claim that Mark Zuckerberg resisted some of the proposed fixes out of concern that they would reduce user engagement.
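The reported weighting can be illustrated with a minimal sketch; all names, weights and the scoring function below are hypothetical stand-ins for the relative five-to-one weighting described in the reporting, not Facebook's actual ranking code:
<syntaxhighlight lang="python">
# Hypothetical illustration of reaction-weighted scoring; only the
# five-to-one ratio of emoji reactions to likes comes from the reporting.
REACTION_WEIGHTS = {
    "like": 1,   # baseline weight
    "love": 5,
    "haha": 5,
    "wow": 5,
    "sad": 5,
    "angry": 5,  # weighted five times a like, per the leaked analysis
}

def engagement_score(reactions):
    """Sum each reaction count scaled by its per-type weight."""
    return sum(REACTION_WEIGHTS.get(kind, 1) * count
               for kind, count in reactions.items())

# A post drawing 'angry' reactions can outrank one with far more likes:
print(engagement_score({"like": 100}))              # -> 100
print(engagement_score({"angry": 30, "like": 10}))  # -> 160
</syntaxhighlight>
Under any such scheme, a ranking system that maximizes this score systematically favors anger-provoking posts, which is the dynamic the researchers flagged.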
Employee dissatisfaction
''Politico'' quotes several Facebook staff expressing concerns about the company's willingness and ability to respond to harms caused by the platform. A 2020 post reads: "It's not normal for a large number of people in the 'make the site safe' team to leave saying, 'hey, we're actively making the world worse FYI.' Every time this gets raised it gets shrugged off with 'hey people change jobs all the time' but this is NOT normal."
Apple's threat to remove Facebook and Instagram
In 2019, following concerns about Facebook and Instagram being used to trade maids in the Middle East, Apple threatened to remove both iOS apps from the App Store.
XCheck
The documents revealed a private program known as "XCheck" or "cross-check" that Facebook has employed to whitelist posts from users deemed "high-profile". The system began as a quality-control measure but has since grown to protect "millions of VIP users from the company's normal enforcement process". XCheck has led to celebrities and other public figures being exempt from the punishment that an average Facebook user would receive for violating policies. In 2019, footballer Neymar posted nude photos of a woman who had accused him of rape, and the photos were left up for more than a day. According to Facebook's internal documents cited by ''The Wall Street Journal'', "XCheck grew to include at least 5.8 million users in 2020". The goal of XCheck was "to never publicly tangle with anyone who is influential enough to do you harm".
Collaboration on censorship with the government of Vietnam
In 2020, Vietnam's communist government threatened to shut down Facebook if the company did not cooperate in censoring political content in the country, then Facebook's (now Meta) biggest market in the region. The decision to comply was personally approved by Mark Zuckerberg.
Suppression of harmful political movements on its platform
In 2021, Facebook developed a new strategy for addressing harmful content on its site, implementing measures designed to reduce and suppress the spread of movements deemed hateful. According to a senior security official at Facebook, the company "would seek to disrupt on-platform movements only if there was compelling evidence that they were the product of tightly knit circles of users connected to real-world violence or other harm and committed to violating Facebook's rules". The measures included reducing the promotion of such movements' posts within users' News Feeds and not notifying users of new posts from their pages. Specific groups highlighted as being affected by Facebook's social-harm policy include the Patriot Party, previously linked to the Capitol attack, as well as a newer German conspiracy group known as Querdenken, which had been placed under surveillance by German intelligence after protests it organized repeatedly "resulted in violence and injuries to the police".
Facebook's AI concern
According to ''The Wall Street Journal'', documents show that in 2019, Facebook reduced the time human reviewers spent on hate-speech complaints, shifting towards a stronger dependence on its artificial intelligence systems to regulate the matter. However, internal documents from employees claim that the AI has been largely unsuccessful, having trouble detecting videos of car crashes and cockfighting, as well as understanding hate speech in foreign languages. Engineers and researchers within Facebook have estimated that its AI has only been able to detect and remove 0.6% of "all content that violated Facebook's policies against violence and incitement".
''The Wall Street Journal'' podcast
For ''The Facebook Files'' series of reports, ''The Wall Street Journal'' produced a podcast on its ''The Journal'' channel, divided into eight episodes:
* Part 1: The Whitelist
* Part 2: 'We Make Body Image Issues Worse'
* Part 3: 'This Shouldn't Happen on Facebook'
* Part 4: The Outrage Algorithm
* Part 5: The Push To Attract Younger Users
* Part 6: The Whistleblower
* Part 7: The AI Challenge
* Part 8: A New Enforcement Strategy
Facebook's response
In the Q3 2021 earnings call, Facebook CEO Mark Zuckerberg characterized the reporting as a coordinated effort to selectively use leaked documents to paint a false picture of the company.
Lobbying
In December 2021, ''The Wall Street Journal'' reported on Meta's lobbying efforts to divide US lawmakers and "muddy the waters" in Congress, in order to hinder regulation following the 2021 whistleblower leaks. Facebook's lobbyist team in Washington suggested to Republican lawmakers that the whistleblower "was trying to help Democrats", while telling Democratic staffers that Republicans "were focused on the company's decision to ban expressions of support for Kyle Rittenhouse", the newspaper reported. According to the article, the company's goal was to "muddy the waters, divide lawmakers along partisan lines and forestall a cross-party alliance" against Facebook (now Meta) in Congress.
See also
* Criticism of Facebook
* Comparison of user features of messaging platforms
* Instagram's impact on people
* Problematic social media use
References
Further reading
External links
* (''The Facebook Files'', ''The Wall Street Journal'')
{{Facebook navbox}}
[[Category:News leaks]]
[[Category:Facebook criticisms and controversies]]
[[Category:Corporate scandals]]
[[Category:2021 scandals]]