Whistleblower reveals the ‘Facebook Papers’ to protect America’s youth
November 12, 2021
Ex-Facebook product manager Frances Haugen came forward with the “Facebook Papers,” a trove of internal Facebook documents and research, in October 2021. As a product manager on the company’s civic integrity team, Haugen was specifically responsible for ensuring that the platform did not undermine American democracy. National media outlets dubbed Haugen a “whistleblower” because she believed Facebook was prioritizing its profits over the safety and privacy of its users. She further emphasized how little of Facebook’s internal information was accessible to the public.
The Securities and Exchange Commission (SEC) was the first to receive the “Facebook Papers.” Federal officials and various state attorneys general also gained access to the papers. With this flood of information, American media outlets did not hesitate to report every detail they could find, and Facebook was about to be set ablaze in the press.
After the “Facebook Papers” were revealed, Facebook formally changed its corporate name to “Meta” on Oct. 28. The name change aimed to rebrand the social media company and distance it from the negative press that followed Haugen’s disclosures. Facebook CEO Mark Zuckerberg claims the new name reflects the company’s desire to innovate and expand into broader spheres, namely the metaverse: a shared, three-dimensional online space that is anticipated to become the way people interact in the future.
As the “Facebook Papers” went public, several problems came to light. The most pressing issue for America’s youth was the company’s failure to sufficiently moderate content for its users.
Through Haugen’s disclosure of the documents, it became known that Meta had not properly filtered sensitive content away from impressionable teen users. For instance, Instagram, a platform owned by Meta, had been promoting inadvisable weight-loss content to users dealing with eating disorders.
One may wonder whether it is Meta’s responsibility to censor information, or whether users who choose to use the platform do so at their own discretion, accepting the subsequent consequences.
Since Meta is a privately run platform, it can create its own rules that users must follow. Under such terms, Meta can choose to filter content that does not align with its company values. Meta’s censorship policy, the “Facebook Community Standards,” explicitly states which content is subject to moderation. The list includes the broad category of safety, which covers content relating to suicide and self-injury; content that intensifies eating disorders falls under this category. The platform’s stated goal remains to create a safe online community for its users.
Meta is a complex medium that serves billions of people across the globe, but it has been tainted by the leak of the “Facebook Papers.” New information concerning both the company and Frances Haugen continues to be released, leaving no clear end to the affair in sight. Meta faces a long process of reform before the social media giant can put an end to this dark era.
Meta has steered young teenage girls toward damaging weight-loss content. Through its suggested search prompts, Instagram tailors the in-app experience to align with the interests of its users.
However, users struggling with eating disorders have been led to content that further intensified their conditions and caused them to feel worse about themselves.
Teenage girls can become locked in a cycle of continuously scrolling through supposedly “helpful” weight-loss content that only further harms their mental health.
Meta knows from its own research that this is a standing issue: about 13% of teenage girls report that Instagram worsens thoughts of suicide and self-injury, 17% report that the app makes eating issues worse, and one in three teen girls feel that the platform worsens their body image issues.
There is a third-party fact-checking system in place. However, this system is not without its shortcomings. It works to remove content that causes “imminent real-world harm” and tries to filter out information that fails to align with the values of the United States.
According to the company, the exposure was unintentional, and it has worked to resolve the issue. Meta’s stated vision is transparency with its users, and it does not aim to hide information.
In 2019, the company introduced restrictions on diet products and cosmetic surgery content. Prices and prompts to purchase such products and services were hidden from users under the age of 18.
The company also bans “pro-ana” content (material that encourages eating disorders).
However, the company is struggling to craft regulations that distinguish between advertisements for legitimate, healthy weight-loss and fitness content and those promoting idealized, rapid, and harmful weight loss.