By Mike Flunker, Editor-in-Chief
When Facebook and its services went down for hours in October, the world was left reeling. The outage was later traced to a faulty configuration update that took Facebook’s servers offline, a problem that was initially unsolvable because the internal tools needed to fix it were also offline. Facebook’s company-wide practice of keeping everything internal caused its worst outage yet.
But Facebook hasn’t been able to keep everything internal. Documents from Facebook whistleblower Frances Haugen show the depth of Facebook’s own extensive internal research, revealing the negative impacts it knows it has on users, how fast inflammatory content spreads, and its inability to squash mis- and disinformation.
On Oct. 28, Facebook announced it was bringing all of its apps and technologies under the new brand “Meta,” still headed by Mark Zuckerberg. To many critics, the announcement seemed ill-timed, as if Zuckerberg were trying to distance himself from the Facebook name in the midst of controversy. Zuckerberg told The Verge, a technology blog, that the announcement had been planned six months in advance.
It is this editor’s opinion that everyone should be aware of the problems that reside in Facebook as a company and platform, so informed decisions can be made regarding its use in everyday life. Rebranding or moving in new directions doesn’t solve the underlying issues revealed in this document leak.
Facebook protects the elite on the platform
Facebook is supposed to be an equal meeting ground where everyone starts at zero friends and zero followers. Zuckerberg insists publicly that each of Facebook’s 3 billion users can be on equal footing with the most famous, influential, and popular celebrities and public figures. However, internal documents show that many influential users, including members of the Trump family, Sen. Elizabeth Warren, and dog influencer Doug the Pug, are shielded from the platform’s normal enforcement actions.
The system, known as “XCheck,” protects several million VIP users, while all other users are held to the standard rules. When an average Facebook user posts something that violates the guidelines, the post is automatically removed once another user reports it. But the protected elite users can post misleading or inflammatory content with no risk of removal until a Facebook employee reviews it. Internal documents show these reviews are few and far between, and violating posts often remain up even after review.
Facebook’s own internal reviews describe this system as “a breach of trust,” noting that “unlike the rest of our users, these people can violate our standards without consequences.” As recently as June 2021, Facebook told its own Oversight Board in writing that XCheck was used for only a small number of decisions. However, given the growing number of users who benefit from the XCheck system, it has far-reaching consequences.
The system was designed to minimize the public relations backlash that has occurred when content posted by these VIP users is removed or moderated. Facebook fears the political fallout from making the wrong call on high-profile posts, according to internal documents presented by The Wall Street Journal. XCheck now shields these accounts from any enforcement action taken without additional review.
In 2019, Brazilian soccer star Neymar da Silva Santos Jr. was accused of rape by an unidentified woman. He later posted videos on Facebook and Instagram accusing the woman of extortion. The videos included WhatsApp chats that revealed the woman’s full name as well as nude photos of her, according to a report by The Wall Street Journal.
Facebook’s policy on this is straightforward: “non-consensual intimate imagery” is immediately deleted. But Neymar had more than 150 million followers on Instagram and Facebook and was protected by XCheck. The videos remained on social media and were reshared countless times over the course of the day before being removed. After additional review by Facebook, Neymar even got to keep his account, a deviation from Facebook’s own policy of deleting accounts that share non-consensual nude photos.
Facebook’s algorithm rewards anger
Over the last few years, Facebook has made several changes to its algorithms to better promote user engagement. The primary goal of the changes was to incentivize interactions between users and their friends and family, instead of passive consumption of professionally made content on the platform. Likes, comments, and shares make up the core of how the platform tracks engagement, but internal documents showed engagement declining, and Facebook didn’t know why. According to the Wall Street Journal’s Facebook Files, an investigative review of leaked Facebook documents, there was fear that users would stop using Facebook altogether.
On the surface, the changes worked. More users were interacting with each other and their close friends and family, while interactions decreased with professionally made content from sources like Buzzfeed, ABC News, and Breitbart. But as users adjusted to the new algorithm, sensationalized content, which often played into anger and hate, increased. More users engaged with this kind of content, so the algorithm pushed it to even more people. The Wall Street Journal reported that Zuckerberg resisted additional changes to the algorithm that might have reduced engagement with any kind of content.
James Barnes, a former Facebook employee, said Facebook had hoped that giving priority to user engagement in the News Feed would bring people closer together. But the platform had grown so complex the company didn’t understand how the change might backfire.
These leaked documents and further coverage by journalists from all sectors continue to reveal some of the concerning issues that platforms like Facebook propagate. It is this editor’s opinion that all of these findings should be investigated and the underlying problems addressed before Facebook takes its next step into social media virtual reality.