According to its oversight board, Facebook failed to disclose vital details about its “Cross-Check” program, which reportedly exempted millions of VIP users from the platform’s usual content moderation procedures.
In a report released on Thursday, the oversight board stated that the internet giant “has not been entirely transparent on Cross-Check.” “On some occasions, Facebook failed to submit essential information to the Board, and on other occasions, the information it did offer was incomplete,” the report continued.
Facebook uses Cross-Check to evaluate content decisions involving high-profile users such as politicians, celebrities, and journalists. According to the Wall Street Journal, the program had grown to include 5.8 million users by 2020.
The Facebook Oversight Board is a group of specialists in fields such as human rights and freedom of expression who are hired by the company but operate independently. Because it allows users to appeal content decisions made on Facebook-owned platforms, the Oversight Board is often described as a kind of Facebook Supreme Court.
The Wall Street Journal, citing internal company documents, reported last month that Cross-Check shields VIPs from Facebook’s (FB) routine enforcement processes. In practice, this means posts that break the company’s standards aren’t always removed right away, and certain users are exempt from disciplinary action.
According to the Journal, “at times, the documents suggest, [Cross-Check] has shielded public personalities whose posts involve abuse or encouragement to violence, infractions that would normally result in sanctions for regular users.”
Facebook spokesman Andy Stone told the Journal in a written statement that criticism of the system was fair, but that it was “intended for an important reason: to establish an additional step so we can accurately enforce standards on content that may require more knowledge.”
Despite the program’s size, Facebook did not mention Cross-Check when it asked the oversight board to review its decision to bar former President Donald Trump from its platform. Facebook acknowledged the program only when the oversight board asked whether Trump’s page or account had been subject to normal content moderation processes.
According to the board, Facebook said the program applied only to “a small number of choices,” a characterization the company later admitted was inaccurate. Despite a request from the board, Facebook also provided “no significant transparency on the methodology for accounts or pages being selected for inclusion in Cross-Check.”
After receiving a request from Facebook, the board agreed on Thursday to evaluate Cross-Check and provide recommendations on how it could be improved.
A Facebook spokesperson commended the board for its ongoing efforts and for releasing the transparency report.
“We feel the board’s work has had an impact,” the spokesperson said in a statement, “which is why we asked the board for input into our Cross-Check system, and we will endeavor to be clearer in our explanations to them going forward.”