
Meta's automated tools removed Israel-Hamas war content that didn't break its rules

The Oversight Board confirmed in its report that Meta's automated tools are to blame.


Meta's Oversight Board has published its decision in its first-ever expedited review, a process that took just 12 days rather than the usual weeks, focusing on content surrounding the Israel-Hamas war. The Board overturned the company's original decision to remove two pieces of content, one from each side of the conflict. Since it supported Meta's subsequent move to restore the posts on Facebook and Instagram, no further action is expected from the company. However, the Board's review cast a spotlight on how Meta's reliance on automated tools can prevent people from sharing important information. In this particular case, the Board noted that "it increased the likelihood of removing valuable posts informing the world about human suffering on both sides of the conflict in the Middle East."

For its first expedited review, the Oversight Board chose to investigate two appeals that represent what users in the affected region have been submitting since the October 7 attacks. One is a video posted on Facebook of a woman begging her captors not to kill her as she was taken hostage during the initial terrorist attacks on Israel. The other, posted on Instagram, shows the aftermath of a strike on the Al-Shifa Hospital in Gaza during Israel's ground offensive, including dead and injured Palestinians, children among them.

The Board's review found that the two videos were mistakenly removed after Meta adjusted its automated tools to police content more aggressively following the October 7 attacks. For instance, the Al-Shifa Hospital video's takedown and the rejection of a user appeal to reinstate it were both made without human intervention. Both videos were later restored with warning screens stating that such content is allowed for the purpose of news reporting and raising awareness. The Board commented that Meta "should have moved more quickly to adapt its policy given the fast-moving circumstances, and the high costs to freedom and access to information for removing this kind of content…" It also raised concerns that the company's rapidly shifting approach to moderation could give the appearance of arbitrariness and call its policies into question.

That said, the Board found that Meta demoted the content it reinstated with warning screens, excluding the videos from being recommended to other Facebook and Instagram users even after the company determined that they were intended to raise awareness. Notably, a number of users reported being shadowbanned in October after posting about the conditions in Gaza.

The Board also called attention to the fact that between October 20 and November 16, Meta only allowed hostage-taking content from the October 7 attacks to be posted by users on its cross-check lists. These lists are typically made up of high-profile users exempted from the company's automated moderation system. The Board said Meta's decision highlights its concerns about the program, specifically its "unequal treatment of users [and] lack of transparent criteria for inclusion." It said that the company needs "to ensure greater representation of users whose content is likely to be important from a human-rights perspective on Meta's cross-check lists."

“We welcome the Oversight Board’s decision today on this case. Both expression and safety are important to us and the people who use our services. The board overturned Meta’s original decision to take this content down but approved of the subsequent decision to restore the content with a warning screen. Meta previously reinstated this content so no further action will be taken on it,” the company told Engadget in a statement. “As explained in our Help Center, some categories of content are not eligible for recommendations and the board disagrees with Meta barring the content in this case from recommendation surfaces. There will be no further updates to this case, as the board did not make any recommendations as part of their decision.”