Meta’s Oversight Board Recommends Policy Change in Response to False Biden Video

A video clip that emerged online in May 2023 falsely depicted US President Joe Biden inappropriately touching his adult granddaughter’s chest. The doctored footage, often accompanied by the baseless claim that Biden is a “pedophile,” circulated on social media platforms, including Facebook. It was created by maliciously editing genuine footage of President Biden voting in the US midterm elections in October 2022.

Despite being widely recognized as fake, the video was not removed from Facebook because it did not violate Meta’s Manipulated Media policy. As explained in a post by Meta’s Oversight Board in February 2024, that policy applies only under narrow conditions, such as when content has been created or altered using artificial intelligence (AI) or shows people saying things they did not actually say. Because the altered footage of President Biden did not meet these specific criteria, Meta opted to leave it on the platform.

The Oversight Board concluded that Meta’s current Manipulated Media policy is too narrow and fails to effectively combat misinformation and disinformation. Its consultations with experts and public comments showed broad agreement that content altered without AI, often referred to as “cheap fakes,” can be just as misleading as “deep fakes,” so the policy should not prioritize one form of altered content over the other.

The Oversight Board also criticized Meta for splitting the policy across two separate locations, calling for clearer guidelines to avoid user confusion. With these criticisms in mind, the Board recommended that Meta expand the scope of its Manipulated Media policy to include audio and audiovisual content, broaden it to cover content showing people doing things they did not do, and apply it to all altered content regardless of how it was created. Where no other policy violation exists, the Board recommended that Meta label manipulated media as significantly altered and potentially misleading rather than remove it outright.

Moreover, the Board urged Meta to promptly reconsider and revise its policy given the upcoming elections in 2024, emphasizing the need to prevent interference with the right to vote and participate in public affairs. The Oversight Board’s mission is to review Meta’s most challenging content-related decisions on Facebook and Instagram, and it continues to work towards ensuring the platform’s policies effectively address the spread of fake and manipulated content.

In light of this recommendation, Meta will need to seriously re-evaluate and update its current policy on manipulated media to better combat the dissemination of falsified content. As technology continues to advance, the company must adapt its policies accordingly to protect its users from potentially harmful and misleading material.
