Meta Shifts Content Moderation Approach, Emphasizes User Reporting

Edited by: Veronika Nazarova

In a significant policy shift, Meta has announced that it will discontinue third-party fact-checking on its platforms, Facebook and Instagram, in the United States. Instead, users will be able to report misleading content directly, a model inspired by the system pioneered on Elon Musk's platform X. The change reflects Meta's response to criticism of its previous moderation efforts, which CEO Mark Zuckerberg described as ineffective and frustrating for users.

The new user-driven system, dubbed 'Community Notes', allows individuals to flag false or misleading statements and provide additional context. Meta plans to roll out this approach to other countries following the U.S. implementation. Zuckerberg cited the need to enhance free expression as a driving factor behind this decision, claiming that the existing moderation framework often hindered open dialogue.

Furthermore, Zuckerberg indicated a relaxation of restrictions on content related to immigration, gender identity, and other sensitive topics, asserting that many such discussions are part of mainstream discourse. He also acknowledged the limitations of automated filters, particularly in recognizing context and nuance, noting that even a small error rate in moderation decisions can affect millions of users.

This pivot towards a more lenient content moderation policy aligns with the incoming U.S. administration's stance, reflecting a broader ideological shift within Meta's leadership. Critics, however, warn that this could lead to increased tolerance of hate speech and harmful content on the platform.
