Meta Fixes Error That Flooded Instagram Reels With Violent Videos
Meta Platforms said on Thursday it had resolved an error that flooded the personal Reels feeds of Instagram users worldwide with violent and graphic videos.

Meta's moderation policies have come under scrutiny since the company decided last month to scrap its U.S. fact-checking program on Facebook, Instagram and Threads, three of the world's biggest social media platforms with more than 3 billion users globally. In recent years, the company has leaned increasingly on automated moderation tools, a reliance expected to deepen with the shift away from fact-checking in the United States.
- The increased reliance on automation raises concerns about whether companies like Meta can effectively moderate content and keep users safe once human oversight is removed from the process.
- How will this move affect the development of more effective AI-powered moderation tools that balance free speech with user protection, especially in high-stakes contexts such as conflict zones or unfolding genocides?