
Meta Chooses to ‘Err On the Side of an Adult’ When Uncertain About Images

Facebook is a leader among tech companies in detecting child sexual abuse content, which has exploded on social media and across the internet in recent years. But concerns about mistakenly accusing people of posting illegal imagery have resulted in a policy that could allow photos and videos of abuse to go unreported. Meta, the parent company of Facebook, Instagram, Messenger and WhatsApp, has instructed content moderators for its platforms to “err on the side of an adult” when they are uncertain about the age of a person in a photo or video, according to a corporate training document.

Author: Doug Isenberg
