In its efforts to curb the spread of harmful content on its platforms, Facebook revealed that it removed a total of 11.6 million pieces of content related to child nudity and child sexual exploitation between July and September 2019.
The social network is not new to this issue. It has been a member of the Internet Watch Foundation, which hunts for this kind of content, and since 2011 the company has used sophisticated artificial intelligence systems to find and take down such material. Even so, images of child sexual abuse keep appearing on its platforms.
The report also follows the public outcry led by Ian Russell, the father of 14-year-old Molly Russell, who committed suicide in 2017 after filling her Instagram account with large amounts of graphic material about self-harm and suicide.
Facebook vice president Guy Rosen said in a blog post:
“We remove content that depicts or encourages suicide or self-injury, including certain graphic imagery and real-time depictions that experts tell us might lead others to engage in similar behaviour. We place a sensitivity screen over content that doesn’t violate our policies but may be upsetting to some, including things like healed cuts or other non-graphic self-injury imagery in a context of recovery.”
The figures in Facebook’s report reveal that between July and September 2019:
- Facebook removed 11.6 million pieces of content related to child nudity and child sexual exploitation from its main platform and 754,000 from Instagram.
- The company removed 2.5 million pieces of content related to suicide and self-harm from Facebook and 845,000 from Instagram.
- Content related to drug sales was removed from both platforms: 4.4 million pieces from Facebook and 1.5 million from Instagram.
- 133,300 pieces of terrorist-related content were removed from Instagram.
- 99% of the content related to al-Qaeda, the Islamic State and their affiliates was removed from Facebook.
However, the social network’s decision to extend the end-to-end encryption already used on Facebook-owned WhatsApp to Messenger and Instagram could hamper future efforts to clamp down on harmful content. Even Facebook CEO Mark Zuckerberg has acknowledged that the trade-off might benefit child sex abusers and other criminals.