As previously reported by ReadWrite, Telegram CEO Pavel Durov was arrested in France in August on charges related to harmful and illegal content allegedly spread on his messaging service. Now Telegram has reportedly removed approximately 15 million illicit groups from its platform, using artificial intelligence to address the issue. The company has faced unprecedented pressure to clean up its platform this year.
Although Durov remains under stringent restrictions following his initial court appearance, the messaging service says it has made significant progress in tackling the issue, claiming to have eliminated more than 15 million groups and channels involved in fraud and other unlawful activity.
Telegram announced a crackdown in September and now says it has deleted 15.4 million groups and channels related to harmful content such as fraud and terrorism in 2024, claiming that this work was “enhanced with cutting-edge AI moderation tools.”
According to a message on Durov’s Telegram channel, the statement is part of Telegram’s newly launched moderation website, which aims to better explain its moderation efforts to the public. That page shows a significant rise in enforcement since Durov’s arrest in August.
How Telegram utilized AI to delete millions of suspected unlawful groups
Telegram attributed this progress to moderation “enhanced with cutting-edge AI moderation tools,” describing it as a step forward in reducing unlawful content. This follows the crackdown announced in September, when Durov stated that the company intended to meet government demands for stronger content regulation.
The new moderation page represents Telegram’s push toward greater transparency in its procedures. In a Telegram post, Durov emphasized the company’s commitment to combating unlawful operations.
He disclosed that the moderation staff has been working tirelessly behind the scenes for the past few months, eliminating “millions of pieces of content that violate its Terms of Service, including incitement to violence, sharing child abuse materials, and trading illegal goods.”
Durov has vowed to provide users with real-time updates on the moderation team’s work. According to Telegram’s new moderation page, the platform has dramatically increased enforcement since Durov’s arrest, and the team has clearly been active. The removal of illegitimate accounts has been ongoing since 2015, but the numbers are staggering: more than 15.4 million illegal groups and channels were blocked in 2024 alone.
This year, Telegram has also stepped up its fight against Child Sexual Abuse Material (CSAM), removing 703,809 groups and channels. Along with user reports and proactive moderation, Telegram works with third-party organizations to prevent CSAM, resulting in thousands of rapid bans.
The Internet Watch Foundation, the National Centre for Missing and Exploited Children, the Canadian Centre for Child Protection, and Stichting Offlimits have all made significant contributions to these initiatives.
Its ongoing moderation efforts
The platform’s efforts against violence and terrorist propaganda are not new. Since 2016, Telegram has published daily updates on these measures, earning recognition from Europol. Since 2022, it has collaborated with multiple groups to blacklist 100 million pieces of terrorist content, with 129,099 blocked in 2024 alone.
Meanwhile, Durov’s legal case in France remains unresolved. While he is currently out on €5 million ($5.3 million) bail, the platform is keen to continue its cleanup operations.