Telegram has quietly updated its policy to allow users to report private chats to its moderators following the arrest of founder Pavel Durov in France over “crimes committed by third parties” on the platform.
The messaging app, which serves nearly 1 billion monthly active users, has long maintained a reputation for minimal supervision of user interactions.
On Thursday night, Telegram began implementing changes to its moderation policy. “All Telegram apps have ‘Report’ buttons that let you flag illegal content for our moderators — in just a few taps,” the company states on its updated frequently-asked-questions page.
The platform has also provided an email address for automated takedown requests, instructing users to include links to content requiring moderator attention.
It’s unclear whether, and how, this change affects Telegram’s ability to respond to requests from law enforcement agencies. The company has previously cooperated with court orders to share some information about its users.
TechCrunch has reached out to Telegram for comment.
The Dubai-headquartered company has additionally edited its FAQ page, removing two sentences that previously emphasized its privacy stance on private chats. The earlier version had stated: “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them.”
These policy changes follow Durov’s arrest by French authorities in connection with an investigation into crimes related to child sexual abuse images, drug trafficking, and fraudulent transactions.
Responding to his arrest, Durov posted on his Telegram channel, criticizing the action: “Using laws from the pre-smartphone era to charge a CEO with crimes committed by third parties on the platform he manages is a misguided approach.”
He argued that the established practice for countries dissatisfied with an internet service is to initiate legal action against the service itself, rather than its management.
Source: TechCrunch