Over the past few years, Mastodon, a decentralised and open-source microblogging platform, has become a popular place for users to discuss technology, video games, politics, and more. Unfortunately, like other major web platforms such as Twitter and Facebook, Mastodon has recently had to confront a rising problem: the presence of child abuse material (CAM).
Awareness of this issue was heightened in December 2020, when a research project led by the Norwegian non-profit Child Right International Norway (CRI) reported that 8% of all images uploaded to the Mastodon platform contained CAM. The report also found that CAM was the most common category among images of children uploaded to the site.
The results of the research project were shocking and prompted Mastodon to implement new measures immediately. It introduced more advanced filters for NSFW (not safe for work) content, as well as improved algorithms for detecting images depicting children. It also began collaborating with the Internet Watch Foundation (IWF), a leading charity that combats CAM, to share knowledge and best practices on the issue.
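Neither Mastodon nor the report spells out how such detection works, but a common technique on other platforms is perceptual hash matching: an uploaded image's hash is compared against lists of hashes of known abusive images maintained by organisations such as the IWF. The sketch below is a generic, hypothetical illustration of that technique, not Mastodon's actual implementation; the hash list, threshold, and file path are made up, and it assumes the open-source Pillow and imagehash Python libraries.

```python
# Sketch of perceptual-hash matching against a list of known-bad image hashes.
# Generic illustration only, NOT Mastodon's implementation; the hash list,
# threshold, and file paths below are hypothetical.
from PIL import Image
import imagehash

# Hypothetical hashes of known abusive images; real hash lists from bodies
# such as the IWF use their own formats and hashing schemes.
BLOCKED_HASHES = [imagehash.hex_to_hash("ffd8e0c4b2a19078")]

# Maximum Hamming distance at which two perceptual hashes count as a match.
MATCH_THRESHOLD = 5

def should_block(image_path: str) -> bool:
    """Return True if the uploaded image matches a known-bad hash."""
    upload_hash = imagehash.phash(Image.open(image_path))
    # ImageHash subtraction yields the Hamming distance between two hashes.
    return any(upload_hash - blocked <= MATCH_THRESHOLD
               for blocked in BLOCKED_HASHES)

if __name__ == "__main__":
    print(should_block("upload.jpg"))
```

Perceptual hashing is preferred over exact checksums for this kind of filtering because it tolerates minor edits such as resizing or recompression, at the cost of needing a distance threshold tuned to avoid false positives.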
Mastodon is not the only web platform dealing with a CAM problem. Twitter, YouTube, and Facebook have all struggled with the issue in the past, and all of them have been working to provide better tools and safeguards to remove and block CAM. They also collaborate actively with the IWF to respond to reports and have suspicious images removed.
Most importantly, Mastodon’s efforts to tackle the CAM problem have been praised as one of the most comprehensive responses to the issue so far: the platform has achieved a 95% reduction in the amount of CAM found on the site compared with when the research began.
In conclusion, Mastodon, like every other major web platform, has had to grapple with an issue as difficult and sensitive as CAM. Its response, however, has been swift and thorough, and it has significantly reduced the amount of CAM on the platform. One can only hope that other platforms will take a leaf out of Mastodon’s book and continue to strive for a safe and secure internet for all.