Preventing the sharing of child sexual abuse material (CSAM) has become a pressing issue for content platforms in recent years. According to the Financial Times, however, the social media platform TikTok is now under investigation in the United States over shortcomings in its moderation of such material.
The Financial Times reported that TikTok is currently the subject of an investigation by the U.S. Department of Homeland Security (DHS), while the U.S. Department of Justice (DoJ) is separately examining whether specific privacy features on the platform have been exploited to share CSAM.
Moreover, the Department of Homeland Security says TikTok has become an uploading platform of choice for abusers because of its large number of young users. Investigations into TikTok-related child exploitation cases also increased sevenfold between 2019 and 2021.
TikTok responded in a statement: "TikTok has zero tolerance for child sexual abuse material. When we become aware of any attempt to post, obtain or distribute [child sexual abuse material], we will remove content, ban accounts and devices, report immediately to NCMEC, and cooperate with law enforcement as necessary."
The company added: "We are firmly committed to the safety and well-being of minors, incorporating teen safety into our policies, enabling privacy and safety settings for teen accounts by default, and restricting features by age."
As for the privacy features under investigation by the U.S. Department of Justice, the Financial Times said abusers typically trade CSAM through private accounts, sharing the passwords among themselves and uploading illegal content using the "Only Me" feature, which makes videos visible only to those logged into the profile.