TikTok has asked other social media platforms to join it in a partnership to better combat content depicting self-harm and suicide, after clips from a Facebook livestream of a man taking his own life circulated for weeks on TikTok, Facebook, Instagram, YouTube and other platforms earlier this month.
On TikTok, where an estimated 18 million daily users are 14 or younger, teens and their parents complained that the videos were being recommended on the “For You” discovery page, with users warning each other to stay off the app until the problem was fixed. In response, TikTok interim chief Vanessa Pappas has written to the chief executives of nine social and content platforms—Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit—asking them to create a partnership through which violent and graphic content can be better addressed, the company said in a statement. “What we are proposing is that, the same way these companies already work together around child sexual imagery and terrorist-related content, we should now establish a...
According to TikTok’s Theo Bertram, who admitted the platform needs to “do better” following what he described as “a coordinated effort by bad actors to spread this video across the internet and platforms,” TikTok will also make changes to its machine learning and emergency systems, in addition to tweaking how its algorithms detect violent content and coordinate with moderators for quicker takedowns.
“We also fully agree that [TikTok], along with every single other major social platform, must do better,” Steen said in a statement Tuesday. “It’s the least these companies can do to make sure the content we are shown on their platforms is safe and responsible.”

If you or someone you know is struggling, call the National Suicide Prevention Lifeline at 800-273-TALK or text the Crisis Text Line at 741-741.