Study: Most major social media platforms fail to moderate suicide and self-harm content

Most major platforms appear to be substantially failing to proactively identify and remove harmful material such as suicide and self-harm content.

This is according to a study by the Molly Rose Foundation, which analysed over 12 million content moderation decisions taken by six major platforms between September 2023 and April 2024: Instagram, Facebook, TikTok, Pinterest, Snapchat and X.

Of the 12 million decisions, suicide and self-harm content accounted for a relatively small proportion (2%) of all decisions made during this period.

The largest volume of reports related to illegal content (33%), unsafe or illegal products (23%), and pornographic or sexualised content (18%), said the Molly Rose Foundation.

Of these, the vast majority of content moderation decisions relating to suicide or self-harm content were taken by just two social media platforms: Pinterest (74%) and TikTok (24%). Facebook and Instagram made up only around 1%, while X and Snapchat performed the lowest at 0.14% and 0.04% respectively.

As such, there is a "clear lack of investment and commitment from Meta's platforms to adequately target and make progress on violative suicide and self-harm content", it said. 

In addition, the research found that there are significant inconsistencies and shortcomings in how major platforms respond to suicide and self-harm content on their services.

Major platforms opted to take more than one measure to restrict content in less than 0.2% of decisions. For example, while TikTok detected almost three million items of suicide and self-harm content, it suspended only two accounts.

Meanwhile, only 18% of Instagram’s content moderation decisions related to image-based posts, with a further 2% related to video-based content. This is despite research that shows that Instagram’s short-form video product, Reels, has the highest risk profile. 

Similarly, only 10% of TikTok’s decisions related to video content, with just 4% relating to images. Over 1,500 of TikTok's decisions related to audio content.

"While this is a very small proportion of the platform’s overall reports, the risk profile of audio-based content is under-researched and often overlooked," said Molly Rose Foundation. 

"Our analysis suggests that audio clips can be used to promote and glorify suicidality and self-harming behaviours, including through the use of song lyrics," it added. 

It added that some platforms also appear to act too slowly to remove content.

TikTok was by far the most responsive platform in identifying and deciding on harmful content, with 94.4% of its moderation decisions relating to content posted on the same day. This is in contrast to 87% of Instagram's and 67% of Facebook's decisions.

According to the research, one-sixth of content actioned by Facebook had been available for at least 100 days. Although Pinterest is responsible for more content moderation decisions than any other platform, the speed at which it identifies and decides on harmful content is slow, it added.

Less than one-fifth of Pinterest’s moderation decisions relate to content posted on the same day, with 23% relating to content that had been available on the platform for at least a year. 

Among the platforms that significantly under-report suicide and self-harm content, Snapchat claims that 97% of its moderation decisions relate to content posted on the same day.

X claims a 100% same-day response rate, despite having actioned just 0.002% of the volume of content reviewed by Pinterest, said the report.

More than two-thirds of the world's population now use the Internet, and social media is once again proving that it is not going anywhere.

Today's online world boasts 5.07 billion unique social media identities, with 37 million new users added in the last quarter, according to a report from We Are Social and Meltwater.

In Southeast Asia, 73.7% of the region's total population is on the Internet. The region also sees a higher share of active social media users than the global average, particularly in the Philippines, Indonesia, Malaysia, Thailand and Vietnam.

Video content is particularly popular in the region, and Southeast Asia has been highlighted as home to the world's most "active" gamers.

TikTok has proved particularly popular throughout the region, with usage in Southeast Asian countries once again exceeding the global average (31 hours and 47 minutes).
