SINGAPORE: Social media platforms need to do more to protect children in Singapore from harmful and age-inappropriate content, according to the Infocomm Media Development Authority (IMDA) in its inaugural online safety assessment report on Monday (Feb 17).
Six social media services - Facebook, HardwareZone, Instagram, TikTok, YouTube and X (formerly Twitter) - were evaluated on how thoroughly and effectively their measures met the requirements of the Code of Practice for Online Safety.
Under the code, the platforms are required to implement system-wide measures to minimise users’ access to harmful content, give users effective and easy-to-use ways to report harmful content, and be transparent and accountable to users by submitting annual online safety reports for publication on IMDA’s website.

IMDA also assessed how quickly platforms removed harmful content that violated community guidelines after it was reported.
The report aims to help users, including parents, make informed decisions for themselves and their children about the risks and the safety measures available on the various social media sites.
Code of Practice for Online Safety - Social Media Services
The Code of Practice for Online Safety – Social Media Services sets out the requirements that social media services which are or will be designated under section 45K(1) of the Broadcasting Act 1994 must meet to enhance online user safety, particularly for children, and to curb the spread of harmful content on their services.
The categories of harmful content include:
- Sexual content
- Violent content
- Suicide and self-harm content
- Cyberbullying content
- Content endangering public health
- Content facilitating vice and organised crime
The obligations for these social media services are categorised into three sections:
Section A - User safety
- Social media services must put in place measures to minimise end-users’ exposure to harmful content, empower end-users to manage their safety on the platform and mitigate the impact on end-users that may arise from the propagation of harmful content.
- The service must also have specific measures to protect children from harmful content.
Section B - User reporting and resolution
- Social media users must be able to report to the platform any concerning content or unwanted interactions that fall within the categories of harmful or inappropriate content.
- The reporting mechanisms and resolutions provided to end-users must be effective, transparent, easy to access and easy to use.
- A user's report must be assessed, and appropriate action(s) must be taken by the social media service in a timely and diligent manner.
Section C - Accountability
- Social media users must have access to clear and easily understandable information that enables them to assess the level of safety and related safety measures.
- Social media platforms must submit to IMDA annual online safety reports on the measures the service has put in place to combat harmful and inappropriate content, for publication on IMDA’s website.
X SCORED LOWEST IN USER SAFETY MEASURES FOR CHILDREN
Findings revealed that all six platforms had safety measures in place – such as community guidelines, content moderation, tools for users to manage their own safety and Singapore-based safety resources. However, some fared worse on user safety measures for children, with X scoring the lowest at two out of five points.
IMDA said in the report that although X claimed it had detected and removed six pieces of child sexual exploitation and abuse material from Singapore, tests uncovered “considerably more cases” during the same period.
“X needs to improve the effectiveness of its efforts in proactively detecting and removing child sexual exploitation and abuse material,” the media authority said.
The report also found that the platform did not effectively enforce its measures to restrict children from viewing adult sexual content. Tests showed that minors could easily find and access explicit material, including hardcore pornography, with simple search words.
In addition, X took action on 54 per cent of harmful content flagged by users, acting on the remaining 46 per cent only after IMDA notified it directly.
It also took 10 days on average to act on reported violations.
“Specifically, X took an average of seven days to take action on sexual content, nine days for suicide and self-harm content, and between 10 to 20 days for other categories of harmful content,” IMDA said.
Responding to the report's findings, X said that it has "zero tolerance" towards material that features or promotes child sexual exploitation.
X also said it recognised that users between 13 and 17 years old are more vulnerable, and claimed that it has measures in place to ensure minors have a safe and secure experience on the platform.
HOW OTHER SOCIAL MEDIA SITES SCORED
TikTok and Instagram scored full marks for their child safety measures, while YouTube and Facebook scored four out of five, and HardwareZone 2.5 out of five.
“HardwareZone did not have additional measures for children such as accounts with more restrictive default settings and dedicated tools for parents or guardians to manage children’s safety,” the report said.
Although the site’s terms of service prohibit users under 18 years old from accessing the service, its age-gating measures were easily bypassed.
While Facebook and YouTube performed relatively well, the report found instances where children could access age-inappropriate content that should have been restricted under their guidelines.
None of the six social media services scored full marks in user reporting and resolution.
Similar to X, Instagram scored two out of five. Facebook and TikTok scored 2.5, YouTube had three points, while HardwareZone was given four points.
Apart from HardwareZone, the other five sites acted on 50 per cent or less of the content that violated their own guidelines. This meant that a large number of genuine user reports went unaddressed, with most platforms taking five days or longer to respond. Instagram, for example, took an average of seven days.
Despite these shortcomings, all six platforms scored at least 3.5 out of five for accountability in keeping their service safe for users, with HardwareZone, TikTok and X scoring full marks.
In response to IMDA's findings, Meta - which owns Facebook and Instagram - said its technology prioritises high-severity content that can cause imminent offline harm, and material that goes viral quickly.
Hence, test accounts used by IMDA "might not have considered platforms' prioritisation of user reports," Meta explained.
HardwareZone, YouTube and TikTok added that they remain committed to improving online safety efforts for Singaporeans.
IMDA emphasised that social media platforms must take greater responsibility for protecting children from harmful content. The platforms will also have to update the authority on the steps taken to improve their safety measures in their next annual online safety reports.
“At the same time, the government will continue to enhance public education efforts to equip Singaporeans with the knowledge and skills to go online safely, securely and safeguard themselves against online harms and threats,” IMDA said.