Nikita Mishra
Jun 21, 2022

Singapore govt clamps down on harmful social media content

Online safety scrutiny is being beefed up in Singapore, as social media services are directed to remove harmful digital content.

(Photo: Unsplash)

Social media platforms in Singapore will soon have to act against “harmful online content”, including sexual content, material promoting self-harm, and content that threatens public health, under a proposed directive by the Infocomm Media Development Authority (IMDA) to protect consumers.

"Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people," said Josephine Teo, communications and information minister and minister-in-charge of smart nation and cybersecurity, in a Facebook post. "Over the years, social media services have put in place measures to ensure user safety on their platforms. Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society."

Proposed guidelines

According to an online survey conducted in January this year by the Sunlight Alliance for Action, a public-private-people partnership that tackles online harms, nearly half of the 1,000 Singaporeans polled had personally experienced harm online.

Teo announced that the government aims to introduce two codes of practice to bolster online safety and security for its people.

Under the first code, high-reach social media services will be expected to put in place stringent safety standards to ensure users are not exposed to harmful or inappropriate content. Additional safeguards will be required for young users under the age of 18.

The second proposal would empower IMDA to order any social media service to remove specified types of “flagrant content”.

A public consultation exercise will commence next month.

Social media’s path forward

While the contributions of social media platforms and new technologies cannot be denied, their use has resulted in many unforeseen and unintended consequences. “More countries are pushing to enhance online safety, and many have enacted or are in the process of enacting laws around this,” said Teo.

Under the new directive, action can be taken against social media platforms that fail to comply with the requirements to close the online safety gap. Singaporeans are proactively encouraged to report child sexual exploitation material, abusive material, and terrorist content they encounter online. Platforms must put in place rigorous accountability processes to handle and act on such complaints.
