Nikita Mishra
Jun 21, 2022

Singapore govt clamps down on harmful social media content

Online security and scrutiny beef up in Singapore, as social media services are directed to remove harmful digital content.

(Unsplash)

Social media platforms in Singapore will soon be required to act against "harmful online content", including sexual content, material promoting self-harm, and content that threatens public health, under a directive proposed by the Infocomm Media Development Authority (IMDA) to protect consumers.

"Online safety is a growing concern and Singapore is not alone in seeking stronger safeguards for our people," said Josephine Teo, communications and information minister and minister-in-charge of smart nation and cybersecurity, in a Facebook post. "Over the years, social media services have put in place measures to ensure user safety on their platforms. Still, more can be done given the evolving nature of harms on these platforms and the socio-cultural context of our society."

Proposed guidelines

According to an online survey conducted in January this year by the Sunlight Alliance for Action, a public-private-people partnership that tackles online harms, nearly half of the 1,000 Singaporeans polled reported negative personal experiences with online safety.

Teo announced that the government aims to introduce two codes of practice to bolster online safety and security for its people.

Under the first code, social media services with high reach will be expected to put in place robust safety standards to ensure users are not exposed to harmful or inappropriate content. Additional safeguards will be required for young users under the age of 18.

The second proposal would empower IMDA to direct any social media service to remove specified types of "flagrant content".

A public consultation exercise will commence next month.

Social media’s path forward

While the contribution of social media platforms and new technologies cannot be denied, their use has brought many unseen and unintended consequences. "More countries are pushing to enhance online safety, and many have enacted or are in the process of enacting laws around this," Teo noted.

Under the new directive, action can be taken against social media platforms that fail to close the online safety gap. Singaporeans are encouraged to proactively report child sexual exploitation and abuse material, as well as terrorist content, that they encounter online. Platforms must put in place robust accountability processes to handle and act on such complaints.
