Social Media Platforms To Add Safeguards For Youths Under Singapore’s Proposed Laws
While social media can be a great way to connect with others and enjoy some harmless fun, it can also be a breeding ground for dangerous activities.
To combat this, Minister for Communications and Information (MCI) Josephine Teo said new Codes of Practice are needed to improve online safety.
On Monday (20 Jun), she shared details of these proposed new laws on Facebook.
The laws mandate that social media platforms ensure additional safeguards for users under 18, reducing their exposure to inappropriate content.
Need to enhance online safety
In her Facebook post, Mrs Teo said online safety is a growing concern around the world. Despite social media platforms putting measures in place to ensure user safety, more can be done.
To illustrate this, she cited a January survey by the Sunlight Alliance for Action (AfA), which found that one in two Singaporeans had personally experienced online harm, most of them teenagers and young adults.
More countries are pushing to enhance online safety, and many have enacted or are in the process of enacting laws around this, said Mrs Teo.
She shared that Singapore’s preferred approach to strengthening its online regulations is “consultative and collaborative”.
This means learning from other countries’ experiences, engaging tech companies on the latest developments and innovations, and understanding the needs of Singaporeans.
In this way, she added, Singapore can develop requirements that are technologically feasible, enforceable, and fit for purpose.
Two Codes of Practice proposed
Mrs Teo then introduced the two proposed Codes of Practice.
The first is the Code of Practice for Online Safety.
Under this law, designated social media platforms are to have system-wide processes to enhance online safety.
While the code applies to all users, it is especially aimed at younger ones.
Through measures such as community standards, young users’ exposure to harmful or inappropriate content — including dangerous self-harming acts — should be minimised.
If users nonetheless encounter online harms, such as the non-consensual distribution of their intimate videos, they should be able to report and flag them to the social media services, which must then take action.
The second code is the Content Code for Social Media Services.
This protects users from egregious harms such as sexual harm, self-harm, or content that can threaten public health.
Under this proposed code, the Infocomm Media Development Authority (IMDA) can direct social media services to take action against harmful online content to protect users.
Harmful content also includes livestreams of mass shootings and viral social media “challenges” such as dangerous stunts, reported The Straits Times (ST).
The codes will also take into account sensitive issues in Singapore, such as race and religion.
Laws will allow authorities to take action against social media platforms
According to ST, the new codes aim to codify these standards in law.
They will also give authorities power to take action against platforms that do not meet requirements.
Mrs Teo said the MCI acknowledges tech companies’ efforts to better support their users and is now engaging them on the codes.
MCI is also working closely with Sunlight AfA to close the online safety gap.
Public consultations on the codes will also be starting in July.
The codes will likely be added to the Broadcasting Act after the consultations.
Hope laws clamp down on dangerous trends & behaviours
While the free-for-all space on social media typically offers harmless fun, it can also spawn dangerous trends and behaviours.
Hopefully, the new laws will help curb the spread of such risky content to youths.
Have news you must share? Get in touch with us via email at firstname.lastname@example.org.
Featured image adapted from iDropNews.