S’pore app stores to block underage users from downloading apps such as TikTok & Instagram by 31 Mar


Singapore app stores to block users under 18 from downloading certain apps

On 15 Jan, the Infocomm Media Development Authority (IMDA) announced measures to prevent underage users in Singapore from accessing certain apps in app stores.

From 31 Mar, users under 18 will be blocked from downloading apps such as Tinder and games such as Grand Theft Auto, both of which are rated for users aged 18 and above.

The new code, known as the Code of Practice for Online Safety for App Distribution Services, also requires app stores to block users under 12 from downloading apps such as TikTok and Instagram.

App Distribution Services (ADSs) including the Apple App Store, Google Play Store, Huawei AppGallery, Microsoft Store, and Samsung Galaxy Store are required to put in place “system-level measures to curtail the risk of exposure to harmful content”, particularly for children.


Source: cottonbro studio on Pexels

Singapore app stores to block users by age

IMDA’s factsheet pointed to “significantly evolved” artificial intelligence and facial screening technology, which will allow age assurance measures to be implemented effectively.

The two age assurance measures are age estimation and age verification.

Age estimation relies on systems or processes that establish a user’s likely age or age range through artificial intelligence, machine learning, or facial age analysis algorithms.

Age verification relies on verified sources of identification, such as a digital ID or credit card, to determine a user’s age or age range.

ADSs can decide which age assurance measures to use, whether age estimation, age verification, or both, IMDA said.

ADSs that fail to apply age assurance measures run the risk of being blocked in Singapore.

In the coming months, IMDA plans to engage with designated ADSs on the implementation of age assurance measures. The designated ADSs are also required to submit an “implementation plan” to IMDA.

Harmful content to look out for

According to the Code of Practice for Online Safety (July 2023), categories of harmful content include:

  • Sexual content
  • Violent content
  • Suicide and self-harm content
  • Cyberbullying content
  • Content endangering public health
  • Content facilitating vice and organised crime

Managing young users’ safety

In addition to IMDA’s age assurance measures, parents and guardians are encouraged to minimise children’s exposure to harmful online content through measures such as the following:

  • Limit public visibility of accounts, including profile and content
  • Limit who can contact and/or interact with the account
  • Limit location sharing

IMDA states that it will continue its efforts in “working with relevant government agencies, the industry and community” to combat harmful online content and to “protect Singapore users against online harms”.

Also read: US passes bill to ban TikTok, CEO Chew Shou Zi promises to overcome issues for users


Have news you must share? Get in touch with us via email at news@mustsharenews.com.

Featured image adapted from Ron Lach on Pexels. 
