SINGAPORE, Jan 16 — From March 31, app stores in Singapore must block users under 18 from downloading apps not intended for their age group.
This includes dating apps such as Tinder and mature-rated games such as Grand Theft Auto.
The new Code of Practice for Online Safety for App Distribution Services, announced by the Infocomm Media Development Authority (IMDA), also requires app stores to block children under 12 from downloading apps like Instagram and TikTok, which are rated for ages 12 and above.
IMDA explained that app stores play a key role in providing access to digital content, and that with more children using mobile devices, the risk of exposure to harmful content increases, The Straits Times reported.
The rule affects major platforms such as the Apple App Store, Google Play Store, Huawei AppGallery, Microsoft Store, and Samsung Galaxy Store, which must now verify the age of users (a process referred to as "age assurance") to reduce children's access to harmful content.
App stores are also responsible for reviewing apps and their updates before release to ensure they do not contain inappropriate content, such as sexual or violent material, or promote cyberbullying, self-harm, or other harmful behaviours.
IMDA will engage with app stores over the next few months to ensure the age assurance measures are in place.
To verify age, some apps like Instagram and Yubo (a French youth networking app) now use facial recognition technology. Alternatively, platforms can use other forms of ID, such as a digital ID or credit card. Apps without age assurance measures must submit a plan to IMDA detailing how they will implement age verification.
If app stores fail to comply, they risk being blocked in Singapore under the Broadcasting Act.
It is still unclear how app stores will manage apps that are rated for younger users but also carry mature content. For example, although Netflix is rated 12+ on the Google Play Store, it offers many shows aimed at adults (18+), which are usually protected by a passcode.
From March 31, app stores must also make it easier for users to report apps with harmful content and provide annual safety reports outlining the steps taken to ensure user safety.
IMDA requires that reports of harmful content, especially content related to child exploitation or terrorism, be acted upon promptly. App stores must also inform the user who reported the issue, unless the report is deemed frivolous.
Digital Development and Information Minister Josephine Teo said the new code is designed to ensure children can safely access age-appropriate content. She recounted the concerns of a young mother whose children were using mobile devices excessively and were vulnerable to harmful online content.
The code adds to ongoing efforts to improve online safety, including guidelines for parents and regulations for social media services. Mrs Teo stated: “While there is no single solution to online safety, Singapore remains committed to strengthening protections against emerging online risks.”
Singapore’s new code is part of a global trend of holding tech platforms accountable for children’s safety online. Australia, for example, has passed a law banning children under 16 from using social media, and Singapore is considering similar measures.
The republic has gradually updated its Broadcasting Act to require stricter safety measures, including rules for social media platforms to provide clear reporting channels and transparency about their safety practices.
IMDA is currently reviewing annual safety reports from social media companies to encourage transparency and assess the effectiveness of their safety measures.