JULY 28 — From Aug 1, all social media and online messaging platforms with at least 8 million users in Malaysia will be required to apply for a licence from the government.

The Malaysian Communications and Multimedia Commission (MCMC) stated that this was to ensure a “safer online ecosystem”. The regulatory framework would require these platforms to comply with Malaysian laws and combat cybercrime, including scams and online fraud, cyberbullying and sexual crimes against children.

Many would certainly welcome a safer online ecosystem, but will licensing social media companies actually produce this outcome in Malaysia?

Data privacy

A key part of a safe online ecosystem is keeping our personal data private and secure. Many online frauds and scams begin with perpetrators gaining access to personal data such as names, phone numbers, emails and identity card numbers.

We provide this personally identifiable data in many instances as we go about our daily lives. In Malaysia, however, while private sector companies are obliged under the Personal Data Protection Act 2010 (PDPA) to keep such data secure and not share it without our consent, the government is exempt from any such obligation. Section 3(1) of the PDPA states that the PDPA “shall not apply to the Federal Government and State Government”.

This is a significant omission, especially since the government holds a vast amount of our personal data. To compound matters, a CyberSecurity Malaysia 2023 report revealed that the government sector experienced the “greatest number of data breaches” and that government ministries and agencies “are exposed to significant cyber risks, including vulnerable software, weak access controls, data exposure and other critical issues”.

Malaysians may recall a massive data breach in 2017 of 46 million names and mobile phone numbers that was linked to an MCMC initiative to deter mobile phone theft.

Any serious attempt to create a safer online ecosystem must therefore ensure the PDPA applies equally to everyone who handles data, especially the government and all its agencies. The government, in fact, recently proposed amendments to the PDPA that were passed on 16 July in the Dewan Rakyat, but notably missing were any amendments to bring itself within the PDPA’s scope.

Backdrop of repressive laws

Malaysia also has a long history of governments using repressive laws, such as the Sedition Act and the Printing Presses and Publications Act, to silence their critics. For the online realm, there is section 233 of the Communications and Multimedia Act, which makes it an offence to post obscene, indecent, false, menacing or offensive content with intent to annoy, abuse, threaten or harass.

These broad, repressive laws remain on the books and have been used under the current government, despite many of its members having campaigned for their amendment or repeal.

While platforms must certainly be held more accountable for harmful and dangerous content disseminated widely through their services, they have, over the last two decades, also played a part in creating space for Malaysians to express themselves in what was once a much more repressive environment.

Against this backdrop, any new law that further strengthens the government’s power over online content must guard against arbitrary misuse. Otherwise, something that claims to be designed for our safety would likely end up being used to stifle discontent against the government, while not doing much to protect us at all.

Not just about taking stuff down

Finally, an arguably more troubling feature of platforms is not the questionable content found on them but how that content is pushed out to users.

Platforms vacuum up vast amounts of detailed personal information about users such as their website visits, purchases, locations, contacts and phone numbers. Unseen algorithms then tweak each user’s experience and serve up targeted advertising and content to keep them engaged for longer.

Social media platforms are therefore not mere canvases where people come to post and view information. They have become gatekeepers to information – determining for their users what content they see, and don’t see, on their platforms.

Various studies have demonstrated that this leads to users consuming content in “echo chambers”, seeing primarily content they tend to agree with. But it goes further than that. Algorithmic curation can have more insidious effects, related to hate and bias.

A Guardian Australia experiment revealed that a social media platform served up a regular diet of misogynistic and sexist content to a generic 24-year-old male account with no prior activity.

A Human Rights Watch report accused Meta of “systematically and globally suppressing voices in support of Palestinians on Instagram and Facebook”.

Other studies have demonstrated that tech platforms assist in the proliferation of hate and extremism, sometimes recommending even more extreme content when a user searches for such subjects.

If we really want to create a safer online ecosystem for our children and ourselves, it will therefore involve much more than just taking stuff down off the platforms. It will involve being able to demand a measure of transparency about how platforms’ algorithms work and are trained, and explanations when they go awry, such as when they block Palestinian content or push misogynistic memes to young male users.

Getting platforms licensed will not be a silver bullet against these problematic aspects; tackling them will require a much broader stakeholder approach. It will probably also require us to work in concert with other Asean countries such as Indonesia.

Safeguarding rights online

Safeguarding our rights online is a crucial aspect of online safety. This includes our rights to privacy, to choose how our data is collected and used, to freedom of expression, and to access information. All key stakeholders, not just the platforms, must be held accountable for safeguarding these rights.

If the government is serious about its laudable aim of creating a safer online ecosystem, it must take its own role in the ecosystem seriously. It must make itself accountable for our personal data, respect freedom of expression, and be transparent and accountable in its requests to platforms for removal of content.

It should also recognise the problematic aspects of tech platforms’ business models, working to address how these platforms gather data and finding ways to demand more transparency about platforms’ algorithms and content moderation.

Only then can we talk seriously about building a safer online ecosystem.

*Ding Jo-Ann is Principal, Asia at Luminate, a global non-profit foundation.

** This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.