In a significant move towards enhancing online safety, Indonesia’s Ministry of Communication and Digital Affairs (Komdigi) has issued a stern warning to Meta over its failure to adequately moderate disinformation and harmful content, including online gambling. The government’s intervention, which follows an unscheduled inspection of Meta’s office in Jakarta, comes as part of broader efforts to enforce stronger content moderation standards on global digital platforms operating in Indonesia.
In a parallel development, the Indonesian government has introduced new regulations aimed at protecting children from harmful online environments. Under Ministerial Regulation Number 9 of 2026, children under 16 years of age will be barred from creating accounts on high-risk platforms like YouTube, TikTok, Facebook, and Instagram, effective March 28, 2026. This move is part of a phased implementation that will first target platforms identified as particularly risky for minors.

Meta Faces Increased Scrutiny Over Disinformation and Harmful Content
The government’s actions against Meta come after the company failed to meet Indonesian standards for content moderation. Authorities cited ongoing issues with the platform’s management of disinformation, hate speech, and online gambling content, all of which remain widespread on Meta’s services.
Minister Meutya Hafid led the inspection and expressed concern over Meta’s failure to remove harmful content at an acceptable rate. Underscoring that the government has no tolerance for such lapses in moderation, Hafid remarked:
“Platforms must be held accountable for the risks posed by content that negatively impacts public safety and stability.”
The Indonesian government has called for stronger content removal processes, including enhanced age-verification systems for Meta’s platforms. In addition to the public reprimand, authorities are now urging platforms to ensure more responsive content management, with penalties looming for non-compliance.
Strengthening Child Protection: New Regulations for Digital Platforms
In a bid to create a safer online environment for children, the Indonesian government has rolled out new child protection regulations that directly affect high-risk digital platforms. As per Ministerial Regulation Number 9 of 2026, children under 16 will be prohibited from accessing major social media and gaming platforms, including Roblox, TikTok, and Facebook, starting on March 28, 2026.
This new regulation, which falls under the PP TUNAS framework, aims to limit children’s exposure to the risks associated with these platforms, including pornography, cyberbullying, online fraud, and digital addiction.
The phased implementation of the regulation means that platforms like TikTok, Instagram, and Facebook will face the first wave of changes, barring children from creating new accounts and from accessing services without age verification. This is expected to create a significant shift in how platforms manage younger audiences.
A Double-Pronged Approach: Content Regulation and Child Safety Online
Both the Meta intervention and new child protection regulations signal Indonesia’s increasingly assertive stance on digital regulation. The government is focusing on two key areas of online safety: content moderation and child protection. By pressuring platforms like Meta to enhance their moderation policies, Indonesia is sending a strong message that social media companies must take responsibility for the content they allow on their platforms.
At the same time, the new regulations around child access to high-risk platforms indicate a growing awareness of the specific dangers posed to children in digital spaces. These regulations aim to create a safer digital ecosystem where platforms cannot simply ignore the responsibility of protecting minors from harmful content.
The dual focus on disinformation and child protection highlights Indonesia’s broader strategy of comprehensive digital oversight. This strategy holds companies accountable for their content while safeguarding vulnerable populations. In addition to these initiatives, the Indonesian government has taken decisive steps to combat illegal online gambling by ordering banks to block accounts linked to unauthorised gambling sites. Platforms operating in Indonesia will now be under greater scrutiny than ever before, as the government moves to ensure compliance with new digital content guidelines, child protection rules, and stricter regulations on online gambling.
What’s Next for Meta and Other Platforms?
Meta has yet to publicly respond to the government’s demands, but the company faces increasing pressure to comply with local laws or risk penalties. As Indonesia’s digital governance framework continues to evolve, it will likely set a precedent for other countries in the region, encouraging a wave of tighter regulations and greater corporate responsibility for digital platforms.
With these regulatory changes, Indonesia is making a clear push to protect its citizens from the dangers posed by digital platforms. As March 2026 approaches, platforms will need to make significant adjustments to their content moderation strategies and compliance practices to align with Indonesian law.