Regulation of the Minister of Communication and Digital Affairs No. 9 of 2026 on the prohibition of social media use for children under 16 is a strategic government measure to protect children from the risks of the digital space, such as exposure to harmful content, cyberbullying, social media addiction, and online exploitation.
Although this regulation cannot completely eliminate risks as there may still be technical loopholes such as age falsification, its existence remains crucial as an initial protective framework. At the very least, this regulation affirms that the state acknowledges the real threats posed by the digital ecosystem to children’s development.
To support the effective implementation of this policy, several strategic steps can be considered, including:
1. Collaboration between the government, digital platforms, and the public
The government cannot act alone. Social media platforms must be encouraged to strengthen child protection systems, for example through more accurate age verification systems, restrictions on certain features, and stricter content moderation.
Currently, most social media platforms still rely on a self-declaration system when users sign up for an account. This system is relatively easy for children to manipulate when they want to access services that are in fact age-restricted. Therefore, platforms need to develop more credible age verification methods, such as AI-based age estimation, digital identity verification, or parental consent mechanisms. With stronger verification, the likelihood of minors accessing platforms in violation of the rules can be minimized.
Additionally, platforms can implement more child-safe designs. For example, by limiting features that could increase the risk of harmful interactions, such as private messages from strangers, unsupervised live streaming features, or recommendation algorithms that are too aggressive in displaying content. Furthermore, platforms can automatically set children’s accounts to private, limit usage time, and disable features that could potentially trigger digital addiction.
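Purely as an illustration (not drawn from the regulation or from any actual platform), the child-safe defaults described above could be sketched in code along these lines; all names and thresholds here are hypothetical assumptions:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: field names and the 60-minute cap are illustrative
# assumptions, not requirements of the regulation or of any real platform.
@dataclass
class AccountSettings:
    private_profile: bool
    allow_dms_from_strangers: bool
    live_streaming_enabled: bool
    daily_limit_minutes: Optional[int]  # None means no usage cap

def default_settings(age: int) -> AccountSettings:
    """Apply stricter, safety-first defaults to accounts of minors."""
    if age < 16:
        return AccountSettings(
            private_profile=True,            # account private by default
            allow_dms_from_strangers=False,  # no messages from unknown users
            live_streaming_enabled=False,    # no unsupervised live streams
            daily_limit_minutes=60,          # illustrative usage limit
        )
    return AccountSettings(
        private_profile=False,
        allow_dms_from_strangers=True,
        live_streaming_enabled=True,
        daily_limit_minutes=None,
    )
```

The point of the sketch is the design choice it encodes: safety is the default for children's accounts, and riskier features must be deliberately enabled rather than deliberately disabled.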
Platforms have a responsibility to ensure that the content circulating does not harm users, particularly children. Therefore, content moderation must be carried out more proactively, whether through AI technology or human moderation teams. This system is essential for detecting and removing harmful content such as violence, child exploitation, hate speech, or forms of cyberbullying. Additionally, the reporting system must be made more accessible so that users can immediately report harmful content.
Digital platforms must also demonstrate their commitment through transparency in child protection policies. This can be achieved by publishing their safety standards, data on the volume of harmful content handled, and the steps taken to protect child users. With such transparency, the government and the public can monitor the extent to which a platform fulfills its responsibilities.
Through these various efforts, the role of social media platforms extends beyond that of mere technology service providers; they become vital partners in creating a safer digital ecosystem for children. Government regulations will ultimately be more effective if accompanied by a genuine commitment from technology companies to prioritize user safety, particularly for the most vulnerable groups.
2. Digital literacy for children and parents
Regulations must be accompanied by digital literacy education. Children need to understand the risks of the digital world, while parents must have the ability to guide the healthy use of technology. Digital technology is evolving rapidly, while the ability of the public, especially parents, to understand its risks and impacts often lags behind. As a result, many children use the internet without an adequate understanding of data security, online etiquette, or the potential dangers of the digital space.
Therefore, digital literacy must be understood not merely as the ability to use devices or applications, but also as the ability to comprehend the social, psychological, and security consequences of digital activities. Children need to be equipped with an understanding of various risks, such as cyberbullying, the spread of misinformation, the exploitation of personal data, and social media addiction, which can affect their mental health and learning patterns.
On the other hand, parents also need to have adequate digital literacy skills. Many parents actually want to monitor their children’s internet use but lack sufficient knowledge about the digital platforms their children use. This situation renders such monitoring ineffective. Therefore, digital literacy programs must target two groups simultaneously: children as the primary users and parents as the primary guardians. With good literacy, families do not merely become users of technology but are also able to manage technology wisely and responsibly.
3. Monitoring and enforcement systems
There must be clear monitoring mechanisms, including mandatory child protection standards that digital platforms operating in Indonesia must adhere to. Good regulation does not stop at drafting rules but also requires clear oversight and enforcement mechanisms. Without strong oversight, regulations risk becoming mere symbolic policies with no real impact on the ground. This is particularly important in the context of social media, which is largely managed by global technology companies with cross-border reach.
The government needs to establish child protection standards that all digital platforms operating in Indonesia must comply with. These standards could include more credible age verification requirements, child data protection, stricter content moderation systems, and reporting mechanisms that are easily accessible to users. Additionally, platforms must be transparent about how they handle harmful content and protect child users.
Equally important, regulations must also be accompanied by accountability mechanisms. This means that if a platform fails to comply with the established child protection standards, there must be clear consequences, whether in the form of administrative sanctions or service restrictions. Thus, regulations serve not only as normative guidelines but also as instruments with the power to drive behavioral change among digital platforms.
4. Safe digital spaces for children
The government and the technology sector can promote the development of more child-friendly digital platforms so that children’s needs for learning and digital exploration are met without jeopardizing their development. Restricting children’s access to social media should not be interpreted as an attempt to completely isolate them from the digital world. Instead, a more constructive approach is to provide digital spaces that are safe, educational, and appropriate for children’s developmental stages. Children still need space to learn, express themselves, create, and interact with peers within a positive digital ecosystem.
Therefore, the government, in collaboration with the technology sector and educational institutions, can promote the development of digital platforms specifically designed for children and adolescents. Such platforms can offer various features that support learning, collaboration, and creativity, but with stricter moderation systems and stronger privacy protections. With a design focused on the best interests of the child, these digital spaces can serve as a safer alternative to general social media platforms dominated by adult users and commercial interests.
In addition, the development of child-friendly digital spaces can also be part of a long-term strategy to build a healthier digital ecosystem. Children are not only protected from risks but also given the opportunity to use technology in a positive and productive way. Thus, the goal of regulation is not merely to restrict but also to guide the use of technology in a direction that is more beneficial for children’s development.
The Role of Parents in Supervision
Although government regulations are already in place, the role of parents remains the most important factor in protecting children in the digital space. Regulations are not intended to replace family responsibility but to reinforce it, and policies become more effective as parental involvement grows. Some of the roles parents can play include:
1. Active guidance in internet use
Parental guidance should not be viewed merely as strict supervision or control, but as a process of mentoring that helps children understand how to use technology in a healthy and responsible manner. The digital world offers many benefits, such as access to information, learning tools, and creative outlets. However, it also poses various risks, such as exposure to violent content, pornography, and misinformation.
In this context, parental involvement is crucial to ensure children do not navigate the digital space without guidance. Parents can identify which platforms their children use, the types of content they consume, and how they interact with other users. Through this guidance, parents can explain to their children the boundaries that need to be maintained, including how to recognize inappropriate or harmful content. This approach is far more effective than simply imposing bans, as children are not merely faced with rules but are also equipped with the understanding and ability to think critically in the digital space.
2. Fostering open communication between parents and children
One of the biggest challenges in monitoring internet use is when children choose to hide their digital activities from their parents. This often happens when the approach taken places too much emphasis on restrictions or punishments. Therefore, fostering open communication is key to helping children feel safe sharing their digital experiences.
Children need to know that their parents are not just supervisors but also a trusted source of help when they face problems online, for example when they experience cyberbullying, receive messages from suspicious strangers, or accidentally stumble upon disturbing content. If a strong communication bond is established, children will be more likely to seek help from their parents rather than trying to resolve the issue on their own. Thus, open communication serves as an early protective mechanism against various risks in the digital space.
3. Establishing rules for digital device use at home
In addition to guidance and communication, parents also need to establish clear rules regarding the use of digital devices. These rules are important for fostering healthy technology habits and preventing dependence on devices and social media.
Examples include limiting the daily duration of device use, designating gadget-free times such as family meals or before bedtime, and keeping internet use in shared rather than entirely private spaces. Such rules are not intended to stifle children’s creativity, but to maintain a balance between digital activities and other activities such as studying, face-to-face social interaction, and physical activity.
4. Setting a good example in technology use
Children learn a great deal from the behavior they observe in daily life, including how their parents use technology. Parents who themselves use devices excessively can hardly expect their children to behave differently. Setting a good example is therefore a crucial aspect of digital education within the family.
Parents can demonstrate wise use of technology, for example, by not constantly holding their phones while interacting with family, using the internet for productive activities, and maintaining proper etiquette when communicating in digital spaces. By seeing these real-life examples, children will understand that healthy technology use is not just a rule imposed from above, but a value practiced in daily life.
Addressing Loopholes in Age Verification
One of the main challenges in implementing these regulations is the self-declaration system used when creating social media accounts, which allows children to falsify their age. To address this loophole, several steps are needed:
First, strengthening age verification systems by digital platforms.
One of the most fundamental weaknesses in child protection on social media today is the fact that many platforms still rely on self-declared age. In practice, such mechanisms are highly fragile because they merely ask users to enter their date of birth without any genuine verification. As a result, age restrictions often amount to little more than an administrative formality, rather than a protective measure that actually works. Minors can easily manipulate data to create accounts, while platforms appear to have “complied” with the rules simply because they have listed a minimum age requirement.
Therefore, digital platforms must be encouraged to develop stronger and more responsible age verification systems. These could take the form of technology-based age estimation, proportionate identity verification, or parental consent schemes for children’s accounts. The essence of these efforts is not merely to make registration more difficult, but to ensure that platforms no longer turn a blind eye to children’s vulnerabilities. As long as age verification remains lax, child protection will remain nothing more than a slogan, while access to harmful content, risky interactions, and addictive usage patterns remains wide open. In other words, weak age verification renders regulations toothless. Conversely, more credible age verification would serve as the first step toward building a digital environment that is truly safer for children.
Second, regulations that promote platform accountability.
Until now, too much of the burden of child protection has been placed on families and individuals, as if the issue of children’s digital safety were solely the parents’ responsibility. In reality, social media platforms are the ones designing the systems, features, algorithms, and business models that determine how users interact within them. Therefore, it is unfair for platforms to simply reap the benefits of high user engagement without bearing an equal share of responsibility for protecting vulnerable groups.
This is where government regulation becomes crucial: not only to regulate users, but also to hold platforms accountable for the design and impact of their services. Governments need to establish strict child protection standards, such as robust age verification requirements, setting children’s accounts to private by default, restricting high-risk features, mandating prompt responses to complaints, and imposing sanctions on negligent platforms.
Without regulatory pressure, platforms tend to move slowly because, from a business perspective, they benefit from user growth and high usage duration. This means that commercial interests often do not automatically align with child protection interests. Therefore, regulation is needed to change the underlying logic: that child safety is not a voluntary choice for platforms, but a mandatory obligation.
Third, educating children about digital ethics and risks.
Child protection will never be effective if it relies solely on technical safeguards. Children who do not understand the reasons behind restrictions will tend to view rules as obstacles to be circumvented, rather than protections to be respected. This is where digital education is crucial: children must be helped to understand that the digital world is not a neutral space, but one filled with both opportunities and risks.
This education is important so that children understand that age restrictions are not merely arbitrary bans, but a response to real risks such as addiction, cyberbullying, exploitation of personal data, fraud, and exposure to content that is not appropriate for their stage of psychological development. Children also need to be equipped with digital ethics: how to behave politely, respect others’ privacy, not easily trust information, and know when to ask for help.
Without this awareness, children may be successfully restricted from one platform, only to move to another digital space with the same or even greater risks. Therefore, education serves not merely as a complement to regulations, but as the foundation for building children’s resilience in navigating the digital ecosystem. The ultimate goal is not merely to make children compliant with rules, but to help them grow into technology users who are aware, critical, and responsible.
Fourth, strengthening parental supervision.
No matter how well designed the regulations are, the government cannot be present at every moment of a child’s digital life. This is where parents play an irreplaceable role. Parental supervision matters not because children should always be viewed with suspicion, but because they lack the maturity to assess all digital risks independently. They are easily drawn to new things, easily influenced by their peer group, and in many cases do not yet understand the long-term consequences of their online activities.
Therefore, parental supervision should not be narrowly interpreted as spying on children. Effective supervision actually means knowing which apps children use, who they interact with, what kind of content they consume, and how their digital habits are formed. Parents need to be closely attuned to their children’s digital lives to detect warning signs early on, such as behavioral changes, withdrawal, unstable emotions, or a tendency to hide online activities.
In the context of age verification loopholes, the role of parents becomes increasingly important. Children may be able to create accounts using fake information, but parental supervision can minimize the opportunities for such violations to continue. In other words, regulations provide an external safeguard, while parents provide protection from within. When both work together, child protection becomes much stronger. However, if family supervision is weak, children will still easily access risky digital spaces, even if regulations are in place.
Practical Solutions to Ensure Children Are Not Hindered in the Digital Learning Process
Regulations restricting social media for children under 16 are indeed intended to protect children from various risks in the digital space. However, on the other hand, social media also serves positive functions as a tool for learning, creativity, communication, and strengthening digital literacy. Therefore, it is important to ensure that these regulations do not hinder children’s access to the benefits of digital learning. Several practical solutions can be implemented to maintain this balance.
1. Encouraging the use of education-focused digital platforms
Not all digital activities need to take place on general social media. Children can still access various learning resources through educational platforms, learning websites, digital libraries, and apps specifically designed for learning activities. Such platforms typically offer a more curated and secure environment compared to open social media. Thus, children’s needs to access information, engage in discussions, and develop digital skills can still be met without exposing them to greater risks.
2. Providing child-friendly digital spaces
Governments, educational institutions, and technology companies can encourage the development of digital spaces specifically designed for children and adolescents. Such platforms can offer features for sharing work, learning discussions, or creative collaboration with stricter moderation systems. A specially designed environment will help children continue to express themselves, create, and interact positively without having to enter a social media ecosystem that may not be safe for them.
3. Strengthening the role of schools in digital learning
Schools can serve as safe spaces for introducing digital technology in a productive manner. Teachers can utilize various online learning platforms to foster student discussion, collaboration, and creativity. With guidance from educators, children can learn to use technology in a more focused and responsible manner, ensuring that the benefits of digital learning remain accessible even when access to general social media is restricted.
4. Parental guidance in children’s digital activities
Parents can help children use the internet as a tool for learning and self development. For example, by directing children toward educational content, learning channels, or positive digital communities. This guidance is essential to ensure children continue to gain beneficial digital experiences while understanding the boundaries that must be maintained.
5. Strengthening digital literacy from an early age
In addition to providing access to learning platforms, children also need to be equipped with the ability to use technology critically and responsibly. Digital literacy will help them utilize the internet as a source of knowledge and creativity, not merely as a means of entertainment. With strong digital literacy, children will be better prepared to navigate broader digital spaces once they are old enough.
Dr Yulina Eva Riany
Chairperson of the Center for Gender and Child Studies, IPB University

