Data Privacy for Young Australians: Key Considerations
Introduction
Social media is a crucial part of daily life, enabling connection, expression, and information sharing, especially for adolescents and young adults. At the same time, increased social media use among young individuals has led to significant safety and well-being concerns, particularly regarding mental health and data privacy.
With growing awareness of data sharing practices, privacy concerns have become more prominent over the past decade, as young social media users’ personal information is often over-collected and misused. Recent studies on the effects of social media indicate that its use can negatively impact life satisfaction and mental health, particularly during the vulnerable developmental stages as children mature into young adults.
To address these concerns, Australia recently passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (“Bill”). The Bill aims to shield young users from mental health issues, exposure to harmful content, and data privacy risks. It establishes a minimum age of 16 for social media use in Australia and places responsibility on social media platforms to prevent underage users from creating accounts.
This Bill is the world’s first law banning social media for children under 16. Many are watching to see how it will be implemented to regulate previously largely unregulated technology, and whether it will become a new global standard.
This blog explores the Bill’s key provisions, its rationale, its social impact on young Australians, and its implications for the data privacy landscape.
Overview
Applicability
- The Bill sets a minimum age of 16 for social media account holders. It requires social media platforms, such as TikTok, Instagram, and Snapchat, to take reasonable steps to prevent users under 16 from creating accounts.
- Proponents of the Bill argue that the current minimum age of 13 stems from outdated legislation (i.e., the United States’ Children’s Online Privacy Protection Act of 1998) and does not reflect the present-day safety needs of young users.
Adaptation and Review
- The Bill includes requirements for adapting to technological changes and evolving social media landscapes.
- An independent review of the Bill’s impact will be conducted within two years of its implementation.
Enforcement
- The Australian Minister for Communications has not yet specified the Bill’s effective date; however, it will not go into effect earlier than 12 months after receiving Royal Assent.
- Platforms that misuse personal information collected for age verification purposes can face major penalties. Currently, repeated breaches of privacy provisions could result in penalties of up to $50 million under section 13G of Australia’s Privacy Act 1988. The Bill also imposes significant penalties of up to $49.5 million on corporate organizations for non-compliance.
- Compliance monitoring and the imposition of penalties will be the responsibility of the eSafety Commissioner, Australia’s independent regulator for online safety.
Considerations and Challenges
Effectiveness
- Although some children may find ways around the age restrictions, the Bill mandates that platforms take “reasonable steps” to prevent such circumvention. Ongoing advancements in age assurance technology should improve effectiveness over time. Even if not foolproof, these measures set a clear societal standard and reduce the number of young users exposed to harmful content.
Mental Health Impact
- Some have argued that banning children under 16 from social media could lead to feelings of isolation, emphasizing social media as a vital tool for connection. This Bill allows for exceptions for certain platforms, like messaging apps and educational tools, to ensure young individuals can still connect with friends and access beneficial online activities. Ultimately, the primary goal is to protect young users from negative impacts that could affect their mental health.
Freedom of Expression
- Opponents of the Bill have also stressed that the restrictions could infringe on the rights of young users to freely express themselves. The Bill does not ban young individuals from accessing all online content, but only from creating accounts on age-restricted platforms. The intent is to protect young users from harmful content and interactions that could have long-lasting negative effects.
Consultation
- Others have voiced concerns that the Bill does not consider the interests of young people. However, prior to drafting the Bill, the Australian Department of Infrastructure, Transport, Regional Development, Communications and the Arts conducted extensive discussions with young social media users, parents, mental health professionals, and industry representatives. These stakeholders expressed broad support for setting a minimum age of 16.
Challenges of Implementation
- Privacy is a valid concern when it comes to age verification. The Bill requires social media platforms to implement age assurance methods to verify users’ ages. It also includes strict measures to ensure that the personal information collected for age verification is not misused. Platforms are required to destroy this information once it is no longer needed, and any misuse can result in significant penalties. Understandably, implementing such robust age verification systems could be costly and technically challenging.
- These systems will require significant investment and effort from social media platforms. To address this challenge, the Bill allows flexibility in defining which services are considered age-restricted platforms, excluding services such as messaging apps and educational tools.
- To facilitate a smooth implementation, the Australian government will also conduct an age assurance trial to guide platforms on effective methods, with a 12-month lead time.
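The data-minimization requirement described above (collect personal information, verify age, then destroy the information once it is no longer needed) can be illustrated with a minimal sketch. Every name here (AgeVerificationRecord, AgeAssuranceService, and so on) is hypothetical and not drawn from the Bill or any platform’s actual systems; the point is only the pattern: hold verification data just long enough to make the decision, then delete it.

```python
from dataclasses import dataclass


@dataclass
class AgeVerificationRecord:
    """Personal information collected solely to verify a user's age (hypothetical)."""
    user_id: str
    claimed_birth_year: int


class AgeAssuranceService:
    """Sketch of the collect -> verify -> destroy pattern the Bill requires."""

    MINIMUM_AGE = 16
    CURRENT_YEAR = 2025  # in practice, derived from a trusted clock

    def __init__(self) -> None:
        # Records are held only while a verification decision is pending.
        self._pending: dict[str, AgeVerificationRecord] = {}

    def submit(self, record: AgeVerificationRecord) -> None:
        """Accept personal information for the sole purpose of age verification."""
        self._pending[record.user_id] = record

    def verify_and_destroy(self, user_id: str) -> bool:
        """Return whether the user meets the minimum age, then delete the record."""
        record = self._pending.pop(user_id)  # destroy: not retained after use
        return (self.CURRENT_YEAR - record.claimed_birth_year) >= self.MINIMUM_AGE
```

The design choice the Bill pushes platforms toward is the one in `verify_and_destroy`: the decision (a boolean) is kept, but the underlying personal information is not, which limits the damage any later breach or misuse could cause.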
Alternative Approach
- Some individuals feel that the Bill should consider alternative approaches: for example, rather than imposing age restrictions, platforms could be required to enhance their safety features. Proponents, however, consider age restrictions a necessary part of a broader strategy to protect young users from the negative impacts they may face on social media.
- To address this concern, the Bill encourages a comprehensive approach to online safety, including both age restrictions and enhanced safety measures.
Conclusion
By setting a minimum age for social media use and holding platforms accountable, the Bill aims to create a safer digital environment for children and adolescents, with a strong emphasis on protecting their data privacy. Balancing safety with privacy, effectiveness, and the rights of young individuals is a challenging task that requires careful consideration.
At the same time, this Bill presents many complexities and potential unintended consequences of implementing a minimum age for social media use. Only time will tell if this Bill will have its intended impact and whether it will be adopted in other jurisdictions.
For more information on this bill and/or other privacy laws and requirements, please contact Director Wills Catling at wills.catling@myna.com.