Meta’s response to the South Australian Children (Social Media Safety) Bill 2024

Meta Policy ANZ
Meta Australia Policy Blog

Executive Summary

Meta welcomes the opportunity to provide feedback on the South Australian Government’s Children (Social Media Safety) Bill 2024 (Bill), and to build on our engagement with the Hon Robert French AC during his consultations to prepare the Report of the Independent Legal Examination into Banning Children’s Access to Social Media (French Report).

Meta shares the South Australian Government’s objective that young people should have an age-appropriate experience online. This is why we invest significantly in policies, technology and partnerships to promote a safer online experience for everyone who uses our services, especially young people.

We have steadily increased our investment over the years: we now have around 40,000 people working on safety and security, and have invested over US$20 billion since 2016, including around US$5 billion in the last year alone. We have also continued to expand our use of proactive detection technology to identify and action problematic content before anyone reports it to us. Today, for categories of content like terrorism, child exploitation, and suicide and self-injury, our proactive rate (the percentage of actioned content that we find before a user reports it to us) is well over 95%.1

The privacy, safety and wellbeing of young people on our platforms are essential to our business. Our policies prohibit problematic content, including content or behaviour that exploits young people, and we work closely with experts in mental health, child development, digital literacy and more to build features and tools so that teens can connect online safely and responsibly. We continue to invest in this space. Ensuring that young people receive age-appropriate experiences is a responsibility we take seriously, and our approach is guided by three overarching principles:

  • Responsible empowerment — we strongly believe in the importance of responsibly empowering young people to enjoy the many benefits our platform provides.
  • Age-appropriate safeguards — we recognise that younger users require additional safeguards for their safety, privacy, and wellbeing.
  • Innovation — innovative technologies help us provide young people on our platform with age-appropriate experiences and are essential to solving our industry’s defining challenges in protecting young people online.

Reflecting these principles, over the past decade we have developed a number of safety tools2 to support teens and families. For example, for users aged 13 to 17, we take several steps to ensure they have an age-appropriate experience on Facebook and Instagram, such as defaulting new teen accounts to private; setting location sharing to off by default; preventing teens from receiving messages from accounts they are not connected with; applying warning labels to sensitive content that may be allowed on our platforms for its public interest, newsworthiness or free expression value; and making it more difficult for adults to find and follow teens.3

We also offer tools that empower parents and teens to work together to set time limits and schedule breaks, allow parents to see who their teen follows and who follows their teen, and enable teens to notify their parents when they block or report someone.4

Building on these innovations, last month we announced the introduction of Instagram Teen Accounts in Australia, the US, UK and Canada (with other countries to follow in January 2025),5 to automatically place teens in built-in protections and reassure parents that teens are having safe experiences.

Now, teens under the age of 18 will be automatically placed into Teen Accounts,6 and teens under 16 will need a parent’s permission to make these built-in protections less strict. Early teens (aged 13-15) will be automatically placed into protective defaults, including a private account, messaging restrictions, the strictest setting of our Sensitive Content Control,7 a daily limit reminder after 60 minutes of use, and no notifications from 10pm to 7am. If an early teen wishes to change to a less protective setting, they will need to set up a supervision relationship with a parent or guardian and seek parental permission to make that change.

Setting up parental supervision gives parents insight into their teen’s online activity to help them support their teen, for example, the ability to see who their teen follows, who follows their teen, and with whom their teen has recently been messaging. It also enables the parent to set a daily time limit on the app that logs the teen out, and to turn the app off altogether at night via sleep mode. Late teens (aged 16-17) will be placed into the same defaults, but will not require parental permission to change them.
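
To make this gating rule concrete, here is a minimal sketch, in hypothetical TypeScript rather than Meta’s actual implementation, of how a request to weaken a protective default might be authorised based on a teen’s age band and supervision status.

```typescript
// Hypothetical model of the Teen Account gating rule described above;
// an illustration only, not Meta's actual implementation.

interface TeenProfile {
  age: number;                     // 13-17
  hasParentalSupervision: boolean; // supervision set up with a parent/guardian
}

// Early teens (13-15) need a supervision relationship plus parental
// permission to weaken a protective default; late teens (16-17) may
// change the defaults themselves.
function canWeakenDefault(profile: TeenProfile, parentApproved: boolean): boolean {
  if (profile.age >= 16) {
    return true;
  }
  return profile.hasParentalSupervision && parentApproved;
}

// A 14-year-old without supervision cannot loosen settings...
console.log(canWeakenDefault({ age: 14, hasParentalSupervision: false }, false)); // false
// ...but a 16-year-old can, without parental permission.
console.log(canWeakenDefault({ age: 16, hasParentalSupervision: false }, false)); // true
```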

Teen Accounts are designed to address, among other things, parents’ concerns about their teens’ experience on Instagram, in particular around age-appropriate content, unwanted contact and time spent. To inform the design and development of Teen Accounts, in addition to working with experts, we conducted interviews, focus groups and survey research with teen and parent participants.8

Our ongoing, decade-long investment in building online safety into our products and services demonstrates that we share the South Australian Government’s intent and goals in providing a safe and age-appropriate online experience for teens, and it is in this spirit that we want to raise some concerns about the Bill as currently drafted. In short, it will be challenging for both industry and the proposed regulator to operationalise, will cause unintended confusion and burden for South Australian parents and young people, and will not achieve the intended net improvement to their online safety. To address these concerns, we have shared suggestions on areas where the Bill can be amended to best achieve its goals and stand as a world-leading example of age assurance legislation.

Understanding a user’s real age is key to all efforts by policymakers and app providers to promote a more age-appropriate experience online. The Bill adopts an “app by app” approach to age verification, which leaves every app provider to work out how to comply with respect to its own services and places the burden on parents and young people to prove their age and parental relationship with each of the dozens of apps that a young person uses. Teens move fluidly from one app or service to the next, and any regulation in this space needs to reflect how parents and teens actually use apps. It also needs to accommodate the ever-evolving nature of these technologies, so that companies can implement requirements for the long term. Placing the obligation to verify age on each individual app increases the time South Australian families must spend on age verification processes for each app, and potentially increases the privacy risk to them, because they will need to share some form of personally identifiable information with each app to confirm the age of the young person and the parental relationship on an app-by-app basis.

A simpler and more effective approach for the Bill would be a ‘whole-of-ecosystem’ approach: require age verification at the app store/OS level, and require app stores to obtain a parent’s approval before their child downloads an app, allowing parents to oversee and approve their teen’s online activity in one place. Teens and parents already provide companies like Apple and Google with this information, and these companies have already built systems for parental notification, review and approval into their app stores. Legislation should provide an overarching framework for this existing practice: require app stores to verify age and then pass this information to apps and developers, who can use it as part of their individual age assurance tools. Meta’s investment in User Age Group APIs in the Meta Quest Store, which are designed to help developers understand how old their users are, is an example of how this can be achieved in a privacy-preserving way. When someone launches an app on the Meta Quest platform, these APIs allow Meta to share whether the app is being used by a preteen, teen or adult account. The app can then use this information to tailor a more age-appropriate experience and to properly protect young people’s data.9
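
To make this pattern concrete, here is a minimal sketch of how an app might consume such a platform-provided age signal at launch. The names below (AgeCategory, getLoggedInUserAgeCategory and so on) are illustrative placeholders, not the actual Meta Quest API surface; the sketch assumes only that the platform can return a coarse age bracket for the signed-in account.

```typescript
// Illustrative sketch only: the identifiers below are hypothetical
// placeholders, not the real Meta Quest User Age Group API.

type AgeCategory = "preteen" | "teen" | "adult" | "unknown";

// Stand-in for the platform SDK call. The platform, not the app, performs
// the underlying age verification, so the app never handles birthdates or
// identity documents.
async function getLoggedInUserAgeCategory(): Promise<AgeCategory> {
  return "teen"; // mocked response for this sketch
}

interface ExperienceSettings {
  allowUnconnectedMessages: boolean;
  sensitiveContentFilter: "strict" | "standard";
  muteNotificationsOvernight: boolean;
}

// Map the coarse age signal to age-appropriate defaults.
function settingsFor(category: AgeCategory): ExperienceSettings {
  switch (category) {
    case "adult":
      return {
        allowUnconnectedMessages: true,
        sensitiveContentFilter: "standard",
        muteNotificationsOvernight: false,
      };
    default:
      // Teens, preteens and unverified accounts get the most protective defaults.
      return {
        allowUnconnectedMessages: false,
        sensitiveContentFilter: "strict",
        muteNotificationsOvernight: true,
      };
  }
}

async function onAppLaunch(): Promise<void> {
  const category = await getLoggedInUserAgeCategory();
  const settings = settingsFor(category);
  console.log(`Applying ${category} defaults:`, settings);
}

onAppLaunch();
```

The privacy property highlighted above is visible in the sketch: the app receives only a coarse age bracket, never a birthdate or identity document, so it can tailor its experience while the sensitive verification data stays with the platform and is collected once rather than app by app.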

An app store/OS-level solution would not exempt Meta and other app providers from implementing their own age assurance tools. Rather, it would be an important complement to those efforts: one that recognises the technical limitations of age assurance technology and the practical realities of how young people and parents use apps, and that preserves privacy by minimising data collection.

We want to be clear: we are not recommending this approach to absolve Meta of its responsibility to ensure safe and age-appropriate experiences for teens across our services. That narrative has gained momentum in some circles, but it is misguided. We make this recommendation based on our long experience of building online safety into our products and services.

As noted above, we know that teens move interchangeably between many websites and apps. The average teenager uses dozens of applications on their phone, in some cases 40 apps or more. Many of these apps have different standards or safety features, which change constantly or gain new features, making it challenging for parents and guardians to keep up. Only by creating industry-wide protections will teens actually be safer.

This is why app store/OS-level age verification is the most efficient, consistent and sustainable solution: a foundation on which Meta and other app providers can continue to build, investing in new tools to ensure children have age-appropriate online experiences.

Finally, we know that online safety is most effective when a multi-layered, multi-stakeholder approach is adopted, bringing together expertise and insights from government, industry, academia, and the child safety and mental health sectors. We urge the SA Government, as well as the other state governments and the federal government contemplating regulatory reforms on child online safety, to consider the insights shared by Australia’s leading mental health organisations calling for evidence-based measures to help improve the safety of social media platforms for young people,10 and to ensure a genuine multi-stakeholder approach that brings industry into the conversation to identify the optimal long-term solutions.

To assist the South Australian Government with its consultation on the Bill, we have also made suggestions designed to highlight where the drafting appears to lead to unintended negative consequences. To illustrate with a local example: under the language currently proposed, an app provider could potentially be liable if a 13-year-old viewed the well-known video of a tour with Calypso Star Charters11 that was originally posted on the operator’s Facebook Page and was shared by a family member in a family group chat. It is difficult to see how restricting young people from enjoying the small-business entrepreneurship and beauty of South Australia that social media showcases falls within the intended scope of the Bill.

Footnotes
1 Meta, ‘Community Standards Enforcement Report Q2 2024’, Transparency Center, https://transparency.meta.com/reports/community-standards-enforcement
2 Meta, ‘Our tools, features and resources to help support teens and parents’,
https://www.meta.com/help/policies/safety/tools-support-teens-parents; Meta, ‘Teen Privacy and Safety Settings’, https://www.meta.com/help/policies/safety/Meta-Teen-Privacy-Safety-Settings
3 Meta, ‘Our tools, features and resources to help support teens and parents’,
https://www.meta.com/help/policies/safety/tools-support-teens-parents; Meta, ‘Teen Privacy and Safety Settings’, https://www.meta.com/help/policies/safety/Meta-Teen-Privacy-Safety-Settings
4 Instagram, ‘Introducing Stricter Message Settings for Teens on Instagram and Facebook’, Instagram Blog, 25 January 2024,
https://about.instagram.com/blog/announcements/new-parental-controls-and-teen-privacy-defaults
5 Meta, ‘Introducing Instagram Teen Accounts: Built-In Protections for Teens, Peace of Mind for Parents’, Newsroom, 17 September 2024,
https://about.fb.com/news/2024/09/instagram-teen-accounts
6 Meta, ‘Introducing Instagram Teen Accounts: Built-In Protections for Teens, Peace of Mind for Parents’
https://about.fb.com/news/2024/09/instagram-teen-accounts
7 Meta, ‘Limit sensitive content that you see on Instagram’
https://help.instagram.com/251027992727268?helpref=faq_content
8 Meta, ‘How research and consultation informed Instagram Teen Accounts: a new protected experience for teens, guided by parents’, 17 September 2024,
https://www.ttclabs.net/site/assets/files/12047/report_-_how_research_and_consultation_informed_instagram_teen_accounts.pdf
9 Meta, ‘Introducing Age Group Self-Certification & Get Age Category API for All Developers’, 23 July 2024,
https://developers.meta.com/horizon/blog/age-group-self-certification-apis-meta-quest-developers
10 Press statement, ‘Leading mental health organisations say proposed ban won’t make social media safe’, 10 September 2024,
https://nest.greenant.net/s/KfMk8GgcjK4TJjr
11 Facebook, ‘Shark Cage Diving — Calypso Star Charters’,
https://www.facebook.com/calypsostarcharters
