Protecting young people and supporting parents across Instagram, Facebook and Messenger — Safer Internet Day 2024

Alex Cowen
Meta Australia Policy Blog
Feb 5, 2024

by Mia Garlick, Regional Director of Policy for Australia, Japan, Korea, New Zealand & Pacific Islands at Meta

There is so much happening these days, it can be challenging to stay up to date with all of the new tools and resources that are available to help you have a safe and positive experience online. We wanted to take the occasion of Safer Internet Day 2024 to share a quick update so you know what to check back in on, if you ever need it. So go grab the morning beverage of your choice and read on…

We’ve spent over a decade developing policies and technology to address content and behaviour that breaks our rules, and we’ve developed more than 30 tools and features to help support young people and their parents. Over the past year, we have taken several additional steps to provide safer, age-appropriate experiences on our apps. Here are some of the most recent ones you may find useful.

More tools to protect young people and enable parents to support them

We want the time people spend on our apps to be intentional and positive. We also want to be respectful of young people’s autonomy but still give parents the ability to support the young people in their lives, when they need it.

We’re nudging young people to manage their time on our apps. “Nighttime Nudges” have been introduced to encourage young people to leave the app during the night if they’ve spent more than 10 minutes on Instagram DMs or Reels.

We have introduced stricter messaging settings for young people to protect them from unwanted contact. They will not be able to receive direct messages from anyone they don’t follow or aren’t connected to on Instagram and Messenger.

We’ve also started to hide more types of age-inappropriate content for young people on Instagram and Facebook, even if it’s shared by someone they follow, in line with expert guidance.

Parental supervision tools include better insight into who their teen follows and is followed by, and how many friends they have in common. Parents also have more ways to customise the Parental Supervision notifications they receive on Instagram. In addition, parents will be prompted to approve or deny their teen's (under 16) requests to change their default safety and privacy settings to a less strict state.

These tools also allow parents to see how their teen uses Messenger, from how much time they’re spending on messaging to providing information about their teen’s message settings. However, these tools do not allow parents to read their teen’s messages.

Educating and empowering young people about online safety

Whilst tools can be a useful way to help people customise their experience online, sometimes situations are more complex than any one tool can address. We know from our work with the Office of the eSafety Commissioner and our Australian safety partners that young people are often approached and asked to share intimate images of themselves. Sometimes this is part of what's known as financially motivated sextortion, undertaken by scammers, and sometimes it might be from a friend.

When this happens, it can be challenging for a young person to find the confidence to talk about it or know the best way to respond in that situation. This is why we have worked on two recent campaigns with local organisations to provide relevant resources for young Australians to help them manage these situations.

First up, we developed a Sextortion Awareness Campaign delivered in partnership with Kids Helpline, ACCCE & Thorn designed to inform young people about the dangers of online sextortion, reduce stigma and provide tools to report it and seek support.

This campaign had significant reach across Meta platforms and was successful in dispelling the misinformation, stigma and victim-blaming that unfortunately often surround sextortion. Drawing on the insights of our safety partners and law enforcement, we were able to collaboratively create sextortion awareness content that was relevant and relatable. In response to this campaign, young people indicated that meeting them where they're at, and talking openly about the better-known behaviours surrounding sextortion, such as sexting, is an essential part of raising broader sextortion awareness.

The Intimate Images Unwrapped initiative, delivered in partnership with Project Rockit, fostered important conversations about online intimacy, intimate image sharing, respect, and consent. This youth-led initiative, consisting of a suite of educational videos, helped to build peer literacy of the dynamics that underpin intimate image-sharing.

This campaign had strong cut-through with young people due to its Instagram-first, youth-driven approach. Feedback from the campaign indicated that power and consent are concepts young people struggle to grapple with, and that there is a need for more initiatives like this one to bring young people's struggles to the forefront and continue the hard conversations around these topics.

What’s next?

We are constantly looking to better understand the issues and concerns of young people on our services, to help us identify what additional initiatives we can undertake to support them in having a positive experience online. To this end, in December last year, we held a Youth Online Safety Workshop at Meta's Sydney HQ attended by Minister for Youth Affairs Dr Anne Aly, as well as safety partners and youth ambassadors representing ReachOut, Butterfly Foundation, Orygen, Kids Helpline and Project Rockit. The workshop brought together our partners and their youth ambassadors to share their perspectives on the most pressing online safety challenges and opportunities. The overarching discussion points young people and safety partners communicated during the workshop included:

  • A desire for more algorithmic transparency and more age-appropriate social media literacy about the algorithm and how it has developed
  • The current challenges and opportunities that exist for AI in the context of youth safety and wellbeing
  • Anxieties surrounding digital footprints and the desire for more private forms of interaction on platforms
  • The need to further centre diversity, representation & lived experience in online safety conversations and product development
  • The desire for more authentic connections offline and how the online world can better foster this

At Meta, we provide a significant number of transparency tools to explain how algorithms and AI work but it was helpful to understand just how important this was for young people. So we will be taking these insights to inform our 2024 safety initiatives. Stay tuned!

Meta Youth Online Safety Workshop, December 2023.
