Meta’s appearance before Australia’s Joint Select Committee on Social Media and Australian Society

MetaANZ
Meta Australia Policy Blog
Jun 28, 2024

Today, Meta appeared before the Joint Select Committee on Social Media and Australian Society, alongside industry peers. The hearing focused on the impacts of social media on Australians, including what the industry is doing to keep young people and all Australians safe online, and Meta’s decision not to renew commercial news deals with Australian publishers.

Ms Mia Garlick, Regional Director of Policy for Australia, Japan, Korea, New Zealand & Pacific Islands, outlined to the Committee the benefits of Meta’s services to the millions of Australians who use them every day, including small businesses, creators, and family and friends. In the opening statement below, shared with the Committee, Ms Garlick provided an overview of our longstanding commitment to creating tools and features that keep Australians safe, removing harmful content and keeping misinformation from spreading on our platforms.

It’s in Meta’s interests to help Australians have positive experiences on our apps. If people don’t believe our apps are safe, they won’t use them. We are committed to safety and have spent more than a decade working on safety issues and creating dedicated safety features and tools. Since establishing our Australian office in 2008, we’ve built relationships with local law enforcement, educators, safety organisations, including the eSafety Commissioner and community groups. We have employees in Australia working on these efforts and engaging regularly with regulators and policy makers.

Globally, we have around 40,000 people working on safety and security, and we have invested over (USD) $20 billion since 2016. This includes around (USD) $5 billion in the last year alone. We’ve also been focused on making sure young people have safe and age-appropriate experiences on our apps, and we provide special protections for teen accounts. This includes defaulting all accounts for people under 16 to private, requiring age verification should teens try to change their age to 18 or above, and defaulting under-16 accounts to the most restrictive content and recommendation settings.

Meta also reaffirmed to the Committee our decision not to renew commercial news deals with publishers. This followed our decision to deprecate our dedicated Facebook News product, to which the majority of our commercial deals were tied. The number of people using Facebook News in Australia and the U.S. dropped by over 80% last year. As a company, our time and resources need to focus on things people tell us they want to see more of on the platform, including short-form video, like Reels. News is still available in people’s Feed on Facebook, and publishers are still able to post links to news articles at no charge if they wish to drive traffic to their news pages.

We hope that the data and evidence we provided about the benefits of our services to Australians, and our commitment to creating a safe experience, particularly for young people, will serve as useful reference points for the Committee.

Mia Garlick, Opening Statement to Australia’s Joint Select Committee on Social Media and Australian Society

Thank you Chair, Deputy Chair and Committee members for the opportunity to appear before you to discuss the important issue of the role of online services — such as those provided by Meta — in Australia.

Before kicking off today, I wanted to acknowledge the Gadigal people of the Eora nation, from whose lands I am dialling in to this hearing today, and acknowledge their elders past, present and emerging.

I’m grateful that my colleague Antigone Davis, Meta’s Vice President and Global Head of Safety, is also able to join the hearing today, because I appreciate that there has been considerable discussion of late, both by Committee members and more broadly across the country, about the safety and well-being of people when they are online, in particular young people.

At Meta, we take a multi-layered approach to understanding someone’s age: we want to keep people who are too young off Facebook and Instagram, and make sure that those who are old enough receive the appropriate experience for their age. Some of the ways we do this are by defaulting new teen accounts to private, placing a range of default limits on teens’ accounts, and limiting advertisers’ ability to reach young people. Antigone will be well placed to answer your questions on these tools and related topics about online safety.

But more broadly, millions of Australians regularly use one or more of Meta’s family of apps to connect with the people and the communities they care about. Globally, over 1.8 billion people engage in Facebook Groups every month, and in Australia, we’ve seen the power of Facebook Groups in mobilising people around shared passions, such as the Matildas and A-League Women Supporters Group, which last year organised meetups for its 32,000 members across Australia and New Zealand.

Since establishing our Australian office in 2008, we’ve built relationships with local law enforcement, educators, safety organisations, and community groups.

In this time, our services have undergone significant transformations. Back then, Facebook was primarily a desktop experience with static ads, and Instagram allowed only photo sharing; now people are engaging with more immersive experiences like Stories and Reels. Today, Reels makes up over 50% of time spent on Instagram, and approximately 3.5 billion Reels are reshared daily.

Amidst the evolution of our products, some haven’t succeeded, like Facebook News. The genesis of Facebook News was to connect publishers, via a dedicated news section in the app, with people who are interested in seeing news regularly and who are potentially monetizable customers for those publishers. I do want to be clear that this represents a minority of the people who use our services: for the vast majority of people using Facebook, less than 3% of their Feed is news links.

Whilst there were some promising signs when Facebook News was first rolled out in the US and Europe in 2019, over time a massive shift has occurred in consumer preferences. People are now primarily engaging with short-form video, and primarily with non-news content. Overall, we have seen an 80% drop in the use of Facebook News, and the poor performance of this product led to our announcement, in February of this year, that we would deprecate it here in Australia. In 2021, we entered into deals with Australian publishers to support the Facebook News product, but now, with the product being withdrawn from the market, those deals too will not be renewed.

Whilst there has been public debate about which external surveys to rely on to assess whether Australians get news on social media, the reality of our business is that there has been a marked shift in recent years to short-form video, and repeated and sustained feedback from consumers that they want to see less, not more, news on our services. To remain competitive with the numerous apps growing in the digital space, we must prioritise the experiences most relevant and valuable to our users.

Also, as a business, we have every commercial imperative to combat harmful content on our services. Advertisers don’t want their ads appearing alongside harmful content, and consumers won’t continue to use our services unless they feel safe.

We have around 40,000 people across the globe working on safety and security, and we have invested over US$20 billion since 2016. This includes around US$5 billion in the last year alone. This investment includes building and maintaining our content governance and integrity systems.

Over the years, we have increasingly shifted our focus towards using automation and artificial intelligence to proactively protect people from harmful content. Today, for many of our high-risk categories of content, like nudity, terrorism content or child exploitation, well over 90% of the content that we remove is taken down proactively.

We also understand the importance of getting local feedback and insights, which is why we partner with a range of child safety and mental health partners in Australia.

We’ve also worked to comply with local regulations. We’ve been working constructively with the eSafety Commissioner since the office was established in 2015; we’ve been one of the leading industry contributors to the development of industry codes covering safety and, now, scams; and we were one of the first companies to endorse the eSafety Commissioner’s Safety by Design principles.

Finally, we acknowledge the need for greater transparency and accountability of digital platforms. This is why we have long advocated for government regulation of digital platforms, were a founding signatory to the DIGI Code of Practice on Disinformation and Misinformation, and have constructively engaged with policy makers and regulators on how to address many of the issues being considered by the Committee. With a significant number of laws enacted or proposed in Australia focused on digital platforms in recent years, the question is no longer whether regulation is necessary, but whether it is effective at driving investment in safety and security across the industry.

We hope that these remarks are helpful in framing our discussion today and look forward to your questions.
