When FOMO Trumps Privacy: The Clubhouse Edition

lourdes.turrecha
Privacy & Technology
5 min read · Feb 19, 2021

Clubhouse, the new audio-based social media app where users join rooms to talk about anything they like, is gaining lots of attention. Part of its popularity comes from it being pegged as the next social media giant. Beyond the hype, its epic product privacy failures warrant scrutiny.

While social media platforms are notorious for their privacy failures, Clubhouse committed its own set at a time when consumer sentiment is increasingly in favor of privacy. What’s more, big tech brands like Apple have been publicly championing privacy, recognizing that privacy has value beyond compliance. Research also shows a clear marketplace demand for privacy. Clubhouse could’ve been the privacy-focused audio-based social media app. It could still be, if it takes a hard look at its privacy infringements and course corrects. Will it?

A Foundation Built on FOMO

Clubhouse was founded by Paul Davison and Rohan Seth last year. In a bidding war that culminated in a $10 million Series A in May 2020, Andreessen Horowitz won over the Clubhouse founders, reportedly by getting celebrity Kevin Hart onto the app when it had only around five thousand users.

Throughout the pandemic months, Clubhouse rose in popularity. In rolling out the app, Clubhouse leveraged FOMO: access is invitation-only, and existing users get a limited number of invites. Clubhouse is reported to have millions of users today, with some estimates citing over 10 million. In January 2021, Clubhouse raised a $100 million Series B at a $1 billion valuation, despite not having any revenue.

… And Privacy Encroachments

Clubhouse rolled out its app to a global user base without much regard for privacy. Some of the privacy concerns are as follows:

Clubhouse collects not just its users’ personal information, but also their contacts’ — even those who are not Clubhouse users. During sign-up, Clubhouse requests access to a user’s phone contacts. It collects people’s personal information before they ever engage with the app. This design is problematic because a user cannot consent on behalf of others to Clubhouse’s collection and use of their personal information. Moreover, it raises the question of whether Clubhouse is creating shadow profiles of non-users.

Clubhouse accesses users’ Twitter account information without explaining why. Clubhouse asks users to connect their Twitter accounts to find connections. Clubhouse states that it will be able to see users’ Tweets (including protected Tweets), profile information and account settings, and the accounts that users follow, block, and mute on Twitter. It does not disclose why it needs access to all of this information.

Clubhouse claims that it anonymizes the user information uploaded to its servers, but this is doubtful. In requesting access to users’ contacts list, Clubhouse claims that “only anonymized information is uploaded to [their] servers.” Data protection laws like the EU’s GDPR set a high bar for “anonymized data”: data rendered anonymous in such a way that the individual data subject is no longer identifiable. If it’s possible to identify or re-identify Clubhouse users or their contacts from the uploaded data, then the information is not anonymous.
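To make that standard concrete, here’s a minimal sketch of why a common “anonymization” shortcut fails it. Clubhouse hasn’t disclosed how it processes contact data, so the hashing scheme below is purely a hypothetical illustration, not Clubhouse’s actual method: if contacts’ phone numbers were merely hashed before upload, anyone holding the hashes could enumerate the small phone number space and recover the originals.

```python
import hashlib
from typing import Optional

def pseudonymize(phone: str) -> str:
    """Hash a phone number (a step often mislabeled 'anonymization')."""
    return hashlib.sha256(phone.encode()).hexdigest()

def reidentify(target_hash: str, area_code: str = "415") -> Optional[str]:
    """Brute-force the hash: only ~10 million line numbers exist per area code."""
    for n in range(10_000_000):
        candidate = f"+1{area_code}{n:07d}"
        if pseudonymize(candidate) == target_hash:
            return candidate
    return None

uploaded = pseudonymize("+14155550123")  # what a server might store
print(reidentify(uploaded))              # -> +14155550123, recovered by enumeration
```

Because re-identification is trivial in a scheme like this, hashed contact data would be pseudonymized at best under the GDPR, not anonymized, which is why Clubhouse’s claim deserves scrutiny.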

Clubhouse engages in dark patterns in getting users to give access to their contacts and their Twitter information. In asking users for access to their contacts, Clubhouse offers two options: “Don’t Allow” and “OK.” The latter is in bold, accompanied by a pointing finger emoji directing the user to give access. Yet when a user chooses “Don’t Allow,” Clubhouse still follows up with a prompt stating, “We need access to your contacts in order for you to choose people to invite” — this time with only one option, labeled “Allow.” These are dark patterns: design elements intended to steer users into doing things they might not want to do, for the benefit of the business rather than the users. Dark patterns are problematic because they manipulate users; here, they render Clubhouse users’ privacy choice illusory.

Clubhouse has some serious security gaps. According to Alexander Hanff, Clubhouse’s audio recordings are not end-to-end encrypted, which poses serious privacy, security, and personal safety risks. Additionally, Alex Stamos and the Stanford Internet Observatory team found that Clubhouse routed traffic through Chinese servers even for conversations that involved only Americans; Stamos concluded that he can’t recommend Clubhouse for sensitive conversations.

As a result of Clubhouse’s privacy failings, different data protection regulators are now on alert, and rightly so. In Germany, Clubhouse is facing court action with potential criminal penalties for its privacy infringements. In the US, consumer privacy advocates have alerted their contacts at the Federal Trade Commission and the California Attorney General’s Office.

At best, Clubhouse failed to account for privacy and security in its app rollout. With its funding, it should have put the necessary resources in place to address privacy and security concerns as it continued rolling out the app to millions of users globally. Why didn’t it?

Hazy Clubhouse Dues

Clubhouse has not yet been forthcoming about its business model, which could have significant privacy implications, as other social media companies have demonstrated. Clubhouse has announced plans to use some of its funding as grants to creators on its platform, but hasn’t divulged how it plans to make money for itself.

Today, users are under the impression that they’re downloading a free app. They don’t know what business model they’re agreeing to or whether they’ll be paying in other ways. Clubhouse needs to be transparent about its business model so users can assess the related privacy implications and make an informed decision about whether to engage with Clubhouse.

Startups can no longer “collect all the data, all the time, for all purposes” without regard for privacy. Clubhouse’s founders and investors need to keep up with consumers, regulators, and privacy advocates, who agree that privacy is one of the critical issues of our time. Failing to do so means failing on several counts: failing to build in response to consumer privacy demand, failing to recognize the value of privacy, missing the opportunity to innovate in privacy, and landing on the wrong side of history.

Despite what Clubhouse’s user sign-ups might falsely signal, deep down people don’t want to keep hanging out in a clubhouse whose walls have ears.

This post is the first of a series exploring Clubhouse’s privacy failures. The second part examines how Clubhouse’s app rollout signals to consumers that it values growth over them and their privacy. The third part provides a privacy blueprint for Clubhouse to leverage privacy in scaling, instead of growing at privacy’s expense.


Founder & CEO @PIX_LLC @PrivacyTechRise | Privacy & Cybersecurity Strategist & Board Advisor| Reformed Silicon Valley Lawyer | @LourdesTurrecha