Why do people criticise Zoom and what could other companies learn from this?

Illia Shenheliia
Apr 28, 2020 · 5 min read

Life and work in quarantine have forced people to look for a convenient tool for remote communication. Thus, Zoom has become a popular solution, hosting thousands of classes, workshops and meetings every day and helping people to communicate with colleagues, relatives, and friends. Due to this wave of hype, Zoom gained more active users in the first three months of 2020 than in the whole of 2019, and the value of the company's shares increased by a record 40%.

The increased use of Zoom has also led users to pay greater attention to the security and privacy of the service. A number of critical vulnerabilities which compromise the security of the service have been discovered, leading some companies and organizations (such as SpaceX and NASA) to ban the use of Zoom for work purposes.

In this article, Illia Shenheliia, certified data protection specialist (CIPP/E) and Senior Associate at AURUM Law Firm, describes the mistakes made by Zoom and what conclusions other companies can draw from this case.

It should be noted that Zoom’s security and confidentiality problems existed before the COVID-19 pandemic. The only thing that’s changed is the popularity of the service, which has led to more questions being asked due to its newfound widespread relevance. The Zoom example shows that if a company plans to be successful, it should be thinking about privacy and data protection issues from the very beginning of its product development.

Why do people criticise Zoom?

On March 31, 2020, The Intercept published a report stating that, contrary to officially published documents and information, Zoom does not use end-to-end encryption, meaning that it has access to the audio and video content of all user calls made through Zoom. Instead of the declared end-to-end encryption, Zoom uses TLS (Transport Layer Security), which protects data only in transit and does not prevent the service itself from accessing it.
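The distinction can be shown with a deliberately simplified Python sketch. A toy XOR cipher stands in for real cryptography here, and the key and variable names are illustrative only; this is not how TLS or Zoom actually work internally. The point is who holds the key: with transport encryption, the server holds a per-hop key and can read the plaintext it relays; with end-to-end encryption, only the clients hold the key.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR cipher, for illustration only; not a real cryptosystem.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"hello, meeting"

# --- Transport encryption (TLS-style): a key per hop, held by the server ---
hop_key = secrets.token_bytes(len(message))    # client <-> server key
in_transit = xor_cipher(message, hop_key)      # protected on the wire
at_server = xor_cipher(in_transit, hop_key)    # server decrypts to route it
assert at_server == message                    # the server sees the plaintext

# --- End-to-end encryption: a key shared only by the communicating clients ---
e2e_key = secrets.token_bytes(len(message))    # the server never receives this
ciphertext = xor_cipher(message, e2e_key)      # all the server ever relays
# The server can forward `ciphertext` but cannot recover `message`
# without `e2e_key`; only the receiving client can:
received = xor_cipher(ciphertext, e2e_key)
assert received == message
```

In the first case the relay must decrypt to do its job, which is exactly why a TLS-only design gives the service access to call content.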

Matthew Green, a professor at Johns Hopkins University, notes that group calls are difficult to encrypt end-to-end because the service can distinguish a speaker from a non-speaking participant far more easily when it has direct access to the unencrypted streams. However, there are already examples of successful end-to-end encryption solutions for group calls, such as Apple's FaceTime.

On March 26, 2020, Vice published an article stating that Zoom provided Facebook with the data of users who didn’t use the “Login/Register via Facebook” feature, which was not specified in Zoom’s privacy policy.

The author of the article claims that Zoom provided Facebook with information including when the user opened the application, the user’s device specifications, the time zone and city they were connecting from, which phone carrier they were using, and a unique advertising identifier created by the user’s device, which companies can use to target advertisements.

A couple of days after the publication of the aforementioned article, Zoom removed the part of its code which transmitted the data to Facebook. However, Zoom did not make the update mandatory, so users running the old version of the application continued to transmit data.

On March 24, 2020, non-profit consumer protection organisation Consumer Reports revealed that, according to Zoom’s privacy policy, user-generated content (i.e. meeting content, information about members in meetings, files, etc.) may be transferred to third parties such as advertisers.

Such a large amount of data could, in principle, be used to train artificial intelligence (AI) algorithms in speech, emotion, face or object recognition. As of now, however, there is no evidence that Zoom uses any such technology. Without AI, this data is likely to be little more than a vast amount of useless information.

On March 29, 2020, Zoom updated its privacy policy, which now states that the company does not sell user-generated content to anyone and does not use it for advertising purposes.

What was violated by Zoom?

One of the main issues was that Zoom had not complied with the principle of transparency, which is a primary requirement of personal data protection legislation in virtually all jurisdictions, including the California Consumer Privacy Act (CCPA) and the European General Data Protection Regulation (GDPR). In its privacy policy, Zoom was supposed to explain how it processes personal data, to whom it is transmitted and how it is used. Public authorities consider the principle of transparency to be fundamental because, without organisations providing the necessary information, users remain defenceless and cannot exercise other rights, such as the right to access and delete data. In other words, if a user does not know how their personal data is processed, they cannot objectively understand whether the use of their personal data is being abused.

Through the extensive and complex legal language of its privacy policy, Zoom had effectively reserved the right to do almost anything with user data, which is certainly unacceptable. The longer and less clear a privacy policy is, the worse it is for users, and the more likely they are to complain.

Additionally, the fact that Zoom does not use end-to-end encryption, contrary to what it declares in its marketing materials, may be regarded as unfair competition or a deceptive trade practice. The U.S. Federal Trade Commission has already investigated similar cases.

What conclusions can other companies draw from the Zoom example?

The following fundamental points are applicable to all modern technological companies:

  • Develop the product as if millions of people will be using it tomorrow. If you want your product to be successful, you must be prepared for that success. Your technology, as well as your product and legal documentation, must be unimpeachable from the development stage onwards, since you can never be sure when your product will receive increased attention.
  • Be honest. Do not embellish reality in marketing materials or statements and, especially, do not claim that you are using data protection technology which you aren’t.
  • Be transparent. The more you hide, the worse it will be for you. Your users should be able to understand how their data is being used and how it’s protected.
  • Consider privacy and confidentiality. For your product to be successful, users must trust you. If you can’t ensure privacy and data protection, you’ll lose users.
  • Privacy by design and by default. This means that you should consider privacy from the very beginning — at the design stage of the product — and collect as little personal data as possible, setting the most privacy-friendly settings by default.
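As a rough illustration of the last point, privacy-friendly defaults can be encoded directly into a product's settings object, so the safest option is pre-selected and data sharing is opt-in. The field names below are hypothetical examples, not Zoom's actual configuration.

```python
from dataclasses import dataclass

@dataclass
class MeetingPrivacyDefaults:
    # Hypothetical defaults for a meeting product: data sharing is opt-in,
    # recording is off, and the most protective options are pre-selected.
    share_analytics_with_third_parties: bool = False  # opt-in, never opt-out
    record_meetings: bool = False                     # recording off by default
    attendee_attention_tracking: bool = False
    require_meeting_password: bool = True             # safest option pre-selected
    waiting_room_enabled: bool = True

defaults = MeetingPrivacyDefaults()
assert not defaults.share_analytics_with_third_parties
assert defaults.require_meeting_password
```

With this structure, a user who never opens the settings screen still gets the most privacy-friendly configuration, which is the essence of "by default".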

The Zoom example demonstrates that even such a progressive and successful company was not ready for increased attention from users. At a critical point, instead of finally establishing itself as the leader among remote communication applications, Zoom risks losing a large percentage of users who care about the privacy of their data.

Take care of privacy and data security and wash your hands.

LawGeek by Aurum
