Why do people criticise Zoom and what could other companies learn from this?
Life and work in quarantine have forced people to look for a convenient tool for remote communication. Zoom has become a popular solution, hosting thousands of classes, workshops and meetings every day and helping people communicate with colleagues, relatives, and friends. On this wave of popularity, Zoom gained more active users in the first three months of 2020 than in the whole of 2019, and the value of the company's shares increased by a record 40%.
The increased use of Zoom has also led users to pay greater attention to the security and privacy of the service. A number of critical vulnerabilities which compromise the security of the service have been discovered, leading some companies and organizations (such as SpaceX and NASA) to ban the use of Zoom for work purposes.
In this article, Illia Shenheliia, certified data protection specialist (CIPP/E) and Senior Associate at AURUM Law Firm, describes the mistakes made by Zoom and what conclusions other companies can draw from this case.
It should be noted that Zoom’s security and confidentiality problems existed before the COVID-19 pandemic. The only thing that’s changed is the popularity of the service, which has led to more questions being asked due to its newfound widespread relevance. The Zoom example shows that if a company plans to be successful, it should be thinking about privacy and data protection issues from the very beginning of its product development.
Why do people criticise Zoom?
Absence of end-to-end encryption
On March 31, 2020, The Intercept published material which states that, contrary to officially published documents and information, Zoom does not use end-to-end encryption, meaning that it has access to the audio and video content of all user calls made through Zoom. Instead of the declared end-to-end encryption, Zoom uses TLS (Transport Layer Security Protocol) technology, which only protects data transmission and does not restrict the service itself from accessing it.
Matthew Green, a professor at Johns Hopkins University, notes that group calls are difficult to encrypt end to end: it is much easier for the service to detect who is currently speaking if it has direct access to the unencrypted audio. However, there are already examples of successful end-to-end encryption for group calls, such as Apple's FaceTime.
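The distinction above can be made concrete with a toy sketch (deliberately NOT real cryptography — a simple XOR cipher used purely for illustration): under transport-only encryption, each participant shares a key with the server, so the server decrypts and re-encrypts every message and sees the plaintext; under end-to-end encryption, only the participants hold the key and the server merely relays ciphertext.

```python
# Toy illustration (NOT real cryptography) of transport encryption
# (TLS-style) versus end-to-end encryption.
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR against a hash-derived keystream -- insecure, illustration only.
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

msg = b"confidential meeting audio"

# TLS-style: Alice and Bob each share a key with the SERVER.
alice_server_key = secrets.token_bytes(32)
bob_server_key = secrets.token_bytes(32)
wire = toy_encrypt(alice_server_key, msg)
seen_by_server = toy_decrypt(alice_server_key, wire)  # server reads the plaintext
relayed = toy_encrypt(bob_server_key, seen_by_server)
assert toy_decrypt(bob_server_key, relayed) == msg
assert seen_by_server == msg  # the transit is protected, the service is not excluded

# End-to-end: only Alice and Bob hold the key; the server forwards
# ciphertext it cannot decrypt.
e2e_key = secrets.token_bytes(32)
wire = toy_encrypt(e2e_key, msg)
assert toy_decrypt(e2e_key, wire) == msg  # server never saw `msg`
```

In a real system the symmetric cipher would be something like AES-GCM and the end-to-end key would be negotiated between participants (e.g. via a Diffie-Hellman exchange), but the trust boundary is the same: with TLS alone, the service operator sits inside it.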
Illegal transfer of data to Facebook
Journalists reported that Zoom provided Facebook with information including when the user opened the application, the user's device specifications, the time zone and city they were connecting from, which phone carrier they were using, and a unique advertising identifier created by the user's device, which companies can use to target advertisements.
A couple of days after this publication, Zoom removed the part of its code that transmitted the data to Facebook, but it did not make the update mandatory, so users running the old version of the application continued to share their data.
Using user-generated content for advertising purposes
Content generated by users during calls could, in theory, be used to train artificial intelligence (AI) algorithms in speech, emotion, face or object recognition. As of now, however, there is no evidence that Zoom uses any such technology; without AI, this data is likely to be nothing more than a vast collection of useless information.
What was violated by Zoom?
The fact that, contrary to what Zoom declares in its marketing materials, Zoom does not use end-to-end encryption may be regarded as unfair competition or a deceptive trade practice. The U.S. Federal Trade Commission has already investigated similar cases.
What conclusions can other companies draw from the Zoom example?
The following fundamental points are applicable to all modern technological companies:
- Develop the product as if millions of people will be using it tomorrow. If you plan to make your product successful, you must be prepared for that success. Your technology, as well as your product and legal documentation, must be unimpeachable even at the development stage, since you can never know when your product will receive increased attention.
- Be honest. Do not embellish reality in marketing materials or statements and, especially, do not claim that you are using data protection technology which you aren’t.
- Be transparent. The more you hide, the worse it will be for you. Your users should be able to understand how their data is being used and how it’s protected.
- Consider privacy and confidentiality. For your product to be successful, users must trust you. If you can’t ensure privacy and data protection, you’ll lose users.
- Privacy by design and by default. This means that you should consider privacy from the very beginning — at the design stage of the product — and collect as little personal data as possible, setting the most privacy-friendly settings by default.
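The "by default" half of that last principle can be shown with a small sketch. The settings object and field names below are hypothetical, invented for illustration; the point is that the out-of-the-box values are the privacy-friendly ones, so users must actively opt in to sharing rather than opt out.

```python
# Hypothetical example of "privacy by default": the zero-argument
# constructor yields the most privacy-friendly configuration.
from dataclasses import dataclass

@dataclass
class MeetingSettings:
    require_password: bool = True    # meetings protected unless relaxed
    waiting_room_enabled: bool = True
    recording_enabled: bool = False  # no recording unless switched on
    share_analytics: bool = False    # no data sharing unless opted in

# A user who never opens the settings screen gets the safe defaults.
settings = MeetingSettings()
assert settings.share_analytics is False
assert settings.require_password is True
```

Designed this way, a user who never touches the settings discloses nothing extra, and any data sharing is the result of a deliberate choice.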
The Zoom example demonstrates that even a progressive and successful company can be unprepared for a surge of user attention. At a critical moment, instead of finally cementing its position as the leader among remote-communication applications, Zoom risks losing a large share of users who care about the privacy of their data.
Take care of privacy and data security and wash your hands.