Facial recognition has quickly become one of the most widely adopted technologies in both the private and public sectors, owing to its ease of use, falling implementation costs, and wide range of applications, from unlocking mobile phones to behavioral monitoring and credit scoring. Retail is one sector that has rapidly adopted the technology in multiple areas, to an extent that has largely gone unnoticed by the public. This paper aims to bring to light the large-scale unethical use of facial recognition technology by the retail sector and highlights the numerous public safety risks it poses. It further offers technological, legal, and ethical solutions to help avert the privacy crisis. There is a need for stricter government regulation of the retail sector’s use of facial recognition technology, and a greater onus on businesses to honor their customers’ trust by giving appropriate disclosures and obtaining consent before recording and retaining sensitive facial data. The paper also recognizes the retail sector’s legitimate need for data collection and provides technological alternatives to accomplish those goals, including HEXWAVE, Traces AI, and shopping cart trackers. Lastly, better protection measures need to be applied to collected facial data, including the use of pixelization, blurring, melding, or scrambling of the stored images to render them less invasive and to reduce the risk of harm in case of data breaches.


Imagine you walk into a store and pick up a shirt you like. You continue to shop but later decide you no longer want the shirt. A minute after you put it back on a rack, you get an email with a discount coupon for that very shirt. Would you still go shopping if you knew you were being tracked with facial recognition technology?

The breakthrough technology of facial recognition enables retailers to track customers with great efficiency. In the US, an estimated 67% of retail stores use or may soon use facial recognition technology. Recently, the pharmacy chain Rite Aid installed facial recognition systems across 200 of its stores. Use of the technology has also rapidly increased in the public sector, where it is widely deployed by governments. As tools have made facial recognition cheaper to implement, several retail chains have started adopting it to track consumers, prevent crime and shoplifting in stores, and monitor employees. Stores including Apple, Albertsons, and Macy’s are using facial recognition technology, with several others planning to adopt it in the future. This brings us to the question: what does the application of this technology mean for our privacy, and is it ethical to collect such information about us? Imagine a store sending promotions based on products you put in your cart or look at but do not buy, or even preventing you from entering the store if you have a criminal record. Worse yet, data breaches could mean personally identifiable data being exposed to the public. It is therefore crucial to understand how the installation of facial recognition technology in retail stores will affect us as consumers, and to look at ways to prevent a potential invasion of privacy. In this white paper, we give a background on facial recognition technology and why it is a topic of concern. We then explain how legislation can aid in increasing consent and limiting data collection. In addition, we explore how innovative technology can increase privacy in this setting.


There are both benefits and risks to any technology, and facial recognition has proven beneficial across industries. Mobile phone makers use it in their products (e.g., Apple uses FaceID as a security gateway for the iPhone), governments use it to monitor people for security purposes (e.g., facial recognition tracking at airports), and retailers use it to monitor shoppers in their stores.

However, with the potential benefits of such a system come a wide variety of risks and ethical implications. Facial recognition technology has raised numerous concerns about lack of regulation, privacy violations, racial bias in algorithms leading to discrimination, insufficient disclosures, and a lack of consent for data collection and sharing. Violation of privacy policies is turning out to be a major problem in the analytics world. As Helen Nissenbaum rightly points out in A Contextual Approach to Privacy Online, companies collecting data often imply consent from people rather than giving them an informed choice to opt in. As a result, we are either handed a jargon-laden privacy policy that a layperson cannot comprehend or not informed at all about the data collected and the applicable privacy policies. With instances of data breaches at top enterprises such as Yahoo (August 2013), LinkedIn (June 2021), and Facebook (April 2019), people have become more aware of the risks of sensitive data being collected and exposed. Companies today therefore face a significant challenge in data collection and storage and must recognize the need for proper consent from customers about the data being collected. As with most technological systems, facial recognition software faces the same essential problem: infringement of privacy without consent. In 2014, as part of Ergys Ristani’s research for the Duke Multi-Target, Multi-Camera (MTMC) project, pictures of about 2,700 individuals were captured near Duke Chapel through eight surveillance cameras. Data from this video footage was posted on a public website and later became available in public databases across China. How ethical was it to collect video footage of individuals without consent and make this data publicly available? Were any of the subjects even informed that such data was being collected in a public space?
These datasets are used as training data to improve existing facial recognition algorithms, often without considering the ethical implications of the underlying breach of privacy. How, then, should such data be collected ethically? Nature surveyed 480 researchers to gain insight into the privacy measures that should be taken before using images generated by facial recognition systems. About 40% of the surveyed researchers believed that informed consent should be obtained from individuals before their images are used in research datasets.

Data collected from facial recognition technology is often used as training data to develop more efficient facial recognition algorithms. However, this data must be collected ethically, by giving individuals the right to opt in and providing them full disclosure about the use of their data. The technology is ground-breaking, and its applicability and advantages are far-reaching, but there is a dire need to establish public trust around its use.


In the retail industry, big brands such as Apple, Macy’s, H-E-B Grocery, and Albertsons consistently use facial recognition technology to track shoplifters and prevent fraud, which helps minimize losses. However, by focusing entirely on the business metrics this technology affects, companies are giving up user data privacy. Just because a person enters a store, their face should not be scanned and stored in a database without consent. Facial recognition software used in retail stores may also exhibit racial bias when identifying shoplifters, which is a severe issue.

The application of facial recognition algorithms in the retail sector extends far beyond detecting shoplifters and preventing fraud. Companies can use it to identify loyal customers and offer them promotions and discounts, track shoppers’ movements in the store to determine preferences, and monitor employees to measure productivity. However, this data is sometimes collected unethically, without consent in the first place, and could later be exposed by hackers and scammers. Profits may surge when machine learning algorithms generate deep insights from such data, but prioritizing profits over privacy is a severe lapse in a firm’s ethical standards.


Addressing data privacy issues within the retail industry, where customers are the most critical stakeholders, is vital and should drive a company’s business strategy. Transparency should be mandatory when collecting data that identifies individual human participants through facial recognition. Transparency about a company’s data collection policies and their use helps build trust with consumers, likely increasing customer loyalty. The video collaboration provider Pexip, for example, collects facial data but has clear, transparent policies: the data is used only internally to improve its product offerings and is never sold or made accessible to third parties.

Improved measures to obtain consent and increase transparency toward customers will also improve compliance with facial recognition laws, where they exist, and help avoid litigation. Numerous laws regulate the handling of sensitive facial recognition data. A notable one is Illinois’s Biometric Information Privacy Act (BIPA), which came into the limelight through Patel v. Facebook, in which Facebook paid $550 million over non-compliance. BIPA requires private companies to seek written consent from consumers before collecting any biometric records and imposes penalties of up to $5,000 per violation. Companies can therefore avoid negative press and heavy losses by treating consent as a necessity in their facial recognition terms of service. Brand equity and image play a large part in the retail industry, which is why complying with privacy regulations and proactively addressing these issues can help a company differentiate itself from its competitors.

In the subsequent sections, we propose multiple solutions to the problem of privacy infringement through facial recognition software, which can help companies set themselves apart as trustworthy brands in the public’s eyes.


One solution is for companies themselves to stop selling facial recognition technology. Microsoft, Facebook, Amazon, and IBM, for instance, have all at least temporarily stopped selling or using their facial recognition software. It is a positive sign that several technology giants have responded to the racial profiling concerns raised by the ACLU. However, this is not a lasting solution: it depends on companies voluntarily deciding to halt sales, and the problems of using facial data are merely delayed, not prevented. Halting sales of facial recognition technology should be part of the solution, not the whole of it.

The next step is for companies to let customers opt in to the use of their facial recognition data. Customers may be willing to allow their photos to be used for specific business purposes or research studies, but they may not presently trust companies’ intentions. Opt-in benefits companies because facial data can provide valuable insights for countless projects, such as retail customers’ shopping preferences or promotion opportunities; seeing customers’ reactions to a specific product or marketing promotion could offer a significant amount of helpful information. The downside is that many customers do not currently trust terms and consent forms because they are highly confusing. Many may fear their personal data, such as facial expressions while eating, could be sold or used for harmful purposes. Lengthy and unclear terms and consent forms only add to this suspicion. Consent forms could be a great way to demonstrate a company’s intentions, but they need to outline clearly how customers’ data will be used.

Another viable solution for improving the use of facial recognition technology is US federal legislation. A pivotal part of such a law would be a requirement to keep data in encrypted form, so customers can feel secure that personal data will not be breached. Information that does not directly identify an individual is said to be “identifiable” information, as opposed to “identified” information tied to a specific person. If comprehensive laws differentiated between identifiable and identified information, companies would have an incentive to keep data secured and encrypted. Federal legislation offers numerous benefits, such as providing legal precedents for companies to follow. For example, legislation requiring that data be kept only in identifiable form may lead companies to factor this in when using or analyzing facial recognition data. Eventually, a set of best practices would emerge for keeping customers’ data from being identified, in retail or any other industry. A potential downside of federal legislation is the risk of a law that is too broad, as with data privacy law in Europe, where the current rules apply the same requirements to both identifiable and identified information. Certain aspects of such a law may be geared toward Big Tech but not retail, which could mean facial recognition laws with no relevance to specific industries. As a result, an organization like Amazon might have to follow different facial recognition rules depending on the company division. The government therefore needs to ensure that any facial recognition laws passed apply consistently across companies and industries.


Apart from solutions focused on consent and federal legislation, technological approaches can help solve the problems highlighted. At the data collection stage, companies can adopt alternative technologies instead of facial recognition. One approach is to track the items a person carries; another is to rely on less identifiable personal features. Liberty Defense’s HEXWAVE system uses dynamic, real-time 3D radar imaging that captures only the items on an individual, regardless of facial or bodily features. Traces AI, a Y Combinator-backed computer vision startup, has built an artificial intelligence platform that can track people without facial recognition: instead of identified facial features, it uses a combination of visual parameters such as hairstyle, whether a person carries a backpack, type of shoes, and clothing, and faces are blurred before footage is sent to the cloud. In light of these options, retail stores can abandon facial recognition in favor of cart/basket tracking or footstep tracking. Shopping carts themselves date back to 1937 and the Piggly Wiggly chain in Oklahoma City; a modern tracked cart carries a small computer that can be programmed to understand shoppers’ buying patterns. It reveals the shopper’s speed, how long a selection takes, the preferred route, and the order in which items are placed in the cart. Recording step-by-step movements gives much deeper insight into marketing effectiveness, product associations, and the intentions, preferences, and habits of thousands of customers while keeping them anonymous. On the other hand, developing technologies that improve the privacy protection of collected facial data and make it less invasive is also a feasible approach.
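To illustrate how cart-level tracking can deliver these insights without identifying anyone, here is a minimal Python sketch (the event fields and function names are hypothetical, chosen for illustration) that derives a shopper’s route, selection order, and trip duration from pseudonymous cart events:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CartEvent:
    cart_id: str                      # pseudonymous per-trip ID, never linked to a person
    t: float                          # seconds since the cart entered the store
    zone: str                         # aisle or department the cart is currently in
    item_added: Optional[str] = None  # item placed in the cart at this moment, if any

def trip_summary(events: List[CartEvent]) -> dict:
    """Derive route, item order, and trip duration from anonymous cart events."""
    events = sorted(events, key=lambda e: e.t)
    route: List[str] = []
    for e in events:
        if not route or route[-1] != e.zone:  # collapse consecutive repeats of a zone
            route.append(e.zone)
    items = [e.item_added for e in events if e.item_added]
    return {
        "route": route,
        "items_in_order": items,
        "trip_seconds": events[-1].t - events[0].t,
    }
```

Because the cart ID is regenerated each trip and never joined to a loyalty account, the resulting patterns stay aggregate and anonymous while still answering the marketing questions described above.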
Four families of such algorithms are commonly used: pixelization, blurring, melding, and scrambling. Pixelization is widely used in television news and documentaries to obscure the faces of suspects, witnesses, or bystanders and preserve their anonymity. Blurring removes detail from the face by applying a Gaussian low-pass filter; more precisely, the image is convolved with a 2D Gaussian function, and blurring is sometimes preferred to pixelization for obscuring privacy-sensitive information. Image melding modifies a person’s face (a target image) by blending in another person’s face (a source image) provided by the user. Scrambling algorithms vary widely, for instance pseudo-randomly flipping the signs of transform coefficients, rearranging the order of transform coefficients in each block of the image, or using machine learning methods. Companies can apply these technologies at the facial data collection and storage stages, keeping the information identifiable rather than identified and thus less intrusive.
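As a concrete illustration, the following NumPy sketch implements the first two approaches on a grayscale image array: block-averaging pixelization and Gaussian blurring via convolution with a 2D Gaussian kernel. The block size and sigma values are illustrative defaults, not recommendations:

```python
import numpy as np

def pixelate(img: np.ndarray, block: int = 8) -> np.ndarray:
    """Pixelization: replace each block x block tile with its mean intensity."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]
            tile[:] = tile.mean()  # in-place: the view writes back into `out`
    return out

def gaussian_blur(img: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Blurring: convolve the image with a normalized 2D Gaussian low-pass filter."""
    radius = int(3 * sigma)  # kernel truncated at 3 standard deviations
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    kernel /= kernel.sum()
    padded = np.pad(img.astype(float), radius, mode="edge")
    h, w = img.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            out[y, x] = (window * kernel).sum()
    return out
```

Both transforms are lossy by design: the averaged tiles and low-pass filtering discard the high-frequency detail that identification algorithms rely on, which is exactly what makes the stored data less invasive.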


The applications and benefits of facial recognition technology are far-reaching in all sectors; however, its use in the retail industry urgently needs further scrutiny. Owing to its accessibility and low cost, numerous major retail players have integrated facial recognition modules into their stores to collect data on consumers, something most consumers are unaware of and have not consented to. In this paper, we have outlined how the retail sector can be more responsible in its use of facial recognition technology and live up to its responsibility toward consumers and society. Facial recognition collects identified data that is extremely sensitive and puts subjects at high risk in case of data breaches. Our policy recommendations encourage the retail sector to reconsider its use of this technology and its data storage methods, and we provide technological alternatives that achieve the same goals with less risk for consumers. We also encourage more discourse and further legislation to regulate facial recognition technology, to keep it from being the retail sector’s go-to option, and to bring uniformity between federal and state laws. We welcome the companies that have disbanded their facial recognition programs, such as Microsoft and Amazon, and encourage others to do the same, or at least to seek consent from unknowing consumers before collecting their sensitive data.


  1. Biometric Information Privacy Act (BIPA). (2021, August 23). Retrieved from
  2. Retail Stores Facial Recognition Civil Rights Organizations Ban. (2021, July 14). Retrieved from
  3. Claburn, T. (2021, May 30). Apple sued in nightmare case involving teen wrongly accused of shoplifting, driver’s permit used by impostor, and unreliable facial-rec tech. Retrieved from
  4. Council, J. (2021, June 16). Lawmakers Re-Introduce Bill That Would Ban Facial-Recognition Technology. Retrieved from
  5. Dufaux, F., & Ebrahimi, T. (2010). A framework for the validation of privacy protection solutions in video surveillance. 2010 IEEE International Conference on Multimedia and Expo, pp. 66–71. doi: 10.1109/ICME.2010.5583552
  6. Fisher, R. (2019, August 16). Traces AI: An Alternative to Facial Recognition. Retrieved from
  7. Future, F. F. (n.d.). Ban Facial Recognition In Stores. Retrieved from
  8. Gershgorn, D. (2021, July 14). Retail stores are packed with unchecked facial recognition, civil rights organizations say. Retrieved from
  9. Feiner, L. (2021, June 14). Rules around facial recognition and policing remain blurry. Retrieved from
  10. Liberty Defense. (2019, December 11). Facial Recognition: Emergence, Threat, and Alternatives. Retrieved from
  11. Pascu, L. (2019, October 18). MegaFace facial recognition dataset origin raises privacy and liability concerns: Biometric Update. Retrieved from
  12. Patel v. Facebook, Inc., No. 18–15982 (9th Cir. 2019). (n.d.). Retrieved from
  13. Prang, A. (2021, November 02). Facebook to Shut Down Facial Recognition in Photos, Videos. Retrieved from
  14. Jiang, R., Bouridane, A., Crookes, D., Celebi, M. E., & Wei, H. (2016). Privacy-Protected Facial Biometric Verification Using Fuzzy Forest Learning. IEEE Transactions on Fuzzy Systems, 24(4), pp. 779–790. doi: 10.1109/TFUZZ.2015.2486803
  15. Reports, S. (2020, July 28). Rite Aid deployed facial recognition system in hundreds of U.S. stores. Retrieved from
  16. Richardson, R. (2019, June 14). Pictures of 2,000 Duke students were available on a public database accessed in China. Retrieved November 16, 2021, from
  17. Shopping Carts Will Track Consumers’ Every Move. (2014, July 23). Retrieved from
  18. Towey, H. (2021, July 19). The retail stores you probably shop at that use facial-recognition technology. Retrieved from
  19. Van Noorden, R. (2020, November 18). The ethical questions that haunt facial-recognition research. Retrieved from
  20. Symanovich, S. (n.d.). What is facial recognition? How facial recognition works. NortonLifeLock. Retrieved from
  21. Nakashima, Y., Koyama, T., Yokoya, N., & Babaguchi, N. (2015). Facial expression preserving privacy protection using image melding. 2015 IEEE International Conference on Multimedia and Expo (ICME), pp. 1–6. doi: 10.1109/ICME.2015.7177394



This Whitepaper was co-authored by Ahmad Muaaz Awan, Agrima Agarwal, Jordan Chessin, Shanay Shah and Yue Pan as part of Duke University, Fuqua School of Business’ course MANAGEMENT 545-Q in November 2021



