Amazon Suit Highlights Biometric Data Risks

Peter A. Halprin, Tae Andrews


June 1, 2023


A recent lawsuit filed against Amazon for alleged violations of New York City’s Biometric Identifier Information Law highlights new risks for businesses that use customers’ biometric data in their operations. As similar laws begin to take effect in other jurisdictions, many other companies face potential liability. Fortunately for risk professionals, there are a variety of insurance policies that may provide a defense against such suits and potentially cover any related damages or settlements.

Understanding New York City’s Biometric Law 

New York City’s Biometric Identifier Information Law requires commercial establishments that collect, retain, convert, store or share customers’ biometric identifier information to disclose those activities by placing a clear and conspicuous sign near all entrances and providing notice in plain language. The NYC law defines “biometric identifier information” as physiological or biological characteristics used by, or on behalf of, commercial establishments to identify individuals. These can include retina or iris scans, fingerprints, voiceprints, scans of hand or face geometry and any other identifying characteristics. The NYC law also prohibits commercial establishments from selling, leasing, trading, sharing or otherwise profiting from the exchange of biometric identifier information, and applies to places of entertainment, retail stores, restaurants, bars and other food and drink establishments. 

The NYC law creates a private cause of action, allowing individuals to bring actions on their own behalf against commercial establishments that do not comply. Critically, the NYC law has a “cure” provision that requires an aggrieved person to provide the commercial establishment with written notice setting forth the alleged violations 30 days before filing suit. If the commercial establishment corrects (or “cures”) the violation within those 30 days and provides written confirmation that the violation has been cured and that no further violations will occur, the aggrieved person may not initiate an action.

The NYC law authorizes damages of $500 per violation for failing to provide the required “clear and conspicuous” notice; $500 per violation for selling, leasing, trading, sharing in exchange for anything of value, or otherwise profiting from the transaction of biometric identifier information; and $5,000 for each intentional or reckless violation. The NYC law also allows a prevailing party to recover its reasonable attorney’s fees and costs.

The NYC law joins a wave of statutes enacted by various states and municipalities in recent years to protect consumers’ biometric data. Other states that have passed biometric privacy laws include Arkansas, California, Colorado, Connecticut, Illinois, Iowa, Oregon, Texas, Utah, Virginia and Washington. Indiana will follow suit in 2026. Illinois has been the leader in this space with its Biometric Information Privacy Act (BIPA), which similarly created a private cause of action for violations. This led to a flood of lawsuits against companies that use biometric data in their business operations. 

The Case Against Amazon Go

The first lawsuit filed under the NYC law illustrates the risk to businesses that use biometric data. In Perez v. Amazon.com, Inc., the plaintiff alleges that Amazon violated the NYC law through its use of Just Walk Out technology at Amazon Go stores. At Amazon Go locations, customers can simply walk out of the stores carrying the goods they want to buy, without checking out with a cashier or scanning goods at a register. Amazon charges them later for the items.

The complaint alleges that Amazon violated the NYC law by collecting its customers’ biometric data without providing the required clear and conspicuous notice. Specifically, the lawsuit claims that Amazon scans customers’ palmprints when they enter and also tracks their whereabouts within Amazon Go stores using scans of the sizes and shapes of their bodies to associate each person with the products they touch.

The complaint further alleges that Amazon transmits biometric data outside of its Amazon Go stores to its cloud services, where Amazon converts, analyzes and applies the information to make decisions about which customers have moved where and what items they have removed from or returned to shelves. Additionally, the plaintiff alleges that Amazon retains and stores the biometric information of its customers and, in some instances, shares or sells this data for profit. 

The lawsuit claims that Amazon did not post any signs notifying customers that it was using their biometric data for approximately 14 months after the NYC law went into effect, and that even after Amazon posted signs at one of its Amazon Go stores in mid-March, the signage still failed to meet the “clear and conspicuous” requirement. The complaint also alleges that Amazon collects scans of the body size and shape of every customer to track their movements within Amazon Go stores, whether or not they use the palmprint scanner, even though the company claims that it does not collect such data.

The financial implications of the suit could be significant. The plaintiff seeks to certify a class action composed of tens of thousands of customers. The complaint also alleges that a separate violation of the NYC law occurred each time a customer walked into any of the nine Amazon Go stores in New York City without the required signage. If the plaintiff prevails, Amazon’s damages could become astronomical. 

A recent decision under BIPA reinforces this concern. In that case, the Supreme Court of Illinois ruled that a separate BIPA violation occurred each time White Castle employees’ fingerprints were scanned as they clocked in or out of their shifts. If the class action of approximately 9,500 current and former employees were certified, White Castle estimated that total damages could exceed $17 billion.

Insurance Implications for Businesses Using Biometric Data 

Businesses that use biometric data may already have insurance policies in place to help cover claims and any resulting damages stemming from the NYC law or other biometric data laws. In a 2021 decision, the Supreme Court of Illinois held that commercial general liability (CGL) policies covered a BIPA claim brought by a customer against a tanning salon for allegedly scanning her fingerprints without her informed consent. The court held that the acts in her allegations constituted covered “oral or written publication of material that violates a person’s right of privacy.” By confirming that CGL policies can and do cover biometric privacy claims, the decision has widespread implications since many businesses with brick-and-mortar operations commonly carry CGL insurance.

Other policies may also cover biometric and data lawsuits, including cyber, directors and officers, errors and omissions, employment practices liability, and technology E&O.

As additional biometric data privacy laws begin to take effect in various states, companies that use such data should make it a priority to assess their insurance policies for biometric risk coverage.

Peter A. Halprin is a partner in Pasich LLP's New York office and represents commercial policyholders in complex insurance coverage matters with a focus on recovery strategies in relation to captive insurance, cyber crime, natural disasters, professional services, regulatory investigations and technology disputes.

Tae Andrews is a senior managing associate in Pasich LLP’s New York office.