If you’ve been considering implementing palm-scanning technology at your company and want to conduct a data protection impact assessment on how it will affect your organization, then after reading the piece below, reach out to the Captain Compliance privacy team for help with the applicable privacy requirements and a free privacy audit, courtesy of our compliance superhero team.
There is something deceptively intuitive about palm-scanning technology. Wave your hand, get access. No card, no PIN, no fumbling for a phone. The gesture is so natural, so frictionless, that it can obscure what is actually happening: the collection, storage, and ongoing processing of one of the most sensitive categories of personal data that exists. Your palm, unlike your password, cannot be reset.
For privacy professionals, palm-scanning technology sits at the intersection of some of the most demanding legal frameworks in operation today — biometric privacy statutes, sensitive data categories under comprehensive state privacy laws, cross-border transfer restrictions, sector-specific regulations, and a rapidly evolving litigation landscape. The technology is spreading fast. The governance infrastructure around it is patchy, jurisdiction-dependent, and full of gaps that organizations are currently discovering in the worst possible way: through class action lawsuits and regulatory enforcement actions.
This article is a practitioner’s guide to the privacy issues that palm-scanning technology raises and the compliance obligations that organizations deploying it — or considering deploying it — need to understand before a scan takes place.
What Palm-Scanning Technology Actually Is
Before the legal analysis, the technical foundation matters. Palm-scanning encompasses several distinct modalities, each with different data profiles and risk characteristics.
Palm vein scanners take an image of the veins inside a person’s hand and compare it to previously collected materials in a database. They have been shown to be more reliable than finger or retina scans. While fingerprints are easily affected by external factors like aging, disease, or the state of the skin, palm vein patterns generally remain the same throughout a person’s life. Furthermore, the palm vein pattern is larger in size than the finger or iris and the scan contains more data, which contributes to its accuracy.
There is also palmprint recognition, which analyzes the surface characteristics of the palm — principal lines, wrinkles, ridges, skin texture — and palm geometry systems, which assess the shape and spatial dimensions of the hand. Each modality creates what is commonly called a biometric template: a numerical representation of the palm’s unique features, derived from the raw scan and used for subsequent matching. This distinction between the raw image and the processed template is legally significant, as we will see.
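The template concept can be illustrated with a toy sketch. This is purely illustrative (real systems use proprietary, far more sophisticated vein-pattern feature extraction): a raw scan is reduced to a numeric feature vector, and matching compares vectors against a similarity threshold rather than comparing images.

```python
# Illustrative sketch only: real palm-vein systems use proprietary,
# far more sophisticated feature extraction and fuzzy matching.

def extract_template(raw_scan: list[float]) -> list[float]:
    """Reduce a raw scan to a normalized feature vector (the 'template')."""
    norm = sum(x * x for x in raw_scan) ** 0.5 or 1.0
    return [x / norm for x in raw_scan]

def match(template_a: list[float], template_b: list[float],
          threshold: float = 0.95) -> bool:
    """Compare two templates via cosine similarity; raw images are never compared."""
    similarity = sum(a * b for a, b in zip(template_a, template_b))
    return similarity >= threshold

enrolled = extract_template([0.9, 1.1, 0.2, 0.4])
live     = extract_template([0.88, 1.12, 0.21, 0.39])  # slightly noisy re-scan
```

The legally significant point the sketch makes concrete: what is stored and matched is the derived vector, not the raw image, which is why statutes distinguish between the two.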
With palm vein biometric payments, consumers wave their hands over a near-infrared sensor. The system creates a unique digital map of each consumer’s palm veins. When it’s time to pay, consumers wave their hands over a sensor at the checkout. The same underlying infrastructure has been deployed across retail payments, venue access control, employee timekeeping, healthcare patient identification, workplace security, and airport boarding — use cases that vary enormously in context, consent conditions, and risk profile, but that share the same fundamental privacy architecture.
Of the 70% of consumers who say they have used biometric authentication, only 8% claim to have used palm scanning. By contrast, 45% said they had used fingerprint scanning and 34% said they had used facial recognition technology. Palm scanning is still in relatively early adoption stages in the consumer context — but that gap is closing rapidly, and the legal frameworks governing it are already in place and being actively enforced.
Why Biometric Data Is Categorically Different
Before any specific legal analysis, it is worth being explicit about why palm-scanning data demands heightened privacy treatment compared to other personal information categories. The answer is not simply that regulators have decided to classify it as sensitive — it is that the underlying properties of biometric data make standard privacy remedies structurally inadequate.
If your credit card number is compromised, you cancel the card. If your email address is exposed, you change the address. If your password leaks, you reset it. But if a database of palm vein templates is hacked, your biometric data may be exposed even if it was encrypted. That risk is shared by every biometric modality, and unlike any other data type, there is no remedy. You cannot get new palms. The irrevocability of biometric compromise is the core reason why the legal frameworks around it are so demanding, and why organizations deploying palm-scanning technology need to treat that data with a fundamentally different level of care than they would apply to conventional personal information.
There is a second reason that is less frequently articulated but equally important: the contextual integrity of biometric data is inherently fragile. Consumers want to know that if they intend for their biometric scan to allow them to make payments, merchants and players in the payment chain will use it only for payment transactions and no other purposes. The moment palm data collected for one purpose — retail payment — is used for another — behavioral tracking, advertising profiling, law enforcement assistance — the fundamental promise of the enrollment is broken. That broken promise is the source of most biometric enforcement actions and litigation to date.
The Legal Landscape: A Patchwork With Real Teeth
Illinois BIPA: The Gold Standard and the Litigation Engine
The Illinois legislature unanimously passed the Biometric Information Privacy Act (BIPA) in 2008. The law ensures that individuals are in control of their own biometric data and prohibits private companies from collecting it unless they inform the person in writing of what data is being collected or stored, inform the person in writing of the specific purpose and length of time for which the data will be collected, stored, and used, and obtain the person’s written consent. Biometric identifiers under the Act include retina or iris scans, fingerprints, voiceprints, and scans of hand or face geometry, a definition that squarely covers palm scans.
BIPA also requires a publicly available retention and destruction policy, including deletion when the original purpose is satisfied or within three years of the last interaction. Critically, BIPA remains the only statute that makes it unlawful for private companies to use biometric technology to identify and track people without their consent. And unlike virtually every other U.S. privacy statute, it provides a private right of action without requiring proof of actual harm.
In Rosenbach v. Six Flags Entertainment Corp. (2019), the Illinois Supreme Court held that a person is “aggrieved” under BIPA and can sue without needing to prove any actual injury. In other words, companies can be liable for mere technical violations — such as failing to get written consent or to post a retention policy — even if no identity theft or other harm occurred.
The financial exposure this creates is significant. A prevailing plaintiff under BIPA may recover the greater of $1,000 for each negligent violation, $5,000 for each intentional or reckless violation, or actual damages, as well as reasonable attorneys’ fees and litigation costs. In 2023, the Illinois Supreme Court in Cothron v. White Castle System, Inc. ruled that Section 15(b) claims accrue and are subject to separate damages for each scan taken without written informed consent. Following that ruling, the Illinois General Assembly amended BIPA in August 2024 to limit recovery to a single violation per individual rather than per scan — significantly limiting potential damages and updating the Act’s definition of “written release” to include an “electronic signature.” Even with this amendment, the exposure for organizations with large workforces or high customer volumes remains material.
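The difference between per-scan and per-individual accrual can be made concrete with a back-of-the-envelope calculation. The workforce figures here are hypothetical; the statutory amounts are BIPA's.

```python
# Hypothetical workforce; the $1,000 figure is BIPA's negligent-violation amount.
employees = 500
scans_per_day = 2          # clock in, clock out
workdays = 250
negligent_per_violation = 1_000

# Pre-amendment (Cothron): a separate violation accrues with each scan.
per_scan_exposure = employees * scans_per_day * workdays * negligent_per_violation

# Post-August-2024 amendment: a single violation per individual.
per_individual_exposure = employees * negligent_per_violation

print(f"Per-scan exposure:       ${per_scan_exposure:,}")        # $250,000,000
print(f"Per-individual exposure: ${per_individual_exposure:,}")  # $500,000
```

Even the amended half-million-dollar floor, before attorneys' fees, shows why the exposure remains material for any sizable deployment.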
For palm-scanning deployments specifically, the BIPA risk profile is acute. Employee timekeeping using biometric scanners has been one of the most litigated contexts under the statute. A striking example was a case against Pret A Manger, where the company’s fingerprint-based timeclock system for employees allegedly failed to meet BIPA’s notice and consent requirements. Pret A Manger settled the case for $677,000 to compensate about 800 workers.
Texas, Washington, and the Multi-State Framework
Texas and Washington both require consent for biometric capture, but neither statute grants individuals a private right of action; only the state Attorney General can enforce. That enforcement asymmetry does not mean the risk is low. Texas secured a $1.4 billion settlement from Meta in 2024 for violations of its Capture or Use of Biometric Identifier Act (CUBI), a figure that demonstrates what AG-level enforcement looks like when it is actually deployed.
Over twenty states with comprehensive privacy laws classify biometric data as “sensitive,” requiring explicit consent. Colorado’s biometric-specific amendments, effective July 2025, are among the strongest. New York’s legislature has repeatedly considered a Biometric Privacy Act modeled on Illinois’ BIPA, and bills are pending in states like Massachusetts and Missouri that would create BIPA-style frameworks. The legislative momentum is unambiguously in the direction of more protection, not less.
GDPR and International Frameworks
Outside the United States, palm scan data is classified as biometric data processed for the purpose of uniquely identifying a natural person — a special category under Article 9 of the GDPR, which applies the most restrictive processing conditions in the regulation. Processing special category data requires an explicit legal basis beyond the standard Article 6 bases, with explicit consent being the most commonly relied upon basis for commercial palm-scanning deployments in Europe.
The practical implication is that a palm-scanning deployment that meets U.S. state law standards — written consent before collection, published retention policy, secured storage — may still fall short of GDPR requirements if: the consent was not specific and granular enough; the data processing agreement with the biometric vendor does not meet Article 28 requirements; the data is transferred to a cloud provider outside the EEA without an appropriate transfer mechanism; or the Data Protection Impact Assessment (DPIA) required under Article 35 for systematic biometric processing was not completed before deployment.
The Real-World Litigation: Amazon and the Cloud Storage Problem
The Amazon One palm-scanning litigation offers a detailed case study in where palm-scanning programs go wrong from a privacy compliance perspective. Privacy concerns have surrounded Amazon One since its launch. In 2021, a group of U.S. senators urged Amazon to disclose further details about its intentions concerning customer biometrics, expressing concerns about whether biometric data was being used for advertising and tracking purposes.
In contrast with biometric systems like Apple’s Face ID and Touch ID or Samsung Pass, which store biometric information on a user’s device, Amazon One reportedly uploads biometric information to the cloud, raising unique security risks. This architectural decision — centralized cloud storage rather than on-device processing — is one of the most consequential privacy design choices in any biometric system, because it concentrates risk in a single location and creates a high-value target for malicious actors.
A class action lawsuit alleged that Amazon violated NYC’s Biometric Identifier Information Law by collecting customers’ biometric data without providing the required clear and conspicuous notice. The lawsuit claimed that Amazon scans customers’ palmprints when they enter and also tracks their whereabouts within Amazon Go stores using scans of the sizes and shapes of their bodies to associate each person with the products they touch. The complaint further alleges that Amazon transmits biometric data outside of its Amazon Go stores to its cloud services, where Amazon converts, analyzes, and applies the information to make decisions about which customers have moved where and what items they have removed from or returned to shelves.
The lawsuit additionally alleged that the signage posted by Amazon “woefully fails to comply with the disclosure mandate” and that “the sign informs customers that Amazon will not collect biometric identifier information on them unless they use the Amazon One palm scanner to enter the Amazon Go store, even though Amazon Go stores do collect biometric identifier information on every single customer, including information on the size and shape of every customer’s body.”
Ultimately, Amazon ended its Amazon One palm ID system for retail as it closed its physical stores, with the technology having raised privacy and security concerns throughout its operational lifetime. The cautionary arc of Amazon One — from innovation to controversy to discontinuation — is instructive for any organization currently deploying or evaluating palm-scanning technology.
The Key Privacy Issues Organizations Must Address
1. Consent: Genuinely Informed, Not Technically Obtained
The most common failure mode in biometric compliance is treating consent as a box to check rather than a meaningful communication. For palm-scanning systems, consent must be:
- Granular and specific: Consumers must understand not just that a biometric system exists, but what specifically is being captured (palm image, vein pattern, or geometric template), how it is processed into a biometric template, where that template is stored, how long it is retained, and who has access to it.
- Prior to any collection: Before collecting any biometric data, entities must inform individuals in writing about the purpose and duration of data collection, storage, and usage. They must also obtain written consent from the individual. The sequence is non-negotiable — consent cannot be obtained retroactively.
- Free of coercion: In an employee context, this is particularly challenging. Where palm scanning is deployed as the sole method of timekeeping or access control, employees effectively have no meaningful choice but to enroll. Regulators and courts have increasingly scrutinized whether consent obtained in contexts of power imbalance is genuinely voluntary, and employment relationships create exactly that dynamic.
- Renewable and revocable: Consent obtained at enrollment does not last indefinitely. Organizations should build mechanisms for consent renewal and — critically — for consent withdrawal. A consumer or employee who withdraws consent must be able to do so without losing access to services or employment as a consequence.
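Operationally, the requirements above imply that consent must be modeled as a dated, purpose-bound, revocable record rather than a one-time boolean flag. A minimal sketch, with hypothetical field names and an assumed annual renewal cycle:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical consent record: dated, purpose-bound, and revocable."""
    subject_id: str
    purposes: tuple[str, ...]            # granular: what the scan may be used for
    granted_at: datetime
    renewal_period: timedelta = timedelta(days=365)  # assumed renewal cycle
    withdrawn_at: Optional[datetime] = None

    def withdraw(self, when: datetime) -> None:
        self.withdrawn_at = when

    def is_valid(self, purpose: str, now: datetime) -> bool:
        """Valid only if unwithdrawn, unexpired, and the purpose is in scope."""
        if self.withdrawn_at is not None and now >= self.withdrawn_at:
            return False
        if now > self.granted_at + self.renewal_period:
            return False  # stale consent: renewal required before further use
        return purpose in self.purposes

consent = ConsentRecord("emp-001", ("timekeeping",), datetime(2025, 1, 2))
```

The design choice worth noting: withdrawal sets a timestamp rather than deleting the record, so the organization retains proof of when consent existed and when it ended.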
2. Purpose Limitation and Scope Creep
The consumer expectation is clear: a palm scan enrolled for payments should be used by merchants and players in the payment chain only for payment transactions and no other purpose. The organizational temptation to expand the use of high-quality biometric data to other ends — behavioral analytics, attendance monitoring, security investigations — is equally clear. These two things are in fundamental tension, and the legal frameworks resolve that tension firmly in the consumer’s favor.
Secondary use of palm-scanning data without additional, specific consent is a BIPA violation, a GDPR Article 9 breach, and potentially a violation of multiple state comprehensive privacy laws’ provisions governing sensitive data. Organizations must implement technical and organizational controls — access restrictions, audit logging, data use registers — that make purpose limitation enforceable rather than merely aspirational.
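One way to make purpose limitation enforceable rather than aspirational is to gate every read of biometric data through a check against the enrolled purposes, writing an audit entry whether or not access is granted. A minimal sketch, with hypothetical names and in-memory registers standing in for real systems:

```python
from datetime import datetime, timezone

# Hypothetical registers: enrolled purposes per subject, plus an append-only audit log.
ENROLLED_PURPOSES = {"cust-42": {"payment"}}
AUDIT_LOG: list[dict] = []

class PurposeViolation(Exception):
    pass

def access_template(subject_id: str, purpose: str, actor: str) -> str:
    """Release a template reference only for an enrolled purpose; log every attempt."""
    allowed = purpose in ENROLLED_PURPOSES.get(subject_id, set())
    AUDIT_LOG.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "subject": subject_id, "purpose": purpose,
        "actor": actor, "allowed": allowed,
    })
    if not allowed:
        raise PurposeViolation(f"{purpose!r} not consented by {subject_id}")
    return f"template-ref:{subject_id}"  # stand-in for the real template handle
```

Because denied attempts are logged too, the audit trail doubles as evidence of scope-creep pressure inside the organization.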
3. Data Architecture and the On-Device vs. Cloud Storage Question
The privacy risk profile of a palm-scanning system is determined at the architectural level before a single palm is scanned. The key design decisions are:
- On-device vs. centralized storage: Systems that process biometric data on-device or at the edge and never transmit raw biometric templates to a centralized server have a fundamentally smaller attack surface than cloud-based systems. A breached central template database can expose biometric data even when it is encrypted; minimizing what is stored centrally minimizes the catastrophic breach scenario.
- Raw image vs. mathematical template: Organizations should confirm that their palm-scanning vendor stores mathematical templates rather than raw images. Templates that cannot be reverse-engineered into usable biometric data offer meaningfully stronger privacy protection than retained images.
- Encryption and key management: Biometric templates must be encrypted at rest and in transit, with key management practices that ensure decryption requires explicit authorization. Access controls should be audited regularly.
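The on-device design choice can be expressed as an interface contract: the device object holds the template privately, exposes only a yes/no verification call, and the server-side code never receives biometric data at all. A sketch under those assumptions (the equality match is a toy; real matchers use fuzzy similarity scoring):

```python
import hmac

class OnDeviceMatcher:
    """The template never leaves this object; callers see only a boolean."""
    def __init__(self, enrolled_template: bytes):
        self._template = enrolled_template  # private: no accessor is exposed

    def verify(self, live_template: bytes) -> bool:
        # Toy exact match via constant-time comparison; real systems
        # score similarity between noisy templates instead.
        return hmac.compare_digest(live_template, self._template)

def server_side_checkout(matcher: OnDeviceMatcher, live: bytes) -> str:
    # The server receives a decision, not biometric data.
    return "authorized" if matcher.verify(live) else "denied"
```

Contrast this with the Amazon One architecture described above, where templates themselves travel to the cloud and the breach surface grows accordingly.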
4. Retention and Deletion
BIPA requires entities to develop and make publicly available a written policy that outlines their retention schedule and guidelines for permanently destroying biometric data when the original purpose for collecting it has been satisfied or within three years of the individual’s last interaction with the entity, whichever comes first. This retention ceiling applies regardless of whether the organization considers the data still useful.
Deletion must be genuine — not an archival or pseudonymization that preserves the template in a different form — and must extend to backup systems, vendor copies, and any downstream systems that have received the biometric data. Organizations that have deployed palm scanning without a documented, operationally implemented retention and deletion schedule are in violation of BIPA and similar frameworks regardless of what their consent documentation says.
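BIPA's retention ceiling reduces to a simple rule: delete at the earlier of the purpose-satisfied date and three years after the last interaction. A sketch of that computation, approximating three years as 3 × 365 days:

```python
from datetime import date, timedelta
from typing import Optional

THREE_YEARS = timedelta(days=3 * 365)  # approximation of BIPA's three-year ceiling

def deletion_deadline(last_interaction: date,
                      purpose_satisfied: Optional[date] = None) -> date:
    """Earlier of the purpose-satisfied date and last interaction + 3 years."""
    ceiling = last_interaction + THREE_YEARS
    if purpose_satisfied is not None:
        return min(purpose_satisfied, ceiling)
    return ceiling

# Employee left (purpose satisfied) well before the three-year ceiling:
d1 = deletion_deadline(date(2024, 6, 1), purpose_satisfied=date(2024, 7, 1))
# No purpose-end event yet: the three-year ceiling governs.
d2 = deletion_deadline(date(2024, 6, 1))
```

The hard part is not the date arithmetic but wiring the resulting deadline into every system that holds a copy, which is why the deletion obligation must be technically enforced rather than left as policy text.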
5. Vendor Due Diligence and Data Processing Agreements
The majority of palm-scanning deployments rely on third-party technology providers: the scanner hardware, the template-processing software, and the cloud storage infrastructure are rarely all built in-house. Each of those vendor relationships creates a data processing arrangement that must be legally structured before any biometric data flows.
Under BIPA, the prohibition on disclosing biometric data to third parties without consent applies to vendor relationships. A private entity in possession of biometric information must not disclose or disseminate a person’s biometric information unless that person consents to the disclosure. This means that unless the enrollment consent document explicitly identifies the technology vendors who will receive and process the palm data, sharing it with those vendors may itself constitute a BIPA violation.
Under GDPR, vendor relationships must be governed by Data Processing Agreements meeting the requirements of Article 28, including audit rights, sub-processor restrictions, deletion obligations, and security standards. For biometric vendors operating outside the EEA, the cross-border transfer mechanism must be in place before data flows.
6. Notice and Signage Requirements
New York City implemented a biometric identifier law in 2021 requiring that any business that collects biometric information from customers post a conspicuous notice at the entrance. Although liveness detection and other protocols have reduced concerns about the privacy and security of palm technology, consumers should be aware that the database where their biometric data is stored can still be hacked. It is also unclear whether retailers will improperly use the biometric data they collect for advertising and tracking purposes.
The signage requirement exists because biometric collection — particularly in public-facing retail or venue contexts — can occur passively, without consumers making a deliberate enrollment decision. Where any biometric data is being collected from individuals who have not specifically enrolled (body size and shape tracking in Amazon Go is the clearest example), the disclosure obligation extends to that passive collection, not just the voluntary enrollment.
7. The Employee Context Demands Separate Analysis
The privacy issues in an employee palm-scanning deployment — for timekeeping, workplace access, or healthcare worker authentication — differ from the consumer context in important respects. The power differential in the employment relationship makes voluntariness of consent structurally suspect. In Illinois, the McDonald v. Symphony Bronzeville Park ruling confirmed that employees can bring BIPA claims regardless of workers’ compensation exclusivity, meaning workplace biometric disputes do not stay out of court simply because they arise in an employment context.
Organizations considering employee palm-scanning deployments should: conduct a biometric impact assessment before implementation; consult with employee representatives or unions where applicable — the Walton v. Roosevelt University ruling created a labor management preemption defense in unionized workplaces, but achieving it requires biometric data to have been subject to collective bargaining; develop and distribute a standalone biometric data policy that meets all applicable state law requirements; and build a genuine opt-out mechanism with alternative means of accomplishing the same function.
What Good Palm-Scanning Compliance Looks Like
For privacy professionals advising organizations on palm-scanning deployments, the compliance architecture needs to be built before the devices are installed, not retrofitted after enrollment has begun. The non-negotiable elements are:
- A pre-deployment Data Protection Impact Assessment or equivalent risk assessment
- A biometric data policy meeting the most stringent applicable state law standard — which, for most deployments, means BIPA — made publicly available before any collection
- Enrollment consent that is specific, informed, prior, and documented — obtained through a process that is separate from any other agreement
- Explicit vendor contracts governing biometric data processing, storage, access, and deletion
- A retention schedule with technically enforced deletion — not just a policy commitment
- Physical signage at all collection points meeting local disclosure requirements
- Technical controls enforcing purpose limitation, including audit trails for biometric data access
- A documented deletion and data subject rights fulfillment process
Palm-scanning technology is not going away. While Amazon has deemphasized the technology, banks and payment firms are testing the biometric option, with JPMorganChase testing it in a company cafeteria and its payments business supporting palm-vein enrollment. The use cases will continue to expand in healthcare, financial services, access control, and consumer retail. The legal frameworks governing it — BIPA, CUBI, GDPR, and the growing roster of state comprehensive privacy laws treating biometric data as sensitive — are already in force and being actively enforced.
The organizations that will navigate this landscape successfully are not the ones that deploy palm-scanning technology and then hire lawyers when the litigation arrives. They are the ones that treat the palm as what it is: an irrevocable piece of a person’s identity that deserves governance architecture commensurate with that permanence.