
Attorney General Letitia James’ latest consumer alert marks the start of a new compliance era in New York, one that blends algorithmic decision-making transparency with data privacy accountability. Here is what businesses and privacy teams need to know:
The Age of Personalized Pricing
New York’s Algorithmic Pricing Disclosure Act officially takes effect today, which means covered businesses need to be in compliance now. The law requires companies that use algorithms to adjust or personalize prices based on consumer data to disclose that practice clearly at the point of sale. It is the first statute of its kind in the nation, and it signals that regulators are treating algorithmic pricing the same way they treat automated decision-making technology (ADMT) in employment, credit, and privacy contexts.
Algorithmic pricing, sometimes called surveillance pricing, allows companies to dynamically change prices based on a consumer’s personal profile — including data points such as location, income level, browsing behavior, and purchase history. This can result in two people paying different prices for the same item depending on what a company’s algorithm predicts they will tolerate.
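To make the mechanics concrete, here is a minimal, hypothetical sketch of that kind of pricing logic in TypeScript. The profile fields, multipliers, and function name are invented for illustration; they are not drawn from any actual vendor’s system.

```typescript
// Hypothetical consumer profile; field names are invented for illustration.
interface ConsumerProfile {
  zipCodeIncomeTier: "low" | "mid" | "high"; // inferred from location data
  recentlyViewedItem: boolean;               // browsing-behavior signal
  loyaltyMember: boolean;                    // purchase-history signal
}

// A toy "willingness to pay" adjustment: the same base price yields
// different final prices depending on the consumer's data profile.
function personalizedPrice(basePrice: number, profile: ConsumerProfile): number {
  let multiplier = 1.0;
  if (profile.zipCodeIncomeTier === "high") multiplier += 0.10; // predicted tolerance
  if (profile.recentlyViewedItem) multiplier += 0.05;           // demonstrated interest
  if (profile.loyaltyMember) multiplier -= 0.05;                // retention discount
  return Math.round(basePrice * multiplier * 100) / 100;
}

// Two shoppers, same item, same moment, different prices:
console.log(personalizedPrice(100, { zipCodeIncomeTier: "high", recentlyViewedItem: true, loyaltyMember: false })); // 115
console.log(personalizedPrice(100, { zipCodeIncomeTier: "low", recentlyViewedItem: false, loyaltyMember: true })); // 95
```

The point is the asymmetry: identical items, different prices, driven entirely by what the system knows about each shopper.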
What the Law Requires
Under the new statute, any business that uses algorithmic pricing must include a prominent disclosure near the listed price that reads:
“THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.”
The disclosure requirement applies to both online and in-store pricing mechanisms where algorithms influence the cost consumers see. Failure to include this notice can result in penalties of up to $1,000 per violation — a serious risk for companies that operate dynamic pricing models across large digital catalogs or mobile apps.
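To illustrate one way the statutory notice could travel with the price through a storefront, here is a hedged sketch. The `buildPriceDisplay` helper and its flag are hypothetical names of our own; the exact prominence and placement requirements should be confirmed against the statute’s text.

```typescript
// The statutory disclosure text, verbatim from the law's requirement.
const ALGORITHMIC_PRICING_NOTICE =
  "THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.";

interface PriceDisplay {
  formattedPrice: string; // e.g. "$115.00"
  disclosure?: string;    // rendered prominently near the listed price
}

// Hypothetical helper: storefront code passes a flag indicating whether
// personal data influenced this price, and the notice rides along with it.
function buildPriceDisplay(priceCents: number, algorithmicallyPersonalized: boolean): PriceDisplay {
  const formattedPrice = `$${(priceCents / 100).toFixed(2)}`;
  return algorithmicallyPersonalized
    ? { formattedPrice, disclosure: ALGORITHMIC_PRICING_NOTICE }
    : { formattedPrice };
}
```

Attaching the notice at the same point where the price is assembled, rather than in a separate template, makes it harder for a personalized price to reach a consumer without its disclosure.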
Attorney General Letitia James’ Enforcement Warning
In a consumer alert issued last week, Attorney General Letitia James urged New Yorkers to report undisclosed algorithmic pricing and emphasized that the state would not hesitate to take enforcement action against violators. “New Yorkers deserve to know whether their personal information is being used to set the prices they pay,” she said. “I will not hesitate to act against those who use personal data to manipulate prices without disclosure.”
Her office described several common examples of algorithmic pricing in action:
- Hotel rooms that cost more when booked from higher-income ZIP codes.
- Retail apps that raise prices when customers browse while physically near a store.
- Dynamic e-commerce discounts that vary by location, loyalty status, or browsing behavior.
These examples highlight the intersection of ADMT and consumer protection — showing how automated systems trained on personal data can quietly influence financial outcomes.
Algorithmic Decision-Making Meets Pricing Compliance
Privacy professionals will recognize the parallels between this new pricing law and emerging global regulations governing automated decision-making. In both contexts, regulators are focusing on transparency and fairness — ensuring consumers are aware when algorithms make determinations that materially affect them.
Like other forms of ADMT, algorithmic pricing involves data-driven systems that make individualized decisions without direct human oversight. Under New York’s disclosure law, the act of determining a price based on a consumer’s profile qualifies as a decision with economic impact, triggering the need for disclosure and accountability.
Does Your Business Engage in Algorithmic Pricing?
Organizations operating in New York or offering goods and services to New York residents must act quickly to assess whether their systems engage in algorithmic pricing — and if so, how to comply with the new disclosure requirements. Key steps include:
- Conduct a pricing algorithm audit: Identify all tools, APIs, and third-party platforms that influence pricing decisions. Document whether personal data is used directly or indirectly to modify prices.
- Map data inputs: Determine what consumer attributes feed into pricing models — such as geography, device type, purchase history, or demographic categories — and assess whether those inputs create disparate impacts or privacy risks.
- Update UI/UX to include disclosure language: Ensure pricing pages, checkout flows, and loyalty app screens display the required notice where algorithmic pricing occurs.
- Establish recordkeeping: Maintain logs of when, how, and where algorithmic pricing is used. This will be critical if regulators request documentation during an inquiry; one possible log-entry structure is sketched after this list.
- Coordinate with privacy and legal teams: Align algorithmic disclosure obligations with existing privacy frameworks under the CCPA, CPRA, and NYDFS cybersecurity regulations.
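As a starting point for the recordkeeping step above, the sketch below shows one possible shape for an audit-log entry. The fields are assumptions chosen to answer the questions a regulator would likely ask (when, where, what data, which model, was the notice shown); they are not a schema prescribed by the law.

```typescript
// Hypothetical audit-log entry for each algorithmically personalized price
// shown to a consumer. Field names are illustrative, not statutory.
interface PricingDisclosureLogEntry {
  timestamp: string;            // ISO 8601, when the price was displayed
  channel: "web" | "mobile_app" | "in_store";
  productId: string;
  displayedPriceCents: number;
  personalDataInputs: string[]; // e.g. ["zip_code", "browsing_history"]
  pricingModelVersion: string;  // ties the price back to a specific model
  disclosureShown: boolean;     // whether the required notice accompanied it
}

// An append-only log keeps a defensible trail for regulator inquiries.
const pricingLog: PricingDisclosureLogEntry[] = [];

function logPricingEvent(entry: PricingDisclosureLogEntry): void {
  pricingLog.push(entry);
}
```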
Consumer Awareness and Transparency
The New York Attorney General’s office also provided tips for consumers to identify whether they might be subject to algorithmic pricing. These include comparing online prices with those shown to other users, checking whether discounts are individualized, and noticing when prices change after actions like signing in, changing locations, or searching for related items elsewhere.
While these tips are designed to empower consumers, they also serve as a roadmap for regulators to investigate noncompliant businesses. Transparency will now be the key compliance indicator: if companies cannot clearly show when and how algorithmic systems influence prices, they risk both regulatory penalties and reputational damage.
Algorithmic Pricing in the Broader ADMT Landscape
New York’s law is part of a growing wave of ADMT regulation. From the EU’s AI Act to California’s draft rules on automated decision-making under the CPRA, governments are beginning to require clear notice, explainability, and consumer rights when algorithms influence pricing, hiring, lending, or access to services. These rules share a common principle: data-driven systems that affect people’s money or opportunities must be transparent and accountable.
The convergence of these trends means companies will need unified strategies for AI governance, data ethics, and privacy compliance. Algorithmic pricing can no longer be treated as a marketing optimization tactic — it is now a regulated practice with material risk exposure.
Practical Compliance Strategies
- Inventory and classify pricing models: Identify whether models use personal data directly or as a proxy (e.g., ZIP code, device ID, or browsing history).
- Embed disclosures dynamically: Automate the inclusion of required disclaimers wherever algorithmic pricing logic applies in apps or websites.
- Perform fairness assessments: Use independent audits or bias-detection tools to test whether pricing algorithms result in discriminatory outcomes; a starting-point sketch follows this list.
- Integrate AI governance with privacy management: Platforms like CaptainCompliance.com can centralize these controls — linking data maps, consent management, and disclosure workflows.
- Prepare a regulator-facing report: Maintain a documented explanation of pricing models, risk mitigations, and consumer-facing notices.
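For the fairness-assessment item, a simple first pass might compare average personalized prices across consumer segments and flag outsized gaps for human review, as in the sketch below. The segment labels and the deviation threshold are illustrative assumptions, not a legal standard for disparate impact.

```typescript
// One observation per displayed price, tagged with the consumer segment
// used for analysis (e.g. a ZIP-code income tier). Illustrative only.
interface PriceObservation {
  segment: string;
  priceCents: number;
}

// Compute the mean price per segment and flag any segment whose average
// deviates from the overall mean by more than `thresholdPct` percent.
function flagPriceDisparities(observations: PriceObservation[], thresholdPct: number): string[] {
  if (observations.length === 0) return [];
  const totals = new Map<string, { sum: number; count: number }>();
  let overallSum = 0;
  for (const obs of observations) {
    const t = totals.get(obs.segment) ?? { sum: 0, count: 0 };
    t.sum += obs.priceCents;
    t.count += 1;
    totals.set(obs.segment, t);
    overallSum += obs.priceCents;
  }
  const overallMean = overallSum / observations.length;
  const flagged: string[] = [];
  for (const [segment, { sum, count }] of totals) {
    const segmentMean = sum / count;
    const deviationPct = (Math.abs(segmentMean - overallMean) / overallMean) * 100;
    if (deviationPct > thresholdPct) flagged.push(segment);
  }
  return flagged;
}
```

A flagged segment is not proof of discrimination; it is a signal that the model’s inputs deserve the closer legal and statistical review the audit step describes.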
What Comes Next
Attorney General James’ office has made clear that enforcement will begin immediately. Early investigations are likely to target retail and e-commerce sectors where algorithmic pricing is most prevalent. Companies that use third-party personalization engines should not assume vendors will handle compliance — the legal burden rests with the entity displaying prices to consumers.
Given the rapid rise of generative AI and predictive modeling tools, this new law serves as a warning: any algorithm that monetizes personal data without transparency will invite scrutiny. Organizations that get ahead of these requirements — by embracing disclosure, fairness, and accountability — will be far better positioned as algorithmic transparency becomes the next frontier in privacy law.
Transparency Is the New Price of Doing Business
New York’s Algorithmic Pricing Disclosure Act redefines the relationship between personalization and privacy. Just as cookie consent banners and privacy notices became ubiquitous in data governance, algorithmic pricing disclosures may soon become the new normal. The message from regulators is unmistakable: when algorithms touch consumers’ wallets, transparency is not optional — it’s mandatory.
For privacy professionals, this is the moment to expand compliance playbooks to include ADMT disclosures, algorithm audits, and fairness reviews. Those who invest early in automation and documentation will not only avoid fines but also build consumer trust in a market where data-driven pricing is increasingly under the microscope.
If you need help with AI governance in New York State and want a leading privacy software solution, please book a demo below with one of our privacy experts.