Are you paying more than other people? NY cracks down on surveillance pricing

When you search for a product online, you might assume you're seeing the same price as everyone else. Think again. Your price can differ based on everything from your location to what you've looked at online. Companies often set their prices with algorithms that rely heavily on customers' personal data. Now, the state of New York is forcing companies to come clean when they set prices this way.

Anyone using algorithms to adjust pricing for people in the state must now reveal when they're doing it, thanks to the Algorithmic Pricing Disclosure Act, legislation that the state began enforcing this week.

Algorithmic pricing is also known as "surveillance pricing" because it relies on a person's personal data to offer them promotional pricing (or potentially higher prices, if the vendor thinks they'll pay).

How software algorithms affect the prices you see

The Federal Trade Commission (FTC) warned about this practice in a report released in January this year. It had ordered eight companies (Mastercard, Revionics, Bloomreach, JPMorgan Chase, Task Software, PROS, Accenture, and McKinsey) to disclose the services they offer that use algorithms and consumer data to set or recommend individualized prices, as well as the data inputs, customer lists, and potential impact on consumer pricing. From the report:

“A tool could be used to collect real-time information about a person’s browsing and transaction history and enable a company to offer—or not offer—promotions based on that consumer’s perceived affinity.”

This data could include where they are, who they are, what they’re doing, and what they’ve done in the past. The report suggests that companies could use a wide variety of customer data to achieve these goals, including everything from their geolocation to what they’ve looked at on a particular website.

For example, a consumer lingering over a particular item with their mouse, or watching a certain percentage of a video on a website, might signal to companies that they have a particular interest.

The same data could be used to create “buckets” of customers with similar profiles (called “segments” in marketing) that companies could use to target people with different pricing.
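To make the mechanism concrete, here is a purely illustrative sketch of segment-based pricing. Nothing here reflects any company's actual system; the segment names, signals, and multipliers are all hypothetical, invented only to show how behavioral signals could bucket customers and map each bucket to a different price.

```python
# Illustrative only: hypothetical segments, signals, and multipliers.
BASE_PRICE = 100.00

SEGMENT_MULTIPLIERS = {
    "bargain_hunter": 0.90,  # shown a promotional discount
    "average": 1.00,
    "high_intent": 1.10,     # e.g. hovered over the item or watched most of a video
}

def assign_segment(profile: dict) -> str:
    """Bucket a customer using simple behavioral signals from their profile."""
    if profile.get("video_watch_pct", 0) > 0.75 or profile.get("hovered_item"):
        return "high_intent"
    if profile.get("used_coupon_recently"):
        return "bargain_hunter"
    return "average"

def personalized_price(profile: dict) -> float:
    """Apply the segment's multiplier to the base price."""
    return round(BASE_PRICE * SEGMENT_MULTIPLIERS[assign_segment(profile)], 2)

print(personalized_price({"hovered_item": True}))          # high_intent bucket
print(personalized_price({"used_coupon_recently": True}))  # bargain_hunter bucket
```

Two visitors viewing the same product can receive different prices from logic like this, which is exactly the opacity the disclosure law targets.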

The FTC report had to rely on hypothetical examples, following pushback from the companies involved, but what it revealed was enlightening. A company might raise the price of baby formula offered to a parent who is searching for fast delivery, it suggested.

In another imagined case, a person visiting a car dealership and using an in-store kiosk to explore vehicles might be segmented as a first-time car buyer, the report said. The store might decide that they’re inexperienced about the financing options available, affecting the rates that they’re offered.

The FTC had issued a Request for Information (RFI) alongside the report, asking people to share their own experiences of surveillance pricing. The public comment period was supposed to run until April 17, but the new FTC chair under the Trump administration, Andrew Ferguson, closed the RFI less than a week after the previous chair, Lina Khan, issued it.

Last week the state’s Attorney General, Letitia James, effectively re-opened it—at least for New York residents. She issued a consumer alert urging residents to help enforce the law, which threatens a $1,000 penalty each time a company violates it. The alert encouraged people to report companies they believe are using algorithms to determine pricing.

Under the New York law, businesses must display the exact text:

“This price was set by an algorithm using your personal data.”

They must display the text near the price shown, and they can't set prices using "protected class data" (categories legally shielded from discrimination, including ethnicity, national origin, disability, age, sex, sexual orientation, and gender identity). There are exceptions: insurance companies and other financial institutions covered by the Gramm-Leach-Bliley Act are exempt.
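As a rough sketch of what compliance might look like, a seller's storefront could attach the exact statutory sentence to any algorithmically personalized price. The function name and the HTML markup below are hypothetical, not drawn from the law or any real site; only the quoted disclosure string comes from the statute as described above.

```python
# Hypothetical compliance sketch: render the exact disclosure text
# next to any price that was personalized by an algorithm.
DISCLOSURE = "This price was set by an algorithm using your personal data."

def render_price(amount: float, personalized: bool) -> str:
    """Return HTML for a price, appending the disclosure when required."""
    tag = f"<span class='price'>${amount:.2f}</span>"
    if personalized:
        tag += f" <span class='disclosure'>{DISCLOSURE}</span>"
    return tag

print(render_price(19.99, personalized=True))
print(render_price(19.99, personalized=False))
```

The key point the law makes is placement: the notice must appear near the price itself, not buried in a privacy policy.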

A long history of algorithmic pricing

Algorithmic pricing has been happening for years. For example, in 2013 Staples was found to be adjusting prices for different people according to their distance from a rival's store. The retailer reportedly charged higher prices to households with lower incomes, although whether this was intentional or just an unintended by-product of the algorithm isn't clear. That's the problem, though: algorithms can easily produce unexpected results.

More recently, reporters have found people charged more for hotel rooms based on their IP addresses, while one report found Target charging more for goods viewed on its app when the shopper was inside a Target store than when they were outside.

This pushback against surveillance pricing is spreading. California's AB 325 amends the state's Cartwright Act antitrust law to bar businesses from sharing pricing algorithms that rely on competitor data. Governor Gavin Newsom signed it into law last month, and it takes effect on January 1, 2026. He also signed SB 763, which increases civil and criminal penalties for violations of the Cartwright Act.


We don’t just report on data privacy—we help you remove your personal information

Cybersecurity risks should never spread beyond a headline. With Malwarebytes Personal Data Remover, you can scan to find out which sites are exposing your personal information, and then delete that sensitive data from the internet.
