Business & Finance

Algorithmic And Surveillance Pricing Pushes Retail Into Legal Minefield


The widespread implementation of AI technology promises to transform virtually every retail process, from back-office operations to the front end, and even the way consumers engage in the shopping journey. McKinsey predicts that generative AI alone could deliver between $240 billion and $390 billion in value to retailers. “This, combined with the value of nongenerative AI and analytics, could turn billions of dollars in value into trillions,” it said.

More than its potential to boost revenue, AI is a power multiplier, giving retailers unprecedented ability to personalize consumer interactions to potentially deepen engagement, loyalty and customer delight. But as with any powerful tool, it can be used for good or ill.

Recent actions at the state and federal levels suggest regulators suspect malicious intent, especially regarding surveillance pricing and the privacy concerns it raises. Surveillance pricing, sometimes called personalized pricing, is a practice in which a retailer uses personal data and algorithms to set individualized prices for the same product. In effect, merchandise that the retailer buys at a fixed wholesale cost is sold to customers at different prices based on their data profile.

It’s easy to see the murky legal waters retailers enter with surveillance pricing. A once-level playing field in the prices people pay can be too easily tilted in the retailer’s favor, raising concerns about fairness, discrimination and potential consumer exploitation. It further casts doubt on how retailers collect, use and potentially misuse personal consumer information.

“Surveillance pricing is a minefield and the mines are exploding,” warned Crowell & Moring litigation attorney Joanna Forster, who sees more state attorneys general and consumer protection chiefs leaning into the issue on both fairness and privacy grounds. And as states move forward, she expects the federal government, which has already moved in this direction, to pick up its pace.

California Investigates Fairness And Privacy

On Data Privacy Day in January, California Attorney General Rob Bonta announced a large-scale investigation into how businesses are using consumers’ personal data to set targeted, individualized prices for products and services.

Such surveillance pricing practices may violate the California Consumer Privacy Act, which limits a business’ use of “personal information for purposes that are consistent with the reasonable expectations of consumers.”

On the face of it, using personal data to set individualized prices stretches the definition of what consumers would reasonably expect. “A more pernicious use is a business taking your zip code and using that as a proxy for socioeconomic class or protected class to charge people more who live in Beverly Hills or price gouge people living in depressed socioeconomic neighborhoods,” Forster explained.

For example, she said a Black customer might be charged more for a hair relaxer than a White customer. “It could be discriminatory pricing or price gouging.” When I asked whether retailers are actually using such practices, she declined to go on the record, but the example was hers, not mine.

The California investigation has wide-ranging national implications. The California Department of Justice is sending letters requesting information on how businesses use consumers’ shopping and internet browsing history, location, demographics, and other data to set prices for goods and services. Inquiries are going out to businesses with significant physical and online presence in California’s retail, grocery and hotel sectors.

Since California is recognized as the world’s fifth-largest economy, the attorney general is casting a wide net. “Practices like surveillance pricing may undermine consumer trust, unfairly raise prices and when conducted without proper disclosure or beyond reasonable expectations, may violate California law,” Attorney General Bonta said.

FTC On Price Algorithms And Surveillance Pricing

Following the FTC’s $60 million judgment against Instacart for deceptive business practices this past December, none of which related specifically to AI pricing, Reuters reported that the FTC is now turning its attention to how Instacart uses AI to set different prices for different customers.

Research conducted by Consumer Reports, in association with Groundwork Collaborative and More Perfect Union, found that customers could be charged up to 23% more for the same item ordered from the same store at the same time. The experiments into Instacart’s algorithmic pricing found it in use at some of the nation’s largest grocery stores, including Albertsons, Costco, Kroger, Safeway, Sprouts Farmers Market, and Target.

“Like so many Americans, we are disturbed by what we have read in the press about Instacart’s alleged pricing policies,” FTC spokesperson Joe Simonson said in a statement this past December.

After the Consumer Reports findings erupted, Instacart said that its retail partners had used its AI technology to test different prices, implying that the CR research was conducted during such tests. Subsequently, in a blog post, Instacart announced it had ended all item price tests on the platform.

However, the damage was done. Consumers are becoming increasingly aware of algorithmically powered, surveillance pricing and how it might be used to charge them more.

Before the Instacart AI pricing controversy came to light, the FTC issued a preliminary study on surveillance pricing based on data from eight companies—not retailers, but intermediaries that collect detailed consumer data, including Mastercard, Revionics, Bloomreach and Accenture.

The preliminary report issued in January 2025 found that these intermediaries work with at least 250 clients, ranging from grocery stores to apparel retailers, and they may be using such personalized data to algorithmically set individualized prices.

“Initial staff findings show that retailers frequently use people’s personal information to set targeted, tailored prices for goods and services—from a person’s location and demographics, down to their mouse movements on a webpage,” then-FTC chair Lina Khan said in a statement, as she promised the FTC would continue to investigate surveillance pricing practices.

“Americans deserve to know how their private data is being used to set prices they pay and whether firms are charging different people different prices for the same good or service,” she continued.

While the administration has changed since the FTC first turned its attention to surveillance pricing, the fact that it is scrutinizing Instacart’s AI pricing algorithm suggests the Trump administration isn’t going to let it die.

“I don’t think the FTC under the current administration is going to sit on the sidelines, especially given their concern about prices, pricing transparency and the whole issue of affordability,” attorney Forster said. “While there is no federal privacy law, you’re going to see it come down to a matter of unfair competition or deceptive practices to consumers at the federal level—the FTC’s remit is fair trade, after all.”

She also believes that as investigations expand beyond data intermediaries to retailers that buy and use harvested consumer data, other federal agencies could become involved. “The SEC probably doesn’t want to feel like it’s left behind,” she noted.

New York Algorithmic Pricing Transparency

New York State has also taken aim at algorithmic pricing in a new law signed by Governor Kathy Hochul in May 2025, entitled the Preventing Algorithmic Pricing Discrimination Act. The law requires retailers that use personal data to set prices for individual consumers to prominently display the statement “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA,” next to the price, in all caps.

In July 2025, the National Retail Federation filed a lawsuit against the state’s algorithmic pricing law; U.S. District Judge Jed Rakoff dismissed the suit in October. The crux of NRF’s argument was that the law violated retailers’ free speech rights under the First Amendment.

That argument didn’t hold water. “The judge ruled that First Amendment principles and theories don’t apply because it’s commercial speech and commercial speech can be compelled in certain instances,” attorney Forster explained. “And it’s notable that the New York law extends to all algorithmic pricing, not just surveillance or personalized pricing.”

NRF Makes Retailers’ Case

Beyond the legal nitty-gritty of NRF’s argument, the association’s press release about the lawsuit offers insight into how the industry views algorithmic and surveillance pricing: as a net benefit that saves consumers money.

“Algorithms are created by humans, not computers, and they are an extension of what retailers have done for decades, if not centuries, to use what they know about their customers to serve them better,” NRF chief administrative officer and general counsel Stephanie Martz said in a statement. “It’s just done at the scale of the modern economy. Stigmatizing tools that drive prices down turns offering deals into a liability, and consumers will end up paying more.”

The NRF further stated, “Studies have consistently shown that algorithmic pricing incorporating data on market conditions plays a powerful role in driving prices down because algorithms allow companies to be more responsive to supply and demand and so better optimize pricing to reflect market conditions.”

Be that as it may, surveillance and algorithmic pricing can seem like a technological godsend or something far more diabolical, depending on one’s vantage point. In the current political climate, it’s becoming clear that the regulatory agencies and legal authorities are siding with the consumer angels.
