One-Way Mirror: AI Pricing and the Fight for Consumer Transparency

The price you saw was not the price everyone saw. You just did not know it yet.
In February 2024, Wendy's CEO Kirk Tanner told investors that the fast-food chain would invest $20 million in digital menu boards to support “dynamic pricing and day-part offerings.” The reaction was immediate, visceral, and devastating. Consumers heard “surge pricing” and revolted. Social media erupted. Burger King capitalised on the moment by offering free Whoppers, its email subject line reading: “Surge Pricing? Not at Burger King!” Within days, Wendy's Vice President Heidi Schauer was forced to clarify to NPR that the company would not raise prices during peak hours, insisting the plan was merely about discounts during slower periods. The damage, however, was already done. Wendy's had accidentally revealed something the technology industry had been quietly building for years: an infrastructure designed to charge different people different prices for the same thing, calibrated by algorithms that know more about you than you might suspect.
That infrastructure is no longer theoretical. It is operational, expanding, and largely invisible to the consumers it targets. Across e-commerce, travel, entertainment, housing, and soon your local supermarket, artificial intelligence systems are ingesting vast quantities of personal data to estimate individual willingness to pay and adjust prices accordingly. The question confronting regulators, consumers, and the technology companies themselves is whether this represents a natural evolution of market efficiency or a fundamental breakdown in the social contract that underpins fair commerce.
How the Pricing Machine Learns What You Will Pay
To understand why AI-driven pricing has become such a flashpoint, you need to understand what these systems actually do. Traditional dynamic pricing is nothing new. Airlines have adjusted fares based on demand since the 1980s. Hotels shift rates around holidays and conferences. Uber's surge pricing algorithm, which multiplies fares during periods of high demand, has been the subject of academic study for over a decade. A 2016 National Bureau of Economic Research paper estimated that UberX generated approximately $6.8 billion in consumer surplus across the United States in 2015, suggesting that for every dollar spent by consumers, roughly $1.60 in surplus was generated.
A natural experiment on New Year's Eve illustrated the point. When a technical glitch disabled Uber's surge pricing across the whole of New York City for 26 minutes, the platform's average wait time spiked from 2.6 minutes to 8 minutes, and the share of unfulfilled trip requests rose sharply. The algorithm, whatever consumers thought of it, was performing a genuine market function. But even Uber's model, which adjusts prices based on aggregate supply and demand rather than individual consumer profiles, has drawn regulatory backlash. Cities including Honolulu, Manila, New Delhi, and Singapore have banned or capped surge pricing. Research by Juan Camilo Castillo at the University of Pennsylvania, using Uber data from Houston in 2017, found that while surge pricing generally improved market outcomes, its effects were unevenly distributed, with price-sensitive riders bearing a disproportionate burden during peak periods.
What is happening now goes far beyond adjusting prices to reflect real-time supply and demand. The new generation of AI pricing tools analyses individual consumer behaviour, browsing history, purchase patterns, location data, device type, credit history, and demographic information to estimate what each specific person is willing to pay. Amazon reportedly adjusts product prices around 2.5 million times every day, updating 50 times more frequently on average than Walmart. The company considers both “global values” such as demand volume and stock levels, and “user values” including product visit frequency and time of purchase. Research indicates that loyal, returning customers may face higher prices than newcomers, as the dynamic pricing engine calculates each customer's loyalty level and sets prices accordingly.
The algorithmic approaches powering these systems are sophisticated and continually evolving. Reinforcement learning models analyse customer demand while accounting for seasonality, competitor pricing, and market uncertainty to arrive at revenue-optimal prices. Bayesian models incorporate historical pricing data and shift their estimates with every new data point. Behavioural pricing systems analyse individual customer actions in real time to offer personalised discounts or price adjustments based on predicted likelihood of purchase. A Valcon study found that while 61 per cent of European retailers have embraced some form of dynamic pricing, fewer than 15 per cent currently use algorithmic or AI-based strategies. That number is set to change rapidly: 55 per cent of European retailers are actively planning to pilot dynamic pricing with generative AI in 2026.
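The Bayesian approach described above can be sketched in a few lines. The toy model below, with every price and conversion rate invented for illustration, uses Thompson sampling: each candidate price carries a Beta posterior over the probability that a shopper buys at that price, every observed outcome shifts the estimate, and the seller quotes whichever price has the highest sampled expected revenue.

```python
import random

random.seed(0)

class BayesianPricer:
    """Toy Bayesian dynamic pricer using Thompson sampling."""

    def __init__(self, candidate_prices):
        self.prices = list(candidate_prices)
        # Beta(1, 1) is a uniform prior on each price's conversion rate.
        self.posteriors = {p: [1.0, 1.0] for p in self.prices}

    def quote(self):
        # Sample a plausible conversion rate for each price from its
        # posterior, then quote the price with the highest sampled
        # expected revenue (price times conversion rate).
        def sampled_revenue(p):
            alpha, beta = self.posteriors[p]
            return p * random.betavariate(alpha, beta)
        return max(self.prices, key=sampled_revenue)

    def observe(self, price, bought):
        # Each new data point shifts the posterior for the quoted price.
        alpha, beta = self.posteriors[price]
        self.posteriors[price] = [alpha + bought, beta + (1 - bought)]

# Hypothetical true conversion rates; expected revenue peaks at 8.00.
TRUE_CONVERSION = {8.0: 0.9, 10.0: 0.6, 12.0: 0.3}

pricer = BayesianPricer(TRUE_CONVERSION)
quote_counts = {p: 0 for p in TRUE_CONVERSION}
for _ in range(2000):
    price = pricer.quote()
    quote_counts[price] += 1
    bought = 1 if random.random() < TRUE_CONVERSION[price] else 0
    pricer.observe(price, bought)
```

After a couple of thousand simulated shoppers, the quotes should concentrate on the revenue-optimal price; production systems layer the same update-and-exploit loop with far richer demand models and far more data.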
The business case is compelling. Reports indicate that AI-driven dynamic pricing can increase average order value by up to 13 per cent during peak sales periods, cut overstock by 6 per cent in a single quarter, and boost profit margins by as much as 25 per cent. For companies operating on thin margins in competitive markets, these are not marginal improvements. They are transformative. And the practice is spreading beyond the expected players. Researchers at the University of New South Wales have warned that personalised pricing could soon reach supermarkets, noting that consumers have no way of knowing whether the price they see for bread or bananas on a retailer's website is the same price that another consumer sees.
When Landlords Let the Algorithm Decide
The most striking demonstration of what happens when algorithmic pricing goes wrong did not occur in an online shop or a ride-hailing app. It happened in the American rental housing market, where millions of tenants discovered that their rent increases were being orchestrated by a single piece of software.
In August 2024, the United States Department of Justice, alongside the Attorneys General of eight states including California, North Carolina, and Colorado, filed a civil antitrust lawsuit against RealPage Inc. The complaint alleged that RealPage contracted with competing landlords who agreed to share nonpublic, competitively sensitive information about their apartment rental rates to train and run RealPage's algorithmic pricing software. The software then generated pricing recommendations for participating landlords based on their competitors' data. Prosecutors stated that one landlord reported starting to increase rents within a week of adopting the software and, within eleven months, had raised them by more than 25 per cent.
In January 2025, the DOJ expanded the case, adding six major multifamily property owners as co-defendants, including Greystar. Nine states subsequently reached a $7 million settlement with Greystar in November 2025. By that same month, the DOJ had reached a proposed settlement with RealPage itself. The company did not admit liability but agreed to stop using competitors' nonpublic data in its revenue management product, to restrict model training to historic data at least twelve months old, to redesign its software to remove mechanisms that prop up prices or encourage competitors toward common pricing ranges, and to accept a court-appointed monitor with broad access to review its code and model training documentation. The settlement terms are operative for seven years.
The RealPage case matters far beyond the housing sector because it established a legal framework for how algorithmic pricing tools can cross the line from legitimate optimisation into anticompetitive behaviour. When an algorithm aggregates private data from competitors and uses it to coordinate pricing upward, it functions as a mechanism for tacit collusion, regardless of whether any human explicitly agreed to fix prices. The DOJ's Antitrust Division head has promised an increase in probes of algorithmic pricing, and in March 2025, the agency filed a statement of interest regarding “the application of the antitrust laws to claims alleging algorithmic collusion and information exchange.”
Surveillance Pricing and the FTC's Unfinished Investigation
In July 2024, the Federal Trade Commission under Chair Lina Khan launched what it called a surveillance pricing inquiry, using its 6(b) authority to issue orders to eight companies: Mastercard, Revionics, Bloomreach, JPMorgan Chase, Task Software, PROS, Accenture, and McKinsey. The Commission voted 5-0 to issue the orders. Khan stated that “firms that harvest Americans' personal data can put people's privacy at risk. Now firms could be exploiting this vast trove of personal information to charge people higher prices.”
Speaking at the Fast Company Innovation Festival in September 2024, Khan elaborated: “Given just how much intimate and personal information that digital companies are collecting on us, there's increasingly the possibility of each of us being charged a different price based on what firms know about us.” She noted that while economists had long studied price personalisation, it was previously more of a “thought experiment,” but advances in data extraction and targeting had made it “much more possible to be serving every individual person an individual price based on everything they know about you.”
The preliminary findings, published in January 2025, revealed that instead of a price or promotion being a static feature of a product, the same product could have a different price or promotion based on consumer-related data, behaviours, preferences, location, time, and purchase channel. Some companies could determine individualised pricing based on granular consumer data, with the study citing examples such as a cosmetics company targeting promotions based on specific skin types and tones. The FTC found that at least 250 businesses, including grocery stores, apparel retailers, health and beauty retailers, and hardware stores, had adopted surveillance pricing strategies.
Then the investigation stalled. FTC Chair Andrew Ferguson, who replaced Khan, cancelled the public comment period, effectively ending the study. With new federal leadership signalling that continuing the investigation was not a priority, the unfinished inquiry left a regulatory vacuum.
That vacuum did not last long. In December 2025, Senator Mark R. Warner led Senators Gallego, Blumenthal, and Hawley in a bipartisan push urging the Trump administration to crack down on surveillance pricing, which the senators described as a practice that “eliminates a fixed or static price in favour of prices specially tailored to an individual consumer's willingness to pay.” State lawmakers across the country began introducing legislation to regulate practices that use personal data, AI, and frequent price changes, particularly in sectors like food and housing. The regulatory baton, at least in the United States, has been passed from the federal level to the states, creating a patchwork of approaches that may prove difficult for businesses to navigate and consumers to understand.
The Oasis Fiasco and the British Regulatory Response
If the American regulatory landscape is fragmented, the United Kingdom's has been galvanised by a single, furiously debated event: the Oasis reunion ticket sale.
On 31 August 2024, tickets for 17 shows across the UK and Ireland went on sale exclusively through Ticketmaster. Millions of fans endured long virtual queues and multiple site crashes. Many discovered that standing tickets, initially advertised at approximately £135, had risen to as much as £355 by the time they reached checkout. The backlash was enormous. UK culture minister Lisa Nandy pledged to look into Ticketmaster's use of dynamic pricing. The band itself issued a statement claiming that “Oasis leave decisions on ticketing and pricing entirely to their promoters and management” and that Liam and Noel Gallagher themselves had not known dynamic pricing would be used.
On 5 September 2024, the Competition and Markets Authority launched an investigation into Ticketmaster's conduct. The CMA's findings, published in March 2025, were revealing. The regulator found no evidence that Ticketmaster had used algorithmic real-time pricing in the traditional sense. Instead, the company had released a batch of standing tickets at a lower price, and once those sold out, released the remaining tickets at a much higher price. The CMA was concerned that consumers had not been given clear and timely information about how the pricing would work, particularly given that many customers had endured lengthy queues with no warning that prices would change.
The Oasis controversy accelerated regulatory action. In late 2024, the Sale of Tickets (Sporting and Cultural Events) Bill was introduced in Parliament, seeking to require ticket-selling platforms to display the full range of available tickets, their quantities, and prices to consumers before they joined online queues. More broadly, the CMA has positioned itself as a proactive regulator of online pricing practices. The Digital Markets, Competition and Consumers Act received Royal Assent in May 2024; its digital markets competition regime came into force on 1 January 2025, and its direct consumer enforcement regime followed in April 2025. Under the latter, the CMA can decide for itself whether consumer laws have been broken, without having to go through the courts, and can fine companies up to 10 per cent of global turnover. The CMA has also launched enforcement actions covering online pricing practices, including drip pricing and pressure selling, using its new powers to order businesses to pay compensation to affected customers.
The CMA has acknowledged that pricing algorithms can benefit consumers by reducing transaction costs and market frictions, but it has also flagged the risk that algorithms could “facilitate collusive outcomes” and increase prices. In a notable observation, the CMA suggested that the risk of businesses colluding with one another over prices would actually diminish if there were extensive use of personalised pricing algorithms in digital markets, because each firm would be setting individual prices rather than converging on common ones. It is a counterintuitive argument that illustrates just how complex the regulatory challenge has become.
Europe Drafts Its Digital Fairness Rulebook
The European Union, rarely content to let a regulatory opportunity pass, is constructing what could become the most comprehensive framework for governing personalised pricing anywhere in the world.
The Digital Fairness Act, overseen by EU Commissioner Michael McGrath, is designed to address manipulative interface design, misleading influencer marketing, addictive design features, subscription traps, and, critically, unfair personalisation and pricing practices. The European Commission launched a public consultation on the DFA on 17 July 2025, which closed on 24 October 2025 and received 3,341 responses, the vast majority from consumers.
The results were striking. At least 77 per cent of respondents supported measures including greater consumer control over personalised advertising, restrictions on advertising that exploits vulnerabilities, a prohibition on personalised advertising targeting minors, and restrictions on personalised pricing based on personal data and profiling. The existing Consumer Rights Directive already requires traders to inform consumers if a price has been personalised based on automated decision-making, but businesses are not required to disclose the specific parameters or criteria used. The DFA is expected to go considerably further. The consultation also examined “drip pricing,” where a low price is initially presented but incrementally increased, and noted that rapid pricing changes putting consumers under psychological pressure to act quickly may be considered misleading or aggressive practices.
The formal draft is expected in Q3 2026, with final adoption expected in late 2027. The DFA is expected to apply broadly across the business-to-consumer digital economy, affecting e-commerce platforms, streaming services, telecoms, airlines, travel platforms, ride-hailing and delivery apps, and any business that uses personalised offers, automated subscriptions, or dynamic pricing.
For companies operating globally, the DFA represents a potentially seismic shift. The EU's track record with the General Data Protection Regulation demonstrated that European rules can set de facto global standards, as companies find it more efficient to comply everywhere than to maintain different systems for different jurisdictions. If the DFA mandates meaningful transparency about how personalised prices are calculated, businesses worldwide may have to disclose information they currently treat as proprietary.
Meanwhile, Australia's competition regulator, the ACCC, released the final report of its five-year Digital Platform Services Inquiry in June 2025. Across 14 reports, the ACCC broadly flagged risks emerging from generative AI integration into commercial operations, including algorithmic coordination and transparency in automated decision-making. The ACCC concluded that Australia's current laws cannot adequately deal with the harms arising from such a fast-evolving industry and recommended an economy-wide prohibition on unfair trading practices, along with mechanisms to force algorithmic disclosure.
What the Researchers Found About Who Actually Benefits
The most uncomfortable finding for advocates of AI-driven personalised pricing comes from Carnegie Mellon University's Tepper School of Business. A study published in Marketing Science by Yan Huang, Associate Professor of Business Technologies, Kannan Srinivasan, Professor of Management, Marketing, and Business Technology, and Param Vir Singh, Carnegie Bosch Professor of Business Technologies and Marketing, examined the interaction between personalised ranking systems and pricing algorithms on e-commerce platforms.
Their findings challenge the conventional wisdom that personalised pricing benefits consumers by showing them more relevant products at competitive prices. The researchers found that personalised ranking systems, which present products in order of estimated consumer preference, may actually encourage higher prices from pricing algorithms, particularly when consumers search for products sequentially on third-party platforms. This occurs because personalised ranking significantly reduces the ranking-mediated price elasticity of demand, diminishing the algorithmic incentive to lower prices. Conversely, unpersonalised ranking systems led to significantly lower prices and greater consumer welfare.
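The ranking-mediated mechanism can be made concrete with a toy search simulation, far simpler than the paper's model and with every number invented for illustration. Shoppers inspect a ranked product list in order and buy the first item whose valuation exceeds its price. When the list is ranked by price, a price rise pushes the focal seller down the list and demand responds strongly; when the list is ranked by each shopper's predicted preference, price no longer affects position, so the same price rise moves far less demand.

```python
import random

def focal_demand(focal_price, personalised, competitor_prices=(10.0, 10.0),
                 n_shoppers=10_000, seed=1):
    """Count shoppers who buy the focal product (index 0).

    Shoppers search the ranked list in order and buy the first product
    whose valuation exceeds its price. A fixed seed reuses the same
    simulated population across price scenarios.
    """
    rng = random.Random(seed)
    prices = [focal_price, *competitor_prices]
    sold = 0
    for _ in range(n_shoppers):
        valuations = [rng.uniform(0.0, 20.0) for _ in prices]
        order = sorted(range(len(prices)),
                       key=(lambda i: -valuations[i]) if personalised
                       else (lambda i: prices[i]))
        for i in order:
            if valuations[i] > prices[i]:   # first acceptable item wins
                sold += (i == 0)
                break
    return sold

# Demand lost when the focal seller raises its price from 9 to 11.
drop_price_ranked = focal_demand(9.0, False) - focal_demand(11.0, False)
drop_personalised = focal_demand(9.0, True) - focal_demand(11.0, True)
```

In this sketch the price-ranked marketplace loses several times more demand from the same price rise than the preference-ranked one. That gap is the ranking-mediated elasticity the researchers describe, and it is what gives a pricing algorithm room to charge more under personalised ranking.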
The implications are profound. As doctoral student Liying Qiu, who collaborated on the research, has noted, increased consumer data sharing may not always result in improved outcomes, even in the absence of explicit price discrimination. Personalised ranking, empowered by access to more detailed consumer data, can facilitate algorithms charging higher prices. Certain pricing algorithms may even learn to engage in tacit collusion in competitive scenarios, resulting in consequences harmful to consumer welfare.
This research suggests that the very infrastructure of modern e-commerce, the personalised interfaces that platforms use to show you products they think you want, can function as a mechanism for extracting higher prices. The consumer experience of being “understood” by a platform may simultaneously be the mechanism through which that consumer pays more.
The Information Asymmetry Problem, Supercharged
In 1970, the economist George Akerlof published “The Market for Lemons,” a paper that would eventually win him a share of the 2001 Nobel Prize in Economics alongside Michael Spence and Joseph Stiglitz. Akerlof demonstrated how information asymmetry between buyers and sellers could cause markets to break down entirely. When sellers know more about the quality of a product than buyers do, prices fall to reflect the buyer's uncertainty, which drives away sellers of genuinely good products, which further depresses buyer confidence, until the market collapses or only the worst products remain.
Governments responded to this problem with consumer protection legislation: lemon laws, mandatory disclosures, vehicle inspection requirements, and financial product transparency rules. These interventions worked precisely because they reduced the information gap between buyer and seller.
AI-driven personalised pricing creates a new form of information asymmetry that is qualitatively different from anything Akerlof described. In this case, the seller does not merely know more about the product than the buyer. The seller knows more about the buyer than the buyer knows about themselves, at least in economic terms. The algorithm has processed the buyer's browsing history, purchase frequency, price sensitivity, location, time of day, device, and potentially hundreds of other signals to arrive at a price that is optimised not for fairness, not for competition, but for the maximum amount the algorithm calculates this specific individual will accept.
This is not the invisible hand of the market at work. It is a one-way mirror. The consumer sees a price and assumes it is the price. The algorithm sees a consumer and calculates what it can get. The traditional economic assumptions that underpin competitive markets, informed buyers comparing transparent prices from competing sellers, simply do not hold when every buyer sees a different price and has no way of knowing it.
The economist's argument that price discrimination can theoretically improve welfare by allowing markets to serve price-sensitive consumers who would otherwise be priced out is valid in its own theoretical framework. But it assumes that sellers will actually lower prices for those consumers rather than simply charge everyone the maximum. Without transparency, there is no mechanism to verify that the welfare-improving version of personalised pricing is what consumers actually receive. And without transparency mandates, consumers have no tools to distinguish between a system that genuinely serves their interests and one that extracts every penny of surplus.
What Transparency Would Actually Require
If regulators mandate price transparency for AI-driven pricing, what would that look like in practice? The proposals currently circulating across multiple jurisdictions suggest several overlapping approaches.
The simplest is disclosure: requiring businesses to tell consumers when a price has been personalised. The EU's existing Consumer Rights Directive already mandates this, though without requiring businesses to explain how the personalisation works. The Digital Fairness Act may extend this to require disclosure of the parameters used, the data inputs, and the algorithmic logic.
A second approach is price comparison: requiring that consumers be shown the base or median price alongside their personalised price, so they can see whether they are paying more or less than average. This would create competitive pressure, as consumers who discovered they were consistently paying above the median might switch to competitors.
A third approach, favoured by some competition regulators, is algorithmic auditing: requiring companies to submit their pricing algorithms to independent review, much as the RealPage settlement requires a court-appointed monitor to review the company's code and model training documentation. This would allow regulators to detect collusive behaviour, discriminatory pricing patterns, or systematic exploitation of vulnerable consumers without requiring consumers to understand the algorithms themselves.
A fourth, more radical approach is prohibition: banning personalised pricing entirely in certain sectors, much as some jurisdictions have capped or banned surge pricing for ride-hailing services. The Oasis ticket controversy has prompted legislative proposals in the UK to regulate dynamic pricing in entertainment. The question is whether prohibition in essential sectors like food, housing, and healthcare would be proportionate, or whether it would simply drive the practice underground.
Each approach involves trade-offs. Full algorithmic disclosure could reveal proprietary business methods. Price comparison mandates could be gamed by setting artificial baselines. Auditing regimes are only as good as the auditors' technical capabilities and independence. Outright bans may prevent genuinely beneficial price adjustments that serve consumers well.
Navigating the Invisible Marketplace
The stakes of this debate extend well beyond whether your next pair of trainers costs 5 per cent more because the algorithm noticed you browsed them three times. They go to the heart of what kind of marketplace a digitally connected society wants to inhabit.
If personalised pricing becomes the universal default, the concept of a “price” in the way most consumers understand it ceases to exist. There is no longer a number attached to a product. There is a number attached to a relationship between a product and a buyer, mediated by an algorithm that neither party fully controls or understands. Every transaction becomes a negotiation in which only one side knows it is negotiating.
The Wendy's backlash, the Oasis ticket fury, the RealPage lawsuit, and the FTC's aborted surveillance pricing inquiry all point in the same direction: consumers find personalised pricing fundamentally unfair when they discover it, and they are deeply uncomfortable with the idea that algorithmic systems know enough about them to exploit that knowledge. The 77 per cent of EU consultation respondents who supported restrictions on personalised pricing are not outliers. They are the mainstream.
The counterargument from industry is not without merit. Dynamic pricing does allocate scarce resources more efficiently. It does enable businesses to serve price-sensitive consumers with lower prices. It does reduce waste by aligning prices with actual demand. But these benefits depend on transparency and genuine competition, neither of which is guaranteed in an opaque algorithmic marketplace. Research from the University of New South Wales has found that 70 per cent of consumers are comfortable with dynamic pricing when they perceive it as fair and transparent, suggesting that the issue is not the concept itself but the secrecy surrounding its implementation.
What is clear is that the regulatory frameworks governing these practices are being written right now, in Brussels, in London, in Canberra, in state legislatures across the United States. The EU's Digital Fairness Act, the UK's Digital Markets, Competition and Consumers Act, the ACCC's reform recommendations, and the patchwork of American state legislation are all attempting to answer the same fundamental question: in a world where algorithms can determine exactly how much you are willing to pay, does the consumer have a right to know?
The answer, increasingly and across jurisdictions, appears to be yes. The debate is no longer about whether transparency is necessary, but about how much transparency is enough, who enforces it, and how quickly the rules can keep pace with the algorithms they are meant to govern. For consumers who have spent years handing over their data in exchange for convenience, the price of that bargain is about to become visible, whether the algorithms like it or not.
References and Sources
NPR, “No, Wendy's says it isn't planning to introduce surge pricing,” 28 February 2024. https://www.npr.org/2024/02/28/1234412431/wendys-dynamic-surge-pricing
Axios, “Why fast-food fans flipped out over Wendy's pricing,” 29 February 2024. https://www.axios.com/2024/02/29/wendys-surge-pricing-ai-backlash-internet
Cohen, Hahn, Hall, Levitt, and Metcalfe, “Using Big Data to Estimate Consumer Surplus: The Case of Uber,” NBER Working Paper No. 22627, 2016. https://www.nber.org/papers/w22627
Hall, Kendrick, and Nosko, “The Effects of Uber's Surge Pricing: A Case Study.” https://www.uber.com/blog/research/the-effects-of-ubers-surge-pricing-a-case-study/
Castillo, J.C., “Who Benefits from Surge Pricing?”, University of Pennsylvania, 2019. https://economics.sas.upenn.edu/system/files/2020-01/JMP_Castillo.pdf
Pricefy, “How Amazon Uses Real-Time Data and Dynamic Pricing to Maximize Profits.” https://www.pricefy.io/articles/amazon-real-time-data-dynamic-pricing
AIMultiple, “Dynamic Pricing Algorithms in 2026: Top 3 Models.” https://research.aimultiple.com/dynamic-pricing-algorithm/
Master of Code, “AI Dynamic Pricing: Boost Profits by 10%, Sales by 13%.” https://masterofcode.com/blog/ai-dynamic-pricing
UNSW Newsroom, “AI is using your data to set personalised prices online,” October 2025. https://www.unsw.edu.au/newsroom/news/2025/10/AI-using-data-personalised-data-prices-online
UNSW Newsroom, “The rise of dynamic pricing: should AI decide what you pay?”, September 2025. https://www.unsw.edu.au/newsroom/news/2025/09/dynamic-pricing-AI-decide-what-you-pay
US Department of Justice, “Justice Department Sues RealPage for Algorithmic Pricing Scheme,” August 2024. https://www.justice.gov/archives/opa/pr/justice-department-sues-realpage-algorithmic-pricing-scheme-harms-millions-american-renters
US Department of Justice, “Justice Department Requires RealPage to End Sharing of Competitively Sensitive Information,” November 2025. https://www.justice.gov/opa/pr/justice-department-requires-realpage-end-sharing-competitively-sensitive-information-and
ProPublica, “DOJ and RealPage Agree to Settle Rental Price-Fixing Case.” https://www.propublica.org/article/doj-realpage-settlement-rental-price-fixing-case
Mintz, “Last Year's Rent: RealPage Reaches Settlement Agreement with the DOJ,” December 2025. https://www.mintz.com/insights-center/viewpoints/2191/2025-12-01-last-years-rent-realpage-reaches-settlement-agreement
Federal Trade Commission, “FTC Issues Orders to Eight Companies Seeking Information on Surveillance Pricing,” July 2024. https://www.ftc.gov/news-events/news/press-releases/2024/07/ftc-issues-orders-eight-companies-seeking-information-surveillance-pricing
FTC, “Behind the FTC's Inquiry into Surveillance Pricing Practices,” July 2024. https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2024/07/behind-ftcs-inquiry-surveillance-pricing-practices
Fast Company, “Lina Khan says the FTC is investigating surveillance pricing,” September 2024. https://www.fastcompany.com/91195551/lina-khan-ftc-federal-trade-commission-chair-surveillance-pricing-explained-what-is-it
FTC, “Surveillance Pricing Update & The Work Ahead,” January 2025. https://www.ftc.gov/policy/advocacy-research/tech-at-ftc/2025/01/surveillance-pricing-update-work-ahead
FTC, “Surveillance Pricing Study Indicates Wide Range of Personal Data Used,” January 2025. https://www.ftc.gov/news-events/news/press-releases/2025/01/ftc-surveillance-pricing-study-indicates-wide-range-personal-data-used-set-individualized-consumer
Future of Privacy Forum, “A Price to Pay: U.S. Lawmaker Efforts to Regulate Algorithmic and Data-Driven Pricing.” https://fpf.org/blog/a-price-to-pay-u-s-lawmaker-efforts-to-regulate-algorithmic-and-data-driven-pricing/
Senator Mark R. Warner, press release on surveillance pricing, December 2025. https://www.warner.senate.gov/public/index.cfm/2025/12/warner-leads-bipartisan-effort-to-push-ftc-to-crack-down-on-surveillance-pricing-with-holiday-shopping-season-underway
NPR, “Ticketmaster 'dynamic pricing' subject to U.K. investigation into Oasis ticket sales,” September 2024. https://www.npr.org/2024/09/06/g-s1-21316/oasis-reunion-ticketmaster-dynamic-pricing
Variety, “Oasis Tickets: U.K. Opens Probe Into Ticketmaster's 'Dynamic Pricing',” September 2024. https://variety.com/2024/global/global/ticketmaster-dynamic-pricing-oasis-uk-government-investigation-1236127481/
Arts Professional, “Oasis concerts: Watchdog says 'no evidence' Ticketmaster used dynamic pricing,” March 2025. https://www.artsprofessional.co.uk/news/oasis-concerts-watchdog-says-no-evidence-ticketmaster-used-dynamic-pricing
Womble Bond Dickinson, “DMCC Act 2024 explained.” https://www.womblebonddickinson.com/uk/insights/articles-and-briefings/digital-markets-competition-and-consumers-act-2024-explained-cmas
CMA, “CMA launches major consumer protection drive focused on online pricing practices.” https://www.gov.uk/government/news/cma-launches-major-consumer-protection-drive-focused-on-online-pricing-practices
Pinsent Masons, “CMA: collusion could be addressed with personalised pricing.” https://www.pinsentmasons.com/out-law/news/cma-addressing-collusion-with-personalised-pricing
European Parliament, Digital Fairness Act Legislative Train Schedule. https://www.europarl.europa.eu/legislative-train/theme-protecting-our-democracy-upholding-our-values/file-digital-fairness-act
Slaughter and May, “Digital Fairness Act: European Commission publishes responses to consultation,” December 2025. https://thelens.slaughterandmay.com/post/102m222/digital-fairness-act-european-commission-publishes-responses-to-consultation
Osborne Clarke, “Digital Fairness Act Unpacked: Unfair Pricing Practices.” https://www.osborneclarke.com/insights/digital-fairness-act-unpacked-unfair-pricing-practices
ACCC, “Digital Platform Services Inquiry final report,” June 2025. https://www.accc.gov.au/about-us/publications/serial-publications/digital-platform-services-inquiry-2020-25-reports/digital-platform-services-inquiry-final-report-march-2025
Huang, Srinivasan, and Singh, “Personalization, Consumer Search, and Algorithmic Pricing,” Marketing Science, Vol. 44, No. 6, 2025. https://www.cmu.edu/tepper/news/stories/2025/0602-ai-driven-personalized-pricing-may-not-help-consumers
CMU Tepper School, Liying Qiu doctoral research profile. https://www.cmu.edu/tepper/news/stories/2025/0519-doctoral-student-liying-qiu-studies-ai-consumer-behavior-and-market-dynamics
Akerlof, G., “The Market for Lemons: Quality Uncertainty and the Market Mechanism,” Quarterly Journal of Economics, Vol. 84, No. 3, 1970, pp. 488-500.
Nobel Prize in Economics 2001, Akerlof, Spence, and Stiglitz. Econlib. https://www.econlib.org/library/Enc/bios/Akerlof.html

Tim Green, UK-based Systems Theorist & Independent Technology Writer
Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.
His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.
ORCID: 0009-0002-0156-9795
Email: tim@smarterarticles.co.uk