Anthropic is changing how it handles consumer data, giving all Claude users until September 28 to decide whether their conversations will be used to train future AI models.
Previously, Anthropic promised not to train its systems on consumer chat data and to delete user prompts within 30 days. Under the new rules, the company will retain data for up to five years and use chats and coding sessions for training unless users opt out. The policy applies to Claude Free, Pro, Max, and Claude Code users, while business customers using Claude Gov, Claude for Work, Claude for Education, and the API remain exempt, mirroring OpenAI’s enterprise carve-outs.
Anthropic frames the change as an opportunity for users to “help us improve model safety” and boost Claude’s skills in coding, analysis, and reasoning. But analysts note the deeper reality: Anthropic needs vast amounts of real-world data to stay competitive against OpenAI and Google. Millions of Claude interactions could provide exactly the kind of training material that strengthens its models.
The move reflects a wider industry trend as AI firms face scrutiny over data retention practices. OpenAI is currently contesting a court order requiring it to store all consumer ChatGPT conversations indefinitely, including deleted ones, a demand the company calls “sweeping and unnecessary.”
Critics say Anthropic’s rollout highlights a familiar problem: confusing consent design. Existing users will see a pop-up titled “Updates to Consumer Terms and Policies,” with a large black “Accept” button and a much smaller toggle for data-training permissions, pre-set to “On.” Observers warn many will click through without realizing they’re agreeing to share their chats.
Privacy experts argue that meaningful consent is increasingly elusive in AI, and the U.S. Federal Trade Commission has warned companies against burying disclosures in fine print. Whether the agency, currently short-staffed, will act on Anthropic’s changes remains uncertain.