ChatGPT Under the Gavel: The Court Order That’s Stirring Up AI Privacy Concerns

June 5, 2025

In a move that’s sent ripples through the tech world, been shared on business chat groups and prompted more than a few raised eyebrows, a U.S. federal court has instructed OpenAI to preserve all ChatGPT user logs, including chats that users asked to be deleted. Yes, even that time you asked ChatGPT to help you write a strongly worded email to your neighbour about their leylandii hedges.

This surprising legal twist comes from a lawsuit filed by major media outlets, most notably The New York Times, accusing OpenAI of using copyrighted material without permission when training its models. The claim is that ChatGPT might regurgitate paywalled or protected content, and user chats could potentially demonstrate how that happens.

The logic is this: users might delete chats to cover up evidence of having asked the model to reproduce copyrighted content. So, to avoid a digital paper trail going up in smoke, the court ordered OpenAI to hold onto everything, whether deleted, temporary or otherwise.

Imagine your recycle bin at work suddenly being part of a national investigation. That’s essentially what’s happening here. Yes, the possibility that someone could learn you asked that question is now very real…

Unsurprisingly, OpenAI isn’t thrilled. They’ve responded by arguing the order is too broad, possibly even overreaching. In their view, it unfairly paints all deleted chats as suspicious, when in reality most people probably just didn’t want their 2am musings on biscuit rankings immortalised forever.

They also point out that retaining all data, especially that which users believed was erased, could erode trust and infringe on privacy expectations.

The Positives:

  • Preserving Evidence: Legal processes benefit from clear records. If a user did misuse the tool to access or replicate protected content, those logs might be key evidence.

  • Transparency: It creates a record of how the tool is being used in the wild, which could be useful in refining regulation or improving safety.

The Negatives: 

  • Privacy Backlash: The biggest worry is user trust. People use ChatGPT for everything from homework help to mental health prompts. Knowing your words might be retained indefinitely could change how freely people interact. Not great for the brand.

  • Slippery Slopes: Today it’s copyright, but what might tomorrow bring? Will all digital assistants have to archive our every request? Where’s the line between accountability and surveillance?

If you’re the type to regularly tidy up your chat history, assuming deleted means deleted, think again: under this order, even ‘temporary chats’ might stick around, just in case someone wants to examine them for copyright misuse. It’s the digital equivalent of burning your diary, only to have a court order demand it back in perfect condition. Hey presto!

This also opens the door to bigger philosophical questions. Who owns your prompts? What constitutes private digital space? And should AI interactions be considered ephemeral, like a chat with a friend, or permanent, like a court deposition? Why not give users options in settings?

Ultimately, this situation highlights the tricky balance between innovation, responsibility, and rights. Courts want clarity and compliance (as do we all). Companies want user trust and freedom to build. And users? Mostly, they just want to ask whether it’s going to rain tomorrow, or whether we all really live within six feet of a rat, without it turning into a legal archive.