IT NEWS

OpenAI forced to preserve ChatGPT chats

OpenAI has protested a court order that forces it to retain its users’ conversations. The creator of the ChatGPT AI model objected to the order, which is part of a copyright infringement case against it by The New York Times and other publishers.

The news organizations argued that ChatGPT was presenting their content in its responses to the point where users were reading this material instead of accessing their paid content directly.

The publishers said that deleted ChatGPT conversations might show users obtaining this proprietary published content via the service.

The issue came up for debate at a hearing in January, where Judge Ona T. Wang suggested that users who had heard about the legal case might delete conversations to cover their tracks. She denied the publishers’ request for a preservation order at the time, but also asked why OpenAI couldn’t segregate and anonymize data from users who had requested deletion. OpenAI failed to address this, Wang said, leading to her order, granted May 13.

OpenAI served with court order

Wang’s order last month said:

“OpenAI is NOW DIRECTED to preserve and segregate all output log data that would otherwise be deleted on a going forward basis until further order of the Court (in essence, the output log data that OpenAI has been destroying), whether such data might be deleted at a user’s request or because of ‘numerous privacy laws and regulations’ that might require OpenAI to do so.”

ChatGPT already retains user conversations by default, using them to train its AI model for future conversations. However, it provides an option to turn that setting off, so that a user’s conversations are not retained. The service also has an ad hoc temporary chat feature, which deletes a chat as soon as it’s concluded.

In a letter objecting to the order, OpenAI said that it was being forced to compromise users’ privacy.

“OpenAI is forced to jettison its commitment to allow users to control when and how their ChatGPT conversation data is used, and whether it is retained,” it said. “Every day the Preservation Order remains in place is another day OpenAI’s users are forced to forgo the privacy protections OpenAI has painstakingly put in place.”


The publishers have no evidence that deleted conversations are any more likely to contain their content, OpenAI added. It warned that users frequently share sensitive details in conversations that they expect to be deleted, including everything from financial information to intimate discussions about wedding vows.

Engineering the retention of data would take months, the AI giant added.

The background to the case

Three publishers (The New York Times, the New York Daily News and the Center for Investigative Reporting) had been suing OpenAI separately for copyright infringement. In January this year, the publishers joined their cases into a single lawsuit.

OpenAI argued that it could use the content under fair use rules because its AI model transforms the content, breaking it into tokens that it then blends with other information to serve its users.

ChatGPT has a memory

Even when it does delete chats, ChatGPT retains a separate memory of details shared in conversations that it can use to understand you better. These might include details you enter about your friends and family, or about how you like your conversations formatted. The service allows users to turn off references to these memories, or to delete them altogether.

Caution is key when giving information to any online service, especially AI services, where conversations are often fluid and free-flowing. It’s also a good idea to think twice before sharing anything you’d rather others didn’t see.