Updating Litigation Hold Policies for the Age of Generative AI

Connecticut Employment Law Blog

By: Daniel A. Schwartz

November 18, 2025


I recently got back from the American Bar Association Annual Labor & Employment Law Conference — an event I’ve talked about before on this blog.

There were a number of great CLE programs — far too many to list. Not surprisingly, Generative AI remained a hot topic, and the sessions got me thinking further about how we might tackle it in litigation.

So, this is the first of two parts dealing with Generative AI in the lawsuit context. One of the sessions confirmed what I had been thinking about for a while: Generative AI has changed discovery obligations for employers, and most companies haven’t updated their litigation hold processes to reflect this reality.

When an employee files a discrimination claim or you anticipate litigation, your legal team sends a litigation hold notice. The notice typically covers emails, texts, documents, and electronic communications.

But what about the employee’s ChatGPT conversation history? Their Claude.ai chats? Their Microsoft Copilot queries?

These tools are quickly becoming workplace staples. Employees use them to draft performance reviews, prepare for difficult conversations, analyze HR policies, and craft complaints about workplace issues.

This data is discoverable, and employers need to preserve it.

What You Must Preserve

When an employer receives notice of a claim or reasonably anticipates litigation, its duty to preserve evidence extends to GenAI interactions if they’re relevant to the claims.

This includes:

  • Company-provided AI tools. If your organization licenses ChatGPT Enterprise, Microsoft Copilot, or similar tools, employee usage data exists and must be preserved. Work with IT to identify what data your AI vendors retain and for how long.
  • Employee personal AI accounts used for work. If employees use personal ChatGPT accounts to draft work emails or review company documents, those conversations are discoverable. Your litigation hold should instruct employees to preserve this data.
  • AI-assisted document creation. Emails, memos, or complaints drafted with AI assistance may carry metadata showing AI involvement, and that metadata should be preserved. Some tools track revision history that reveals AI contributions.
  • Integration with workplace tools. Many companies now use AI features embedded in Slack, Microsoft Teams, or Google Workspace. These interactions may need preservation alongside traditional communications.

The Timing Problem

Most employees don’t think about preserving their AI chat history. Most AI tools don’t automatically archive conversations beyond a limited period.

If you wait too long to identify and preserve this data, the data disappears.

Some AI providers delete conversation history after 30 days. Others retain data indefinitely but allow users to delete at any time.

Once an employee knows litigation is coming, they might clean up their AI conversation history without realizing this violates preservation obligations.

Practical Steps to Take Now

So what are some things employers can do now?

First, update your litigation hold template to explicitly reference GenAI tools. Include instructions on how to preserve ChatGPT conversations, export Claude.ai chats, and save Microsoft Copilot interactions.

Next, work with IT to identify which AI tools your organization licenses and what data retention policies apply. Understand where the data lives, how long vendors retain information, and what format allows for export.

In addition, companies may want to train employees on preservation obligations. Most employees don’t think of their ChatGPT chats as “documents” subject to litigation holds.

Lastly, consider AI usage in your document retention policies. Should employees be allowed to use personal AI accounts for work tasks? If so, how do you ensure that data is preserved when needed?

Why This Matters

Courts expect parties to preserve relevant evidence. Failure to preserve GenAI data when you knew or should have known about litigation can result in sanctions, adverse inference instructions, or even dismissal of defenses.

This isn’t theoretical. Employees are using AI tools daily, and the day is quickly coming when judges will ask whether your company preserved AI-related data once litigation arose.

If you need assistance, as always, contact your litigation counsel to make sure you are following best practices when it comes to litigation holds.
