OpenAI Court Order Sparks Legal and Privacy Risks for AI Companies

Introduction

A recent court order in The New York Times Co. v. Microsoft Corp. & OpenAI, Inc. has triggered alarm bells across the AI, data privacy, and eDiscovery communities. On May 13, 2025, U.S. Magistrate Judge Ona T. Wang of the Southern District of New York issued a preservation order requiring OpenAI to retain and segregate all output log data that would otherwise be deleted—including data subject to user deletion under private agreements or privacy laws. OpenAI’s motion for reconsideration was denied.

Despite Judge Wang’s attempt to reassure stakeholders during the May 27 conference—emphasizing the non-public nature of preserved data and the provisional scope of the order—the core tensions remain unresolved. The clarification may soften immediate reputational concerns for OpenAI, but it does little to resolve the structural conflicts now emerging between litigation-driven discovery mandates and established data privacy regimes. As this preservation obligation persists, even temporarily, it forces AI developers, enterprise clients, and compliance professionals to confront a new legal reality: discovery imperatives may override privacy rights, private agreements, and global regulatory frameworks. The following analysis explores the depth of that disruption.

I. The Preservation Order: A New Standard for AI Data?

In her May 13 order, Judge Wang compelled OpenAI to “preserve and segregate all output log data that would otherwise be deleted on a going forward basis.” This includes data that OpenAI:

  1. Had promised to delete under user agreements,
  2. Was obligated to delete under data privacy laws such as the GDPR or CPRA, and
  3. Might have otherwise treated as ephemeral.

OpenAI objected, citing technical burden, data minimization principles, and potential conflicts with international privacy obligations. Nonetheless, Judge Wang denied the motion for reconsideration, affirming that the integrity of litigation discovery trumps routine operational practices.

This creates an extraordinary scenario: AI-generated output data is now discoverable content subject to indefinite retention under court supervision, regardless of user intent.

II. Private Agreements in Legal Crosshairs

Many AI providers promise users the ability to delete or otherwise control their data, with the scope of those controls often varying by subscription tier. These protections are often essential for enterprise adoption, especially where clients operate under strict internal compliance frameworks (e.g., SOC 2, ISO 27001, HIPAA, GLBA).

A. Contractual Conflicts

The preservation order creates a real possibility of breach of contract claims. If OpenAI cannot honor deletion requests due to litigation holds, enterprise clients relying on these commitments may be materially misled. This risk extends to:

  1. Data Processing Agreements (DPAs): Common in SaaS relationships, these often include destruction timelines that now conflict with court-mandated retention.
  2. Service-Level Agreements (SLAs): If SLAs define retention or destruction schedules, they may no longer be accurate.

Absent indemnity or revised contract language, AI providers and their customers may be left unprotected from cascading legal liabilities.

III. Regulatory Collisions: GDPR, CPRA, HIPAA & More

A. General Data Protection Regulation (GDPR)

Under Article 17 of the GDPR, data subjects have the right to erasure ("right to be forgotten"). Judge Wang’s order overrides this right by compelling retention of personal data—even where erasure has been requested and consent withdrawn.

This raises direct concerns under:

  1. Articles 6(1)(a), 6(1)(f): The lawful basis for processing may be eroded if a subject objects and no overriding legitimate interest exists.
  2. Article 5(1)(c): Data minimization principles are violated when data no longer needed must still be stored for litigation.

Given that OpenAI’s data flows likely involve EU residents, and that its clients include European enterprises, the order creates regulatory exposure under Article 83, which allows fines of up to 4% of global annual turnover.

B. California Privacy Rights Act (CPRA)

Under the CPRA, California residents have the right to:

  1. Know what data is collected (Cal. Civ. Code § 1798.110),
  2. Request deletion (Cal. Civ. Code § 1798.105), and
  3. Opt out of the sale or sharing of their personal information (Cal. Civ. Code § 1798.120) and limit the use of sensitive personal information (Cal. Civ. Code § 1798.121).

As with the GDPR, a litigation-driven preservation obligation may prevent OpenAI from honoring verified deletion requests under § 1798.105, creating potential enforcement exposure for the provider and compliance uncertainty for its business customers.

C. HIPAA and Sectoral Risks

If users—including patients, providers, or third-party contractors—input Protected Health Information (PHI) into AI systems, the resulting outputs may themselves contain or reflect PHI. Under the current preservation order, these outputs must now be retained indefinitely, potentially conflicting with obligations under the Health Insurance Portability and Accountability Act (HIPAA) and its implementing regulations. This risk is especially acute for covered entities or business associates who integrated OpenAI tools under the assumption that data would be ephemeral or subject to prompt deletion. If retention protocols contradict the organization's HIPAA-compliant data minimization and destruction policies, they may face heightened compliance exposure—including civil penalties under 45 C.F.R. § 160.404. AI vendors and healthcare clients must now reassess their data governance controls, consent workflows, and breach notification readiness in light of these indefinite retention obligations.

IV. Operational & Strategic Consequences

A. AI Vendors Now Must Operate in Litigation-Ready Mode

By imposing a preservation obligation over all output logs, the court transforms OpenAI’s infrastructure into a perpetual discovery vault. The effects are:

  1. Increased costs for infrastructure, indexing, and security,
  2. New risks of data breach or unauthorized access, and
  3. More frequent invocation of litigation hold procedures by default.

This places AI vendors and customers alike in constant discovery posture, even before any wrongdoing is alleged.

B. User Consent & Ethical Fallout

This order threatens the foundational assumption behind AI use: that ephemeral interactions are private or deletable. Users now face a lack of transparency over retention. From a data ethics perspective, this undermines:

  1. Informed consent models,
  2. Trust-based design, and
  3. AI explainability and governance protocols that limit data scope.

Enterprise customers may reevaluate integrations where the provider cannot guarantee autonomy over output retention.

V. Legal and Compliance Playbook for AI Stakeholders

For AI Developers:

  1. Revise Terms of Use and Privacy Policies to reflect potential preservation orders.
  2. Enable granular opt-outs for categories of data where preservation would not apply.
  3. Establish data segmentation protocols for litigation-hold vs. standard logs.
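
On the third point, a minimal sketch may help illustrate what segmentation can look like in practice. The example below is hypothetical: the store names, the ACTIVE_LITIGATION_HOLDS set, and the record fields are assumptions for illustration only, not a description of OpenAI’s actual logging infrastructure, and any real implementation should be designed with counsel.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical identifier for a matter subject to an active preservation order.
ACTIVE_LITIGATION_HOLDS = {"nyt-v-openai-1-23-cv-11195"}


@dataclass
class OutputLogRecord:
    """A single model output log entry (illustrative fields only)."""
    record_id: str
    user_id: str
    created_at: datetime
    content: str
    deletion_requested: bool = False


@dataclass
class LogStores:
    """Two segregated stores: standard retention vs. litigation hold."""
    standard: List[OutputLogRecord] = field(default_factory=list)
    litigation_hold: List[OutputLogRecord] = field(default_factory=list)


def route_record(record: OutputLogRecord, stores: LogStores) -> str:
    """Send new output logs to the segregated hold store while any
    preservation order is active; otherwise use the standard store."""
    if ACTIVE_LITIGATION_HOLDS:
        stores.litigation_hold.append(record)
        return "litigation_hold"
    stores.standard.append(record)
    return "standard"


def apply_deletion_requests(stores: LogStores) -> List[str]:
    """Honor deletion requests only in the standard store; records under
    hold are retained, and their IDs are returned for privacy/legal review."""
    retained_despite_request = [
        r.record_id for r in stores.litigation_hold if r.deletion_requested
    ]
    stores.standard = [r for r in stores.standard if not r.deletion_requested]
    return retained_despite_request


if __name__ == "__main__":
    stores = LogStores()
    record = OutputLogRecord(
        record_id="out-0001",
        user_id="user-42",
        created_at=datetime.now(timezone.utc),
        content="model output text",
        deletion_requested=True,
    )
    print(route_record(record, stores))        # litigation_hold
    print(apply_deletion_requests(stores))     # ['out-0001']
```

The key design choice is that deletion logic never touches the litigation-hold store; deletion requests that cannot be honored are surfaced for legal and privacy review, which also feeds the dual obligation protocol discussed below.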

For Corporate Counsel & CIOs:

  1. Conduct contract audits of all DPAs, SLAs, and procurement templates with AI vendors.
  2. Add force majeure or litigation carve-outs related to data retention obligations.
  3. Confirm your exposure under the CPRA, GDPR, and sector-specific statutes.

For DPOs and Privacy Leaders:

  1. Flag AI output logs as a distinct data category in your data map.
  2. Create a protocol for dual obligation handling when deletion requests collide with preservation demands (a minimal sketch follows this list).
  3. Monitor regulatory developments in EU-U.S. data transfer law post-Schrems II for potential conflicts.
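
For the dual obligation protocol in item 2, the sketch below shows one hedged way to record the decision and its legal basis whenever a deletion request meets an active hold. The reliance on GDPR Article 17(3)(e) (the erasure exemption for the establishment, exercise, or defence of legal claims) is an assumption to validate with counsel, and the field names are illustrative rather than drawn from any vendor’s tooling.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum


class Decision(Enum):
    DELETE = "delete"                    # no hold applies; honor the request
    RETAIN_UNDER_HOLD = "retain_hold"    # preservation obligation overrides deletion


@dataclass
class DeletionRequest:
    request_id: str
    data_subject_id: str
    received_at: datetime


@dataclass
class DualObligationRecord:
    """Audit entry created whenever a deletion request meets a legal hold."""
    request_id: str
    decision: Decision
    legal_basis: str
    notify_legal_team: bool
    data_subject_response: str


def handle_deletion_request(req: DeletionRequest, under_active_hold: bool) -> DualObligationRecord:
    """Decide how to respond when a deletion request may collide with a
    court-ordered preservation obligation, and record the rationale."""
    if under_active_hold:
        return DualObligationRecord(
            request_id=req.request_id,
            decision=Decision.RETAIN_UNDER_HOLD,
            legal_basis="GDPR Art. 17(3)(e): retention necessary for the "
                        "establishment, exercise or defence of legal claims",
            notify_legal_team=True,
            data_subject_response="Request acknowledged; the data is preserved "
                                  "under a litigation hold and will be "
                                  "re-evaluated when the hold lifts.",
        )
    return DualObligationRecord(
        request_id=req.request_id,
        decision=Decision.DELETE,
        legal_basis="GDPR Art. 17(1): no exemption applies",
        notify_legal_team=False,
        data_subject_response="Your data has been deleted.",
    )


if __name__ == "__main__":
    req = DeletionRequest("req-123", "subject-7", datetime.now(timezone.utc))
    record = handle_deletion_request(req, under_active_hold=True)
    print(record.decision, "|", record.legal_basis)
```

The point is less the code than the audit trail: each collision produces a record that can be shown to a regulator or a court, rather than a silent failure to delete.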

Conclusion

It is too early to tell how this will unfold, and Judge Wang has tried to assuage fears by noting that the order is only temporary. Regardless, it should give everyone reason to pause, rethink, and reassess the operational framework of their AI tools.

Sources

  1. Order Denying Reconsideration, The New York Times Co. v. Microsoft Corp. & OpenAI, Inc., No. 1:23-cv-11195 (S.D.N.Y. May 16, 2025), available at https://docs.justia.com/cases/federal/district-courts/new-york/nysdce/1%3A2023cv11195/612697/559.
  2. See https://news.bloomberglaw.com/ip-law/judge-seeks-to-allay-openai-privacy-concerns-in-copyright-case.
  3. Letter from Counsel for OpenAI to Hon. Ona T. Wang (May 15, 2025), available at https://chatgptiseatingtheworld.com/wp-content/uploads/2025/05/Letter-of-OpenAI-re-logs-May-15-2025.pdf.
  4. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (General Data Protection Regulation), 2016 O.J. (L 119) 1. See https://eur-lex.europa.eu/eli/reg/2016/679/oj/eng.
  5. Cal. Civ. Code §§ 1798.100–1798.199.100 (2023) (California Privacy Rights Act). See https://oag.ca.gov/privacy/ccpa.
  6. See https://iapp.org/news/a/post-schrems-ii-understanding-baden-wurttembergs-updated-guidance-on-international-data-transfers.