Anthropic Pentagon Ban in Court Fight
Anthropic filed for an emergency injunction to pause the Pentagon's supply chain risk designation, a move that could preserve its government-contractor business.

KEY TAKEAWAYS
- Anthropic sought a preliminary injunction to pause the Pentagon supply chain risk designation and federal usage ban.
- A federal judge signaled skepticism at the March 24 hearing and said a ruling would follow within days.
- The ruling will determine whether Anthropic can keep government-contractor work and shape procurement labeling for U.S. AI vendors.
Anthropic filed an emergency injunction motion on March 25, 2026, to block the Pentagon ban, seeking to preserve its ability to work with government contractors while a federal judge considers the challenge and prepares a ruling in the coming days.
Court Fight and Relief Sought
Anthropic v. Trump Administration is pending in the U.S. District Court for the Northern District of California. The company requested a preliminary injunction to pause the Pentagon’s supply chain risk designation and a parallel federal directive, allowing it to continue serving government contractors during litigation.
At a March 24 hearing in San Francisco, U.S. District Judge Rita Lin questioned whether the government’s actions were narrowly tailored to operational security needs. She expressed skepticism about the rationale, suggesting the measures appeared punitive rather than focused. Lin noted that if the concern was operational, the Pentagon could simply stop using Anthropic’s model instead of imposing a broader ban.
Designation Rationale and Impact
The Pentagon applied a supply chain risk designation to Anthropic under a national-security authority, marking the first such designation of a U.S. domestic company. President Trump issued a directive ordering federal agencies to cease using Anthropic technology.
Justice Department attorney Eric Hamilton told the court the designation resulted from Anthropic’s negotiating stance with military officials, which the Pentagon said undermined trust and raised concerns about potential sabotage or software manipulation.
The Pentagon stated it has no interest in using Anthropic’s Claude models for mass surveillance or fully autonomous weapons, arguing such uses are already prohibited by existing military policies.
Anthropic, led by CEO Dario Amodei, has established two nonnegotiable guardrails: banning mass surveillance of Americans and prohibiting fully autonomous weapons without human input. The company said its technology was the only AI system deployed in classified U.S. military systems before the designation. It also told the court that once government systems approve and run its software, it cannot remotely change or shut off the models. Anthropic contends the designation violates the First Amendment and amounts to an attempt to cripple the business.
The Justice Department acknowledged that the supply chain risk label does not legally bar contractors with military ties from using Claude on nonmilitary work. Anthropic countered that a public social-media warning from Defense Secretary Pete Hegseth last month, urging contractors to avoid the company, has created profound uncertainty and harmed its business. Government counsel responded by noting limits on the Defense Department's authority over independent contractor relationships.
The statutory definition the government invoked characterizes supply chain risk as the risk that an adversary may sabotage, maliciously introduce unwanted functions, or otherwise subvert a national-security system. Judge Lin framed the core legal question as whether the government’s labeling of a supplier in procurement channels is lawful, a matter separate from the broader policy debate over AI uses.
A near-term ruling will determine whether Anthropic can maintain commercial ties to defense contractors while challenging the designation and will influence how broadly procurement labels can exclude domestic AI vendors from government supply chains.