
US Judge Blocks Trump’s Ban on Anthropic’s AI Technology

2026/03/27 18:03
3 min read

TLDR

  • A federal judge in San Francisco issued a preliminary injunction blocking the Pentagon’s ban on Anthropic’s Claude AI
  • Judge Rita Lin called the ban “classic illegal First Amendment retaliation”
  • The dispute started after Anthropic refused to allow Claude to be used for lethal autonomous weapons or mass surveillance
  • Anthropic held 32% of the enterprise AI market as of 2025, ahead of OpenAI at 25%
  • The injunction is paused for seven days to allow the government time to appeal

A US federal judge has temporarily blocked the Trump administration’s ban on government use of Anthropic’s AI technology, pausing measures that the company said could cost it billions in lost revenue.

Judge Rita Lin of the US District Court for the Northern District of California issued the preliminary injunction on Thursday. She put the order on hold for seven days to give the government a chance to appeal.

The case centers on a deal struck in July 2025 between Anthropic and the Pentagon. The contract would have made Claude the first frontier AI model approved for use on classified networks.

Negotiations broke down in February 2026. The Pentagon wanted to renegotiate, demanding Anthropic allow military use of Claude “for all lawful purposes” and without restrictions.

Anthropic refused. The company said its technology should not be used for lethal autonomous weapons or mass domestic surveillance of Americans.

The Defense Department then designated Anthropic as a national security supply chain risk. Anthropic sued in federal court in Washington, DC, on March 9, arguing that Defense Secretary Pete Hegseth had overstepped his authority.

Judge Questions the Government’s Reasoning

A 90-minute court hearing took place in San Francisco on March 24. Judge Lin questioned government lawyers on whether Anthropic was being punished for publicly criticizing the Pentagon.

During the hearing, an Anthropic attorney noted that the Pentagon can review any AI model before deploying it. The attorney added that Anthropic has no way to stop a model from working, change how it operates, or see how the military is using it.

What Both Sides Argued

A government lawyer argued that Anthropic destroyed trust during contract negotiations by trying to dictate Pentagon policies. The lawyer said the government was concerned about the risk of “future sabotage” from Anthropic.

Judge Lin rejected that reasoning. She said the Justice Department had no “legitimate basis” to conclude that Anthropic’s stance on restrictions could lead it to become a saboteur.

Anthropic held 32% of the enterprise AI market as of 2025, ahead of OpenAI at 25%, according to Menlo Ventures. A government-wide ban would have put that position at risk.

The case is Anthropic v. US Department of War, 26-cv-01996, US District Court, Northern District of California.

The post US Judge Blocks Trump’s Ban on Anthropic’s AI Technology appeared first on CoinCentral.
