
The Pentagon Says Claude Has a “Soul” — And That’s the Problem

2026/03/12 21:47

TLDR

  • The Pentagon’s CTO said Anthropic’s Claude AI has built-in policy preferences that could compromise military effectiveness.
  • Anthropic became the first American company ever labeled a supply chain risk by the Defense Department.
  • Defense contractors must now certify they do not use Claude in Pentagon-related work.
  • Anthropic sued the Trump administration Monday, calling the move “unprecedented and unlawful” and warning hundreds of millions in contracts are at risk.
  • Despite the ban, Palantir CEO Alex Karp confirmed his company is still using Claude for U.S. military operations.

The Pentagon designated Anthropic as a supply chain risk earlier this month, making it the first American company to receive that label. The designation has historically been used against foreign adversaries.

Defense Department CTO Emil Michael explained the decision Thursday in an interview on CNBC’s “Squawk Box.” He said Claude’s built-in “constitution” — a document Anthropic uses to shape the model’s behavior — creates policy preferences that could affect how the AI performs in military settings.

Anthropic published the most recent version of Claude’s constitution in January 2026. The company says it plays a “crucial role” in training its models and “directly shapes Claude’s behavior.”

The supply chain designation means defense contractors and vendors must now certify they are not using Claude in any work they do for the Pentagon.

Anthropic was founded in 2021 by researchers who left OpenAI. It has built a strong enterprise business, including early contracts with the Defense Department.

Anthropic pushed back hard on the Pentagon’s move. On Monday, the company filed a lawsuit against the Trump administration, calling the supply chain designation “unprecedented and unlawful.”

In the filing, Anthropic said it is being harmed “irreparably” and that hundreds of millions of dollars in contracts are now in doubt.

Pentagon Denies Active Outreach to Companies

Michael dismissed Anthropic’s claims that the government was actively contacting companies and warning them not to use Claude. He called those claims “rumors.”

He also acknowledged that the transition away from Claude will take time. The DOD has a transition plan in place, he said, noting that removing deeply integrated AI tools is more complex than deleting a desktop application.

Claude Still in Use for Military Operations

Despite the designation, Claude is still being used in some military contexts. CNBC previously reported that the AI was used to support U.S. military operations in Iran.

Palantir CEO Alex Karp confirmed Thursday that his company, one of the largest defense contractors in the U.S., is still using Claude.

The post The Pentagon Says Claude Has a “Soul” — And That’s the Problem appeared first on CoinCentral.

Disclaimer: Articles republished on this site come from public platforms and are provided for informational purposes only. They do not necessarily reflect the views of MEXC. All rights remain with the original authors. If you believe any content infringes third-party rights, contact crypto.news@mexc.com for removal. MEXC makes no guarantees regarding the accuracy, completeness, or timeliness of the content and is not responsible for any actions taken based on the information provided. The content does not constitute financial, legal, or other professional advice, nor should it be considered a recommendation or endorsement by MEXC.