Trump Bans Anthropic AI in Federal Agencies — Pentagon Flags Claude as Security Risk

The U.S. government has taken unprecedented action against domestic AI firm Anthropic, directing all federal agencies to immediately stop using its AI model Claude and officially designating the company a supply chain risk to national security, a classification historically reserved for foreign adversaries like Huawei.

The standoff reached a critical point on February 28, 2026, when President Donald Trump announced on Truth Social that all federal agencies must “IMMEDIATELY CEASE all use of Anthropic’s technology.” He allowed a six-month phase-out period for departments such as the Department of War (DoW), which were already heavily integrated with the company’s products.


Within hours, Defense Secretary Pete Hegseth followed with his own declaration on X, formally designating Anthropic a Supply-Chain Risk to National Security and announcing that “no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic”.

The dispute centers on two narrow carve-outs Anthropic insisted on within its usage policy for Claude: it would not permit the model to be used for mass domestic surveillance of Americans or for fully autonomous weapons.

The Pentagon demanded full, unrestricted access to Claude for “all lawful purposes,” but Anthropic’s CEO Dario Amodei refused, stating the company “cannot in good conscience accede” to those demands.

Anthropic had been the first frontier AI company to deploy models on the U.S. government’s classified networks, operating under a $200 million DoW contract since June 2024. For months, both parties engaged in private negotiations that ultimately broke down. In a last-ditch effort, the Pentagon issued Anthropic an ultimatum: comply by 5:01 PM ET on Friday or face being blacklisted.

Anthropic alleges that a Pentagon contract offer framed as a compromise “was paired with legalese that would allow those safeguards to be disregarded at will”.

Amodei argued that today’s frontier AI models are not reliable enough for fully autonomous weapons systems, a position he says protects American warfighters and civilians, and that mass surveillance would constitute a fundamental violation of Americans’ civil rights.

Anthropic has vowed to challenge any supply chain risk designation in court, arguing the action is legally unsound under 10 USC 3252, which limits the designation’s scope strictly to Department of War contract use, not broader commercial relationships. This means individual customers, API users, and non-DoW contractors remain completely unaffected by the designation.

However, the broader industry impact could be severe. Anthropic depends on cloud computing infrastructure from Amazon, Microsoft, and Google — all of which hold defense contracts.

A strict interpretation of Hegseth’s language banning any entity that “does business with the military” from working with Anthropic could theoretically threaten those cloud relationships. Legal experts have warned the designation sets a “dangerous precedent,” noting it waters down a tool historically reserved for entities tied to foreign governments.

Trump warned Anthropic of “major civil and criminal consequences” if it fails to cooperate during the phase-out period. Anthropic has stated it remains committed to supporting lawful national security use cases and will work to ensure a smooth transition for U.S. troops and ongoing military operations.

The company maintains that no amount of government pressure will shift its position on autonomous weapons or domestic surveillance.


The post Trump Bans Anthropic AI in Federal Agencies — Pentagon Flags Claude as Security Risk appeared first on Cyber Security News.

