In a bold move that has stirred the tech landscape, President Donald Trump has directed all federal agencies to cease their use of AI technology from Anthropic, a prominent artificial intelligence developer. The unexpected order, announced on Truth Social, follows Anthropic’s refusal to grant the US military unrestricted access to its AI tools, igniting a significant conflict between the tech firm and the White House.
The Escalating Tensions
The clash unfolded over several days, with Anthropic CEO Dario Amodei and US Defence Secretary Pete Hegseth engaged in heated exchanges. Hegseth’s assertion that Anthropic poses a “supply chain risk” would mark a troubling precedent: it would be the first time a US company has publicly faced such a designation. Anthropic has made clear that it intends to challenge the label in court, arguing that it is not only legally dubious but would also set a perilous benchmark for future government negotiations.
The tensions intensified as Anthropic voiced its concerns about potential military applications of its AI technology, specifically mass surveillance and the deployment of fully autonomous weapons. Hegseth and the Pentagon have insisted that Anthropic must comply with “any lawful use” of its products. Despite the pressure, Anthropic has stood firm on its principles, stating: “No amount of intimidation or punishment from the Department of War will change our position on mass domestic surveillance or fully autonomous weapons.”
Trump’s Directive and Its Implications
Trump’s announcement means that Anthropic’s tools will be phased out of government operations within six months. The implications extend beyond just Anthropic, as other firms contracting with the military may also be forced to sever ties with the AI developer. Prior to this directive, Anthropic had expressed a willingness to facilitate a smooth transition should the Pentagon opt to discontinue using its technology.

In his social media commentary, Trump warned Anthropic to “get their act together” during the transition or face severe repercussions, hinting at significant civil and criminal consequences. The rhetoric underscores the high stakes involved, both for Anthropic and for the broader tech industry.
Industry Reactions and Broader Implications
Notably, the unfolding drama has drawn support for Anthropic from rival executives, including OpenAI’s Sam Altman. In an internal memo, Altman expressed solidarity with Amodei, stating that OpenAI shares similar concerns regarding the military’s use of AI for questionable purposes. While OpenAI has reached an agreement with the Pentagon for its AI models, the contrasting approaches of these two companies highlight a growing divide within the sector.
The recent developments have raised eyebrows across the tech landscape, with a former Department of Defence official noting that Anthropic’s reputation may actually have benefited from the controversy. “This is great PR for them and they simply do not need the money,” the official remarked, suggesting that Anthropic’s financial health remains robust despite the loss of the military contract, worth $200 million (£149 million).
A Fight for Principles in AI
As this saga continues, it is clear that the battle between Anthropic and the US government is about more than just business. It encapsulates a significant ideological struggle over the role of technology in warfare and surveillance, raising fundamental questions about ethics in AI development and deployment.

Why it Matters
The fallout from Trump’s directive and the ongoing confrontation with Anthropic could have lasting consequences for AI regulation in the United States. As companies weigh government demands against ethical considerations, the outcome of this dispute may set critical precedents for how AI technologies are utilised in military and civilian contexts. Its resolution will not only determine Anthropic’s future but also shape the broader dialogue around AI governance, accountability, and ethical use in society.