Commercial tech and AI companies entering the federal market face a hard lesson: Federal contracts do not work like commercial software licenses. GSA’s proposed AI clause is where that lesson gets expensive.
If your company sells software or AI-powered services commercially, your deal model is built on familiar assumptions: You license your product, you retain your IP, you use customer data to improve your models, and your terms of service govern the relationship. The federal government buys differently, and the gap between commercial and government expectations is wider for AI than it has ever been for any other technology category.
GSA’s proposed AI clause, GSAR 552.239-7001, makes that gap explicit. For companies using or planning to use a GSA Multiple Award Schedule to reach federal customers, the clause creates obligations that are structurally incompatible with standard commercial software licensing. Here are some of those potential conflicts and what you should consider before you sign a contract with a government customer.
The Commercial Model Does Not Transfer
The federal government has spent 30 years trying to buy commercial software on commercial terms with reasonable success. For AI, GSA has concluded that framework does not work. The proposed clause explicitly overrides commercial licensing terms — including end-user license agreements (EULAs) and terms of service — and claims ownership and use rights that no commercial AI vendor currently offers in its standard licensing.
Three specific conflicts surface for almost every commercial AI company:
- Your IP ownership model
Commercial licensing means you own your model and your outputs; your customer licenses the results. The government’s position is different. The clause claims full ownership of all “custom developments,” i.e., any modifications, customizations, configurations, or enhancements made to your AI system during contract performance. If you configure your model to use government terminology, connect it to their data sources, or tune its outputs for a specific agency use case, the resulting implementation could belong to the government. As a result, you may not be able to use it commercially without the contracting officer’s express authorization. If your product roadmap assumes you can take learnings from a government deployment into your commercial product, that assumption needs to be revisited now.
- Your data use model
Most commercial AI services improve over time using interaction data. The government does not permit this. The clause bars both you and your AI vendors from using any government-submitted data or generated outputs to train, fine-tune, or improve your model. It also imposes “eyes off” data handling and data localization requirements. For AI companies whose product improvement cycle depends on training data from customer interactions, this is not a paperwork issue. It likely requires either a dedicated, architecturally segregated government deployment or a fundamental change in how you handle government data from day one.
- Your supply chain
Here is the issue that catches commercial tech companies most off guard: The clause holds you responsible for the compliance of any entity that “directly or indirectly provides, operates, or licenses” an AI system you use in performance — whether or not that entity is your subcontractor. If your product runs on a foundation model from a major AI lab, that lab is likely a service provider under the clause. If the lab’s standard terms allow it to use API call data for model improvement, as many do, you are exposed, even if you have no direct contract with the lab and no way to compel its compliance.
Practical scenario: A company uses OpenAI’s API to power its document analysis product. Under the proposed clause, OpenAI is a service provider. The company is responsible for ensuring OpenAI does not use government-submitted documents to train its models. Whether OpenAI will agree to those restrictions is a question that needs an answer before the first government customer submits a document, not after.
The “American AI” Problem — Especially for International Companies
The clause prohibits use of AI components “manufactured, developed, or controlled by non-U.S. entities” and requires use of only “American AI Systems” — a term the clause does not define. The ambiguity is significant given how interconnected global AI supply chains are. A U.S. company whose foundation model incorporates open-source code with international contributors, uses cloud infrastructure with non-U.S. staff, or employs a third-party model developed in the UK or Canada faces real uncertainty about whether its AI stack qualifies.
For international companies entering the U.S. federal market — allied defense technology companies in particular — this provision requires specific legal analysis before SAM.gov registration, not after award. The Trade Agreements Act already provides a framework for allied-country products on GSA Schedules, but whether the “American AI” requirement will be interpreted consistently with the TAA framework is an unresolved question that GSA has not addressed.
What This Means Practically
The federal market is large, growing, and genuinely accessible to technology companies that have not traditionally pursued government work. Federal AI spending more than doubled from 2023 to 2024. The opportunity is real.
But the rules are different — on IP ownership, data use, domestic sourcing, and supply chain responsibility — and they are not negotiable in the way commercial terms are. The companies that compete successfully in this market over the long term are those that understand these rules before they sign their first contract, not those that discover them during their first compliance review.
If you are a commercial technology or AI company considering a GSA Schedule, an international company evaluating the U.S. federal market, or an existing schedule holder that has not yet assessed your AI exposure, the right time to engage on these questions is now, before the Refresh 32 mass modification arrives with a 60-day clock.
If you have questions about how federal procurement rules affect your AI product or your plans to enter the federal market, please contact Nathaniel Greeson or Aron Beezley.
