The Chip Security Act: Separating Fact From Fiction

The Chip Security Act (CSA) requires companies to verify that advanced AI chips remain in authorized locations to prevent diversion to foreign adversaries. For further background on the Act, see our explainer. Opposition to the CSA rests on fundamental misconceptions about what the law requires and how it works.

MYTH #1: “The CSA requires a security backdoor, kill switch, or spyware.”

FACT: The CSA explicitly forbids these mechanisms.

Nvidia says it wants its chips to have “No backdoors. No kill switches. No spyware.”

Neither do we. That's why we support the CSA, which prohibits these vulnerabilities. The bill’s text states that it may not be construed as requiring “any chip security mechanisms that may hinder the capability or functionality of a covered integrated circuit product, such as a kill switch or geofencing mechanism.” 

Ping-based location verification (PBLV) serves as a clear example of how the CSA's requirements can be met without compromising security. This approach measures the travel time of a signal to determine a chip's approximate location within ~50 miles. Because it can be implemented with a software update, it requires no changes to chip hardware. 
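To make the physics concrete, here is a minimal sketch of the core calculation behind PBLV. This is an illustration of the general technique, not the CSA's mandated design; the function name, constants, and the example round-trip time are assumptions for illustration only.

```python
# Illustrative sketch: how a measured round-trip time (RTT) upper-bounds
# a chip's distance from a trusted measuring server. All names and
# values here are hypothetical, not taken from the CSA or any vendor.

SPEED_OF_LIGHT_KM_PER_MS = 299.792  # propagation limit in vacuum
FIBER_FACTOR = 2 / 3                # signals in fiber travel at roughly 2/3 c

def max_distance_km(rtt_ms: float, processing_delay_ms: float = 0.0) -> float:
    """Upper bound on the chip's one-way distance implied by a measured RTT.

    Physics does the verification: however the reply is routed, the
    responder must lie within (one-way time) * (signal speed) of the
    server. Using c instead of the fiber estimate gives a strictly
    conservative bound.
    """
    one_way_ms = max(rtt_ms - processing_delay_ms, 0.0) / 2
    return one_way_ms * SPEED_OF_LIGHT_KM_PER_MS * FIBER_FACTOR

# A reply arriving ~0.8 ms after the ping places the chip within
# roughly 80 km (~50 miles) of the measuring server.
print(f"{max_distance_km(0.8):.1f} km")  # → 79.9 km
```

Note that nothing in this calculation touches the chip's workload: the only input is a timestamp, and the only output is a distance bound.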

PBLV would NOT create a backdoor. A backdoor provides hidden, undocumented system access. But PBLV-enabling software would be transparent and controlled by the chip's owner, using existing authenticated channels for fleet management (the same secure channels Nvidia already uses for remote attestation services). The verification process would operate independently from computational workloads, much like how temperature monitoring runs separately from AI training. As Nvidia itself notes, “software features, controlled by the user, are not hardware backdoors.” 
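One plausible shape for such a transparent, owner-controlled check is an authenticated challenge-response ping. The sketch below is a hypothetical illustration, not Nvidia's attestation protocol: it assumes a shared device key (which in practice might be provisioned through existing attestation infrastructure), and all function names are invented for this example.

```python
# Hypothetical challenge-response ping. A fresh random nonce prevents
# replay; a correct HMAC reply proves this particular keyed device
# answered; the elapsed time bounds its distance. The responder reads
# nothing but the nonce -- no model weights, data, or workloads.
import hashlib
import hmac
import os
import time

def verifier_ping(chip_respond, shared_key: bytes):
    """Send a nonce, time the reply, and check its authenticity."""
    nonce = os.urandom(16)
    start = time.perf_counter()
    reply = chip_respond(nonce)
    rtt_ms = (time.perf_counter() - start) * 1000.0
    expected = hmac.new(shared_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(reply, expected), rtt_ms

# Chip-side responder: computes an HMAC over the nonce and nothing else.
device_key = os.urandom(32)
chip_respond = lambda nonce: hmac.new(device_key, nonce, hashlib.sha256).digest()

authentic, rtt_ms = verifier_ping(chip_respond, device_key)
print(authentic)  # → True
```

The design choice worth noticing is that the responder's entire interface is "HMAC a 16-byte nonce": there is no code path through which the verifier could reach workload data, which is what separates this pattern from a backdoor.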

PBLV would NOT create a kill switch. A kill switch would enable remote deactivation of GPUs. But PBLV-enabling software could not alter performance or disable operations. A PBLV-enabled chip would continue processing AI workloads exactly as designed, even if moved to a different location.

PBLV would NOT enable spyware. Spyware is malware designed to access a device without the user's consent, collecting personal and sensitive information. But PBLV-enabling software would not allow anyone to see what models chips are training, what data they're processing, or what computations they're running. Its sole function would be to confirm a chip's approximate geographic region to counter smuggling. 

MYTH #2: “Location verification creates privacy risks.”

FACT: Location verification collects far less data than companies already gather, and the CSA does not permit government surveillance.

Even though PBLV doesn't create backdoors or kill switches, some worry about the privacy implications of location tracking. But the operational data that companies like Nvidia already collect on their own customers is far more invasive than any location data the CSA would require reporting. Chip operators track power consumption, temperature, and other telemetry for each GPU, and Nvidia actively promotes these extensive monitoring capabilities to its customers. If monitoring power patterns that could reveal AI training schedules is acceptable to boost companies' profits, then verifying that a chip remains within a roughly 50-mile radius is a reasonable measure to protect national security.

Concerns about government surveillance are also unfounded because the CSA puts primary verification responsibility in corporate—not government—hands. Companies receive location data, notifying authorities only if they obtain credible evidence of diversion or tampering. There's no real-time data feed to agencies. The data stays with companies unless there's a problem to report.

MYTH #3: “The CSA will create the Clipper Chip 2.0.”

FACT: The CSA’s location verification differs fundamentally from the Clipper Chip’s decryption backdoor.

The CSA would not recreate the Clipper Chip. The Clipper Chip was a 1990s hardware encryption device with a built-in government backdoor. The device required embedding special decryption keys in physical chips, allowing law enforcement to read encrypted communications. Security experts and privacy advocates argued the backdoor would weaken encryption and enable exploitation by malicious actors. The government abandoned the program amid fierce backlash. Critics argue the CSA repeats these mistakes. 

But these comparisons are misguided. For one, the CSA’s text directly refutes the comparison by explicitly forbidding backdoors like the Clipper Chip’s. And even setting aside the statutory prohibition, any comparison between PBLV and the Clipper Chip breaks down for at least three reasons.

First, implementing PBLV requires no hardware changes. The Clipper Chip required embedding new cryptographic hardware into devices. In contrast, PBLV could be implemented with a software update. 

Second, PBLV-enabling software cannot access data. The Clipper Chip was designed to read the content of private communications. By contrast, software to enable PBLV could not access computational content or monitor workloads.

Third, the CSA creates a transparent, company-controlled process, not a secret government program. The Clipper Chip was a classified program built on secret algorithms. The CSA, by contrast, requires unclassified reporting to Congress, and companies would control the entire verification process themselves, notifying the government only after confirming a diversion.

MYTH #4: “Location verification creates new vulnerabilities.”

FACT: Software-based verification leverages existing infrastructure.

Location verification leverages infrastructure already built into AI chips. Modern GPUs include remote attestation capabilities that allow chips to prove their configuration and state to remote parties. Nvidia markets this feature for confidential computing; major cloud providers use it to ensure workload integrity. PBLV would simply extend this existing infrastructure to include geographic verification through a software update, introducing no new hardware vulnerabilities.

MYTH #5: “The technology isn’t feasible.”

FACT: Nvidia itself admits feasibility, and calls for further study are a delay tactic.

Nvidia’s own statements refute industry questions about technical feasibility. A letter from technology and semiconductor trade associations argues the technology is untested and requires years of additional review.

But Nvidia has already said that software-based location verification is “technically feasible to develop.” We’ll take their word for it. And we needn't rely solely on their admission: Working prototypes already exist on Nvidia H100 chips. Similar technologies have operated for decades in other contexts, from GPS systems to network latency measurements. Chips already transmit extensive telemetry about temperature and power consumption; adding location verification to this existing data stream requires minimal technical change.

Even so, the trade associations claim that the technology must be “validated at a global deployment scale” before the Chip Security Act can be implemented. This creates a Catch-22: They oppose deployment, then cite lack of deployment as evidence it won't work. Meanwhile, chips continue flowing to adversaries. This proposed delay serves corporate interests at the expense of national security.

The economic context may help to explain industry resistance: Companies profit from every chip sold, regardless of where it ends up. As White House AI Czar David Sacks explains, companies “turn a blind eye” to chip smuggling because “it’s profitable.” Industry objections should not delay action when those same companies profit from the problem.

Given the massive scale of ongoing chip diversion, delay is dangerous. An estimated 140,000 chips—worth $5-7 billion—were smuggled to China in 2024 alone. A feasibility study for technology that Nvidia admits is feasible constitutes unnecessary delay.

MYTH #6: “This will hurt U.S. exports and competitiveness.”

FACT: Verification enables expanded exports by resolving the security dilemma.

Verification unlocks responsible exports by solving the ally-access vs. diversion tradeoff. The Information Technology Industry Council claims that tracking requirements could “make U.S. semiconductor companies less competitive.” The Software & Information Industry Association similarly raises competitiveness concerns.

This gets things exactly backwards. Without verification, we face a dilemma that strengthens adversaries regardless of which path we choose.

Option one: Restrict exports broadly. When we can't track chips, the only alternative for combating diversion is to block shipments to regions with smuggling risk. While China currently lacks the capacity to meet global demand, these restrictions still push other nations to explore alternatives. Broad export controls give them incentives to invest in Chinese technology development rather than rely on restricted American supply. Each restriction accelerates China’s efforts to build indigenous capacity and strengthens its narrative that the U.S. is an unreliable partner.

Option two: Allow widespread smuggling. Prioritizing market access means more chips flow through smuggling networks to adversaries.

Verification breaks this deadlock. Location tracking allows expanded exports to trusted partners while preventing diversion. Allied nations stick with superior American technology because they can reliably access it. American companies grow their legitimate markets. China loses its smuggling pipeline. Rather than forcing a choice between security and competitiveness, verification delivers both.

MYTH #7: “Secure facilities have no compliance options.”

FACT: The CSA explicitly accommodates diverse operational environments.

The CSA provides explicit non-networked compliance paths for air-gapped facilities. The trade associations argue that CSA compliance is impossible because “many data centers operate in an ‘air-gapped’ environment.” But the CSA provides multiple compliance paths, including “on-site audits or inventories at the end-user's approved destination” and “certifications by a U.S.-headquartered entity, or its subsidiaries” that maintain secure control of the chips. These alternatives ensure various operational environments can comply without network connectivity.

The Bottom Line

The CSA verifies where chips are, not what they compute. It bans backdoors, kill switches, and spyware. It offers flexible compliance options. The technology exists, industry concedes feasibility, and the diversion problem costs billions while strengthening adversaries. The choice is clear: implement verification or watch American technology power threats against American interests.
