An AI system trained in California operates globally. A gene-edited embryo in China carries modifications that will propagate through future generations wherever its descendants live. Sulfur particles released into the atmosphere above one country alter the climate for every country. A synthetic pathogen engineered in any laboratory is a risk to every population. The technologies the book explores — and the new ones that have emerged since — are global by nature. The governance systems meant to manage them are national by design.
International governance works tolerably well in some domains. Nuclear non-proliferation, while imperfect, has prevented the worst outcomes for decades. Trade agreements coordinate economic policy across borders. Climate accords, however inadequate, establish shared frameworks. But emerging technologies resist these models for several reasons.
Speed. Diplomatic frameworks take years to negotiate. AI capabilities advance in months. By the time an international agreement on AI governance is finalized, the technology it addresses may be two generations old.
Diffusion. Nuclear weapons require enrichment facilities that can be detected by satellites. AI models require only computing hardware and data — both globally distributed and increasingly accessible. Synthetic biology tools are becoming cheaper and more portable. The material control mechanisms that work for nuclear technology do not translate to these domains.
Fragmentation. The US, EU, and China have fundamentally different approaches to technology governance. The EU prioritizes rights and precaution (the AI Act). The US prioritizes innovation and market flexibility. China prioritizes state control and industrial policy. These differences are not bugs to be resolved — they reflect genuine differences in values, political systems, and strategic interests. Harmonization may be impossible; coordination is difficult but necessary.
Competitive dynamics. Each major power fears that regulating its own technology sector will hand an advantage to competitors who do not regulate theirs. This creates a race to the bottom in governance — or at minimum, a reluctance to act unilaterally on rules that might constrain domestic innovation.
Everyone Has a Role extends beyond individuals to nations and institutions. The book argues that technology governance cannot be left to any single group — not to scientists, not to corporations, not to governments. The international version of this argument is that governance cannot be left to any single country or bloc.
Responsible Innovation in Practice offers an approach that does not depend on international agreements (which may never come). It argues for embedding responsible practices into the innovation process wherever it occurs — through professional norms, institutional policies, and industry standards that cross borders even when regulation does not.
Dual-Use Research and Biosecurity — developed through the book's discussion of Inferno and gain-of-function research — provides a model for thinking about technologies where the knowledge itself is the risk. Biosecurity governance has developed precisely because pathogens do not respect borders, and the lessons (both successes and failures) are transferable to other domains.
Risk and Innovation helps frame the trade-off. The goal is not to eliminate risk — that would mean eliminating innovation. The goal is to manage risk in ways that are proportionate, transparent, and accountable across jurisdictions. This is extremely difficult. It is also necessary.