"Why does it feel like nobody asked me about any of this?"

Gene editing. Autonomous weapons. Surveillance infrastructure. AI systems that shape what you see, think, and buy. Geoengineering proposals that would alter the atmosphere. These technologies affect everyone. Almost none of them were developed with meaningful public input. The feeling that nobody asked is not paranoia — it is an accurate description of how technology governance currently works.

Why This Question Is Hard

The democratic deficit in technology governance is not a conspiracy. It is a structural problem with multiple causes.

Speed is one factor. Technology development moves faster than democratic deliberation. By the time a legislature understands a technology well enough to regulate it, the technology has already been deployed, markets have formed around it, and changing course is expensive and politically difficult. This is the Collingridge dilemma in its political form.

Expertise is another. Many emerging technologies are genuinely difficult to understand. The public cannot meaningfully participate in decisions about gain-of-function research governance or AI alignment if the underlying concepts are inaccessible. This creates a dependency on experts — who have their own interests, biases, and blind spots.

Capital shapes the landscape. Technologies are developed by companies that answer to investors, not to the public. The decision to build a frontier AI model, to pursue heritable gene editing, or to deploy facial recognition is made in boardrooms, not at the ballot box. Regulation, where it exists, is reactive — it responds to harms that have already occurred rather than shaping what is developed in the first place.

And responsibility is diffuse. No single decision-maker chose the current technological landscape. It emerged from millions of individual decisions — by researchers, investors, engineers, regulators, and consumers — none of whom were thinking about the cumulative effect. The result is a world that nobody exactly chose but that everybody inhabits.

What the Book Brings to This

Everyone Has a Role is the book's most direct response to this feeling. It argues that technology governance is not just for experts, policymakers, and corporate leaders. Parents, teachers, voters, consumers, and community members all have legitimate stakes in how technology is developed and deployed — and more power to influence outcomes than they typically realize.

Responsible Innovation in Practice offers a framework that goes beyond regulation. It argues for embedding public deliberation into the innovation process itself — not as an afterthought, but as a core component. This means bringing diverse voices into technology development before products are launched, not after harms are discovered.

Permissionless Innovation names the dynamic directly. The ethos of "move fast and break things" — building and deploying without waiting for permission or consensus — has produced extraordinary innovation and extraordinary disruption. The book does not argue against innovation. It argues that the "permissionless" part has consequences, and that those consequences are borne disproportionately by people who had no say.

Don't Panic — the book's closing argument — is relevant here too. The feeling of powerlessness in the face of technological change can lead to disengagement, which is itself a form of abdication. The book's message is that engagement, even imperfect and partial, matters — that the alternative to expert-only governance is not ignorance but participation.

Explore Further