The first commercial biological computer shipped in 2025. It is the size of a shoebox, priced at roughly thirty-five thousand dollars, and runs on approximately 800,000 living human neurons grown from induced pluripotent stem cells and plated onto a grid of electrodes. You can rent time on it over the cloud. The marketing term for this is “Wetware-as-a-Service.” It is not a metaphor.
When the book was written, the idea of computing on biological substrates was a curiosity. DNA data storage existed as a proof of concept, and academic neuroscience had been stimulating neuronal cultures for decades. None of it looked like computing in any useful sense. Since then, three distinct lines of work have matured enough to demand a category of their own.
Neuron-silicon hybrid systems are the most commercially visible. In 2022, the Australian company Cortical Labs published “DishBrain” — a culture of 800,000 living neurons that learned, over a few minutes, to play the video game Pong. In March 2025, Cortical Labs launched the CL1, described as the world’s first commercially available biological computer. Researchers can buy the hardware outright or rent cloud access to remote units. The Swiss startup FinalSpark runs its own neuron-powered platform, using dopamine and other neurotransmitters as chemical reward signals during training. These are real, paying-customer products. The devices are small and the applications are narrow, but the category is no longer speculative.
Organoid intelligence (OI) is the research programme that treats cultured human neural tissue — particularly the brain organoids and assembloids now routine in neuroscience labs — as a substrate for biological computation. The term was introduced in 2023 by a group at Johns Hopkins led by Lena Smirnova. OI is distinct from artificial intelligence (which runs on silicon) and from brain-computer interfaces (which connect existing brains to computers, rather than building new computers from neural tissue). The claim is not that organoids will outperform neural networks at a given task. The claim is that the energy efficiency, learning dynamics, and embodiment profile of living neurons are different enough to merit their own line of research — and that the substrate matters in ways that the silicon/biological distinction has historically elided.
DNA data storage is the third line, and the one closest to market. The alliance formed in 2020 by Microsoft, Twist Bioscience, Illumina, and Western Digital accompanied working demonstrations of writing and reading arbitrary digital data encoded as synthetic DNA. Catalog, a startup, has demonstrated DNA-based computation on the same substrate. The worldwide market is still small — under $125M in 2024 estimates — but the density numbers are extraordinary: theoretical DNA storage approaches an exabyte per cubic millimetre. Whether this ever becomes cost-competitive with tape for cold archival storage is genuinely uncertain. It is already technically real.
The most interesting dimension is the governance vacuum. Biological computing falls through the gaps between existing regimes. AI regulation treats software-on-silicon; it does not have hooks for hardware that is literally alive. Biomedical regulation handles the organoid source tissue but has not developed a framework for its commercial deployment as infrastructure. BCI regulation handles connections to existing brains, not the creation of new neural systems. A company can currently sell rack-mounted living human neurons as a compute service and encounter effectively none of the oversight that would attend any of the adjacent activities done separately. This is the book’s Permissionless Innovation pattern applied to a substrate nobody drafted the rules for.
The environmental argument for biological computing is both genuine and overstated. Living neurons use vastly less energy per operation than silicon — by some reckonings, the human brain runs at about twenty watts for capabilities that would require megawatts of conventional compute. Whether that efficiency translates at scale to any useful application is not established. The case that wetware could be the energy-responsible alternative to the AI data centre stack has been made; it is not yet proven. Given the prominence of AI data centre energy demand in current policy conversations (see Fusion, SMRs, and the Energy Stack), the incentives to overclaim are significant.
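The back-of-envelope arithmetic behind the efficiency claim is worth making explicit. The twenty-watt brain figure is from the text; the one-megawatt figure for "comparable" conventional compute is an illustrative assumption, since no agreed task-for-task equivalence exists — which is precisely why the headline ratios should be read with caution.

```python
# Back-of-envelope energy comparison for the wetware efficiency claim.
# brain_watts comes from the text; datacentre_watts is an illustrative
# assumption, not a measured task-for-task equivalence.
brain_watts = 20
datacentre_watts = 1_000_000  # assumed 1 MW of conventional compute

ratio = datacentre_watts / brain_watts
print(f"Nominal efficiency gap: {ratio:,.0f}x")  # Nominal efficiency gap: 50,000x
```

The fragility of the comparison is visible in the code: change the assumed denominator and the headline multiplier moves by orders of magnitude, which is exactly the overclaiming risk the paragraph above describes.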
The moral-status question threads through all of this. The CL1’s 800,000 neurons are human neurons. They were derived, originally, from a donor’s stem cells. The company treats the product as hardware — a computing substrate, not a research sample — and there is no regulatory apparatus that says otherwise. This is where biological computing and the organoids question become inseparable: the “what are we computing on?” question is the same question the organoid ethicists are asking, reframed as a commercial matter.
Applying the three-level rule honestly:
What the book directly addresses. The book’s Technological Convergence framework (developed in the Transcendence chapter) is the best-suited tool the book provides. Biological computing is a textbook convergence case: biology × computing × materials science, with governance that was built for each domain separately. The book’s Hype vs. Reality framework also applies — OI in particular has attracted more press than its current capabilities warrant, and the discipline of counting assumptions matters here more than usual.
What the frameworks suggest when extrapolated. The book’s AI and Superintelligence frameworks ask the right questions about capability, manipulation, and the Could We? Should We? pattern — but they assume silicon substrates. Applied to wetware, the questions stretch. What does AI alignment mean when the system is biological? What does manipulation mean when the substrate can experience? What does shutdown mean for a computer that is also, in some limited sense, alive?
Where the frameworks reach their limits. The book’s treatment of mind uploading and the Transcendence question assumes that substrate either is or is not consciousness-bearing, and that the question matters chiefly for the identity of the entity being uploaded. Biological computing inverts this: the substrate is consciousness-candidate tissue from the start, and the identity question becomes whose tissue and what relationship does that impose on the commercial entity running it. This is genuinely new territory. The book’s frameworks invite the question but do not settle it.
Cronenberg’s eXistenZ (1999) is the closest cinematic sibling and is already part of Claude’s film recommendations. The Matrix is the obvious reference but so over-invoked in this territory that it flattens rather than sharpens the question. Possessor is again relevant, for the substrate-of-experience angle.