"If an AI creates something beautiful, who does it belong to?"

An AI generates a stunning image from a text prompt. A musician uses AI to compose a symphony. A novelist uses an LLM to write chapters of a book. A graphic designer loses their job to a tool that can produce in seconds what took them hours. In each case, the question arises: Who is the author? Who owns the result? And is the current legal and moral framework for creative ownership remotely equipped to answer?

Why This Question Is Hard

Copyright law, in most jurisdictions, requires a human author. The US Copyright Office has ruled that purely AI-generated images cannot be copyrighted. But the boundary between "AI-generated" and "AI-assisted" is blurry and getting blurrier. A person who writes a carefully crafted prompt, iterates through dozens of variations, makes creative choices about composition and style, and curates the final output is exercising creative judgment. At what point does that judgment constitute authorship?

The training data problem is equally thorny. AI models that generate art were trained on billions of images, texts, and musical compositions created by humans. The creators of that training data were overwhelmingly not compensated, not credited, and not consulted. The legal question — whether training an AI model on copyrighted works constitutes fair use or infringement — is being litigated in courts worldwide. The philosophical question — whether it is morally acceptable to build commercial products on the unconsented labor of millions of creators — sits underneath it.

The labor displacement dimension makes this more than an abstract debate. Illustrators, voice actors, copywriters, and designers are watching their livelihoods erode as AI tools make their skills commercially reproducible at a fraction of the cost. The counterargument — that AI democratizes creative tools, allowing people without formal training to produce professional-quality work — is real but cold comfort to the professionals it displaces.

And there is a deeper question about what we value in creative work. If a poem moves you, does it matter whether a human or a machine wrote it? If an image is beautiful, does the absence of human experience behind it diminish its beauty? The book's argument in The Role of Art and Culture — that art is how we process technological change — suggests that human intent, experience, and struggle are part of what makes art meaningful. But this is a contested claim, not an obvious truth.

What the Book Brings to This

The Role of Art and Culture is the starting point. The book argues that science fiction films matter because they are how we collectively work through our fears and hopes about technology. If the art that helps us navigate technological change can itself be produced by the technology, we are in recursive territory — and the question of whether AI-generated art can serve the same function as human-created art is genuinely open.

Power, Privilege, and Access provides the equity lens. The companies that control the most powerful generative models control a new means of cultural production. The artists whose work trained those models received nothing. The users who benefit from cheap creative tools are, in many cases, the clients who used to pay human artists. The redistribution is from creative workers to technology platforms and their users — a transfer that The Man in the White Suit's framework predicted with precision.

Corporate Responsibility asks what obligations the companies that built these tools have — to the artists whose work was used in training, to the workers being displaced, and to the cultural ecosystem that AI-generated content is disrupting. So far, the answer has largely been: the same obligations they give themselves, which is to say, few.

Could We? Should We? — the book's central thread — applies at the civilizational level. We can build machines that produce art. Should we? Under what conditions? With what protections for the people who are affected? These are not questions that market forces will answer well, because market forces optimize for cost and convenience, not for the health of a culture.

Explore Further