Directed by Steven Spielberg | Based on the short story by Philip K. Dick
In the year 2054, Washington D.C. has virtually eliminated murder thanks to the "Precrime" unit. Three psychics, known as "precogs," float in a pool and receive visions of murders before they happen, allowing a specialized police force to arrest would-be killers before they act. Chief John Anderton is the unit's most devoted believer, until the precogs predict that he himself will commit a murder in thirty-six hours. Suddenly on the run from the system he championed, Anderton discovers that it is built on a foundation far more fragile than anyone wants to admit.
This chapter discusses key plot elements from Minority Report, including the nature of the film's central twist. The film is a taut, intelligent thriller that rewards viewing, so consider watching it first if you have not. But the ideas it opens up are worth exploring regardless.
Minority Report was released in 2002, but the technologies it explores have only grown more relevant since. The chapter uses the film as a springboard for examining predictive algorithms, machine learning, big data, and the growing use of surveillance technologies to anticipate and prevent crime. While we do not have psychics floating in pools, we do have algorithms that claim to predict criminal behavior, software that flags potential offenders, and surveillance systems that monitor vast populations in real time. The questions the film raises about these technologies are no longer science fiction.
At the center of the chapter's analysis is a deceptively simple problem: what does it mean to punish someone for something they have not yet done? The Precrime system in the film operates on the assumption that its predictions are infallible. But the film reveals that they are not. The "minority report" of the title refers to the fact that the three precogs do not always agree. When one sees a different future from the other two, that dissenting vision is suppressed. The system's authority depends on the illusion of certainty, and that illusion is maintained through a convenient lie.
The chapter connects this to real-world developments in predictive policing and algorithmic decision-making. Algorithms trained on historical crime data inevitably absorb the biases embedded in that data. If certain communities have been disproportionately policed and arrested in the past, the algorithms will flag those same communities as high-risk in the future, creating a self-reinforcing cycle of surveillance and suspicion. The film's vision of a system that appears objective but is actually deeply flawed turns out to be uncomfortably close to reality.
Beyond prediction, the chapter explores the broader implications of ubiquitous surveillance and big data. Minority Report imagines a world where personalized advertising follows you through public spaces, retinal scanners track your every movement, and privacy has essentially ceased to exist. This was speculative in 2002. Today, much of it describes the world we already inhabit, from facial recognition systems deployed in cities worldwide to the vast quantities of personal data harvested by tech companies. The chapter asks what happens to a society when everything is recorded, when every action is tracked, and when the very concept of a private life begins to dissolve.
Minority Report's concerns about surveillance and algorithmic control connect directly to Ghost in the Shell, which imagines a world of pervasive digital surveillance and hacking. The theme of convenient lies that sustain flawed systems echoes through Never Let Me Go. And for more on the promises and perils of artificial intelligence, see Ex Machina.