Twenty years ago, Minority Report already understood why predictive policing would fail

"There is nothing wrong with the system. It's perfect, ”explains precrime division captain John Anderton (Tom Cruise). "I agree. If there is a mistake, it's human, ”replies Danny Witwer (Colin Farrell), a federal agent hunting down errors in that very system. In reality they are both wrong, as will be understood in the course of Minority Report: Steven Spielberg's film - based on a short story by Philip K. Dick - released in Italy on September 27, 2002. Exactly twenty years ago.

At a moment in history when artificial intelligence systems decide some people's futures on the basis of their predicted behavior, the moral question explored in the story and in the film is extremely topical: is it acceptable to arrest someone for a crime they never committed, just because a tool predicted they would?

The issue is addressed in the opening part of the film. "Let's not kid ourselves, we are arresting individuals who have broken no law," Witwer states. To those who reply that the precogs - the three individuals with extrasensory powers who can see crimes before they happen - observe the future and are never wrong, Witwer responds: "It's not the future if you stop it. Isn't that a fundamental paradox?"

This is where Anderton/Tom Cruise enters the debate, confidently asserting that predetermination is something we exploit every day. As he says it, he rolls a ball toward Witwer, who catches it before it falls.

"Why did you take it?"

"Why would it fall"

"Are you sure?"

"Yes."

"But it didn't fall. You caught it. The fact that you prevented it from happening doesn't change the fact that it was going to happen."

Chapter closed, for the moment (though things are obviously not that simple: we will come back to this later). For anyone who has dealt with artificial intelligence in recent years, however, alarm bells start ringing at another very specific moment: when Anderton - again replying to Witwer, who has pointed out that some idolize the precogs as if they were divine beings (another curious parallel with artificial intelligence) - explains: "They are filters for pattern recognition, nothing more."

Pattern recognition, which makes it possible to find correlations within a flood of data, is precisely the capability of deep learning algorithms, including, of course, those used for predictive policing: the artificial intelligence software used by law enforcement agencies in various parts of the world (including Italy) to try to predict in which areas and at what times crimes are most likely to occur, or even where and when a serial robber might strike.

These are refined statistical pattern-recognition tools which, given a sufficient amount of data, can complete even the most difficult tasks in an apparently infallible way. Once again, in short, if there is a flaw it can only be human: for example, an error in the collection or cataloging of the data (an error that is in fact often at the root of so-called "algorithmic bias").
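
To get a sense of how simple the core of such a tool can be, here is a minimal, purely illustrative sketch: the grid, the data and the model below are all invented, and real predictive-policing systems are considerably more sophisticated, but the basic loop - learn correlations from past incident data, then rank locations by predicted risk - is the same.

```python
# Purely illustrative hotspot ranking on SYNTHETIC data: a simple
# classifier learns correlations between past incident counts and
# future incidents, then ranks grid cells by predicted risk.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_cells = 500  # city divided into a grid of cells

# Features per cell: incidents last week, incidents last month.
X = np.column_stack([
    rng.poisson(2, n_cells),
    rng.poisson(8, n_cells),
])
# Invented "ground truth": future incidents correlate with past ones.
y = rng.random(n_cells) < 0.08 * X[:, 0] + 0.02 * X[:, 1]

model = LogisticRegression().fit(X, y)

# Rank cells by predicted risk; the top of this list is what would
# be handed to patrol planners.
scores = model.predict_proba(X)[:, 1]
print("cells flagged for patrol:", np.argsort(scores)[::-1][:10])
```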

But is that really the case? Can the flaw only ever be human, and never lie in the system itself, be it deep learning algorithms or precogs? This question opens onto the very complicated problem of free will (also tackled recently in TV series such as Westworld and Devs). Put very briefly, such a position inevitably implies that, if we could collect and catalog the data perfectly and have it processed by an infallible algorithm (or precog), then it really would be possible to predict the future.

Obviously, it is not so. Even if we deny the existence of free will (and therefore believe that human behavior follows the same inescapable laws of physics as a falling ball), the data and unknowns we would have to compute are simply too many. In the science fiction of Minority Report it can in fact happen that - as explained by Iris Hineman, the unwitting creator of the PreCrime program - the precogs produce a dissenting prediction (which is promptly deleted), and therefore "every now and then those accused of a precrime might, just might, have an alternate future," as Hineman puts it. Moreover, the ability of PreCrime's director, Lamar Burgess, to engineer the frame-up against Anderton, and Anderton's own ability to (partially) resist the precogs' prediction, show that the system itself is fallible: the error need not be human.

Not only that: the very mechanism that brings Anderton face to face with the alleged killer of his son is a classic self-fulfilling prophecy. The character played by Tom Cruise follows the path that leads him into the trap precisely because he is investigating the possibility that he has been set up. It is a science fiction paradox that demonstrates not so much the impossibility of escaping the future as the impossibility of predicting it.

The most surprising aspect, however, is that when it comes to artificial intelligence in the real world, things are not so different. The most striking case concerns the use of predictive policing. In 2013, the algorithm that helped Chicago law enforcement identify potential suspects estimated that one Robert McDaniel had a 99.9% chance of being involved in a shooting, either as perpetrator or as victim.
Since then, McDaniel has indeed twice been the victim of a shooting, in 2017 and in 2020, surviving both. An algorithm worthy of the precogs, then? According to an investigation by The Verge, things went very differently: it was precisely the algorithm's prediction that doomed McDaniel. How is that possible? According to the reconstruction, being flagged by the predictive policing system triggered constant police surveillance of McDaniel, who lived in one of Chicago's most troubled neighborhoods.

Seeing patrol cars so often outside his house, and despite McDaniel never being involved in any particular crime, neighborhood residents began to suspect some kind of connection between McDaniel and the police: for example, that McDaniel might be an informant. As Matt Stroud wrote in his investigation, the algorithm's flagging "caused the very harm its creators hoped to avoid: it predicted a shooting that wouldn't have happened if it hadn't been predicted." Not so different from what happens in Minority Report with Burgess's trap.

The theme of self-fulfilling prophecies, in the reality of predictive policing, also concerns other aspects. These tools - which the European Union may soon ban - tend to create vicious circles in which, for example, Black people are stopped for checks more and more often even if they have done nothing (because "statistically" they are more likely to be criminals), and troubled neighborhoods are patrolled more and more heavily, leading to more recorded crimes and thus to even more patrols. In short, the statistics do nothing but reinforce and justify behavior based on the statistics themselves.
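
This feedback loop can be illustrated with a deliberately crude toy simulation (all numbers invented, loosely inspired by academic analyses of "runaway feedback loops" in predictive policing): two districts with identical true crime rates, where the patrol is always sent to the district with the most recorded incidents, and incidents are only recorded where the patrol happens to be.

```python
import random

random.seed(0)

# Two districts with the SAME true crime rate. Each day the single
# patrol goes where the data says crime is: the district with the
# most recorded incidents so far. But crime is only recorded where
# the patrol is present, so an arbitrary head start feeds itself.
TRUE_RATE = 0.3                # identical underlying reality
recorded = [5, 4]              # a small, arbitrary initial skew

for day in range(1000):
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    if random.random() < TRUE_RATE:  # a crime occurs (same odds everywhere)
        recorded[patrolled] += 1     # ...but is only seen where we look

print("recorded incidents per district:", recorded)
```

With equal true rates, the district that starts out one incident ahead ends up with hundreds of recorded incidents while the other stays frozen at four: the data "confirms" the very patrolling pattern that produced it.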

It is a repressive mechanism based on numbers alone, in which all attention goes to cracking down on the most troubled neighborhoods and on populations left on the margins, and none to the development of those areas, to the reasons why they are troubled in the first place, or to active policies that would allow the problems to be solved.

Once again, something very similar happens in Minority Report, where we are told that just six years before the year in which the film is set (2054) "the murder rate had reached epidemic proportions" and "it seemed that only a miracle could stop the bloodbath". A miracle in the form of precogs, PreCrime and repression, which however lays the groundwork for everything to go back to the way it was as soon as the system is retired.

It is another element in common with our reality, in which we rely on predictive policing, surveillance, facial recognition and every other weapon potentially useful for suppressing crime (while restricting freedoms), without ever assessing and fighting its causes. Between our reality and the science fiction of Minority Report - where, among other things, we also find autonomous cars, smart speakers, personalized ads and more - there is, however, one crucial difference: in the film's Washington, at least, the final decision on whether or not to adopt PreCrime was left to the citizens, through a referendum.
