A sensational claim by US researchers recently blew up across the Internet. They stated that they had created and trained a neural network capable of determining a person’s propensity for criminal activity from a photograph. In other words, to understand whom you are dealing with, you only need to analyze their photo. Naturally, the developers faced a lot of questions.
However, they have not yet been able to answer them. The press release published by Nathaniel Ashby and Ruzbet Sadeghyan has already been removed from the Harrisburg University website under a flurry of criticism: it contained too many omissions and vague hints without specifics. The authors’ statements about their methodology for determining whether a person will become a criminal were called into question, and they were accused of an immoral attitude toward the human person.
From the minimal information that could be extracted from the now-removed release, it is clear that it describes a deep-trained neural network. Allegedly, it can identify the smallest (even imperceptible) details in high-quality photographs of subjects that supposedly indicate the inclinations of a given individual. A corrected version of the release is reportedly to be posted on the site once the inaccuracies are fixed. In theory, however, building such a neural network is more than possible.
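The release reveals nothing about the actual methodology, so the following is only a generic sketch of the category of technique described: a supervised classifier trained on labeled images. Everything here, from the toy data to the single logistic unit standing in for a deep network, is an assumption for illustration, not the researchers’ model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "photos": 8x8 grayscale images flattened to 64 features.
# An artificial, easily learnable signal separates the two classes;
# real face data would carry no such clean label-correlated feature.
n = 200
X0 = rng.normal(0.3, 0.1, size=(n, 64))  # hypothetical class 0
X1 = rng.normal(0.7, 0.1, size=(n, 64))  # hypothetical class 1
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(n), np.ones(n)])

# A single logistic unit trained by gradient descent -- the simplest
# stand-in for "a deep neural network"; a real system would use a CNN.
w = np.zeros(64)
b = 0.0
lr = 0.5
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output in (0, 1)
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient of log-loss w.r.t. w
    b -= lr * np.mean(p - y)                 # gradient w.r.t. bias

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = float(np.mean(preds == y))
print(f"training accuracy on the toy data: {accuracy:.2f}")
```

Note the obvious caveat the sketch makes visible: the classifier learns whatever separates the training labels, so if the labels encode bias, the model reproduces it while still reporting high accuracy.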
If such a neural network actually exists, the world as we know it may run into a wall of mistrust: it would become possible to judge a person simply by looking at their face, and the saying “written all over his face” would take on literal meaning. Such developments would undoubtedly interest intelligence services, the police, and even large companies with budgets big enough to adopt the technology for personnel work.
The widespread introduction of “criminal-recognition technology” would be a breakthrough in the field of security. However, it is worth remembering what errors, which can never be completely ruled out, could cost the specific people whom the AI identifies as potential criminals.