A group of professors, researchers and professionals has written an open letter to the Spanish government requesting the creation of an investigative commission and a moratorium on the use of facial recognition and analysis systems by public and private companies, at least until the Spanish parliament and the European legislative institutions debate whether, and in what way, they should be allowed.
RENFE recently issued a tender to develop a facial recognition and analysis system that, among other things, was meant to identify a passenger's gender, ethnicity and even emotional state. Its image processing was also expected to detect "fights" or "anti-social attitudes."
The railway operator eventually withdrew the tender, but it is just one example of how these systems are beginning to spread through passenger transport, security, education, work, health care, leisure and other settings. Their growing use worries many experts.
As many as 70 professors, researchers and professionals in philosophy, computer science and the social sciences have signed the petition, which remains open to further signatories. It asks the Spanish government to set up an investigative commission to study the need to suspend the use and commercialization of facial recognition and analysis systems by public and private companies.
This moratorium would remain in place until the Cortes Generales (the Spanish parliament) and the European legislative institutions debate in what way, under what conditions, with what guarantees and for what purposes the use of these systems should be permitted, if at all, given their "potentially harmful effects on the well-being, interests, and fundamental needs and rights of the Spanish population."
The signatories asked the government to "intervene quickly on these systems before they continue to expand and become de facto standards," despite the fact that they entail intrusions into the private sphere without people's explicit consent.
"At stake," they added, "are fundamental questions of social justice, human dignity, equity, equal treatment and inclusion. Systems that recognize and analyze images of people (their faces, gestures, hairstyles, postures and body movements, clothing, and skin texture and/or color), together with the machine learning algorithms that support such computation at scale, have serious problems. These problems have been extensively documented and debated by the scientific community as well as by governmental and private bodies."
Five main problems
The signatories highlighted five difficulties. The first is that associating a person with a particular characteristic or tendency (usually through a personal score) on the basis of demographic data is "deeply problematic," especially when the system makes operational decisions that affect individuals based on predictions that are only valid at the group level.
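A minimal sketch of that statistical point, using made-up base rates (the 30% and 10% figures are purely illustrative and do not come from the letter): a rule that flags everyone in the statistically "higher-risk" group is wrong about most of the individuals in it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical base rates (illustrative only): a trait occurs in
# 30% of group A and 10% of group B.
group_a = rng.random(10_000) < 0.30   # True = this individual has the trait
group_b = rng.random(10_000) < 0.10

# A group-level decision rule: flag everyone in the "higher-risk" group A.
flagged_a = np.ones(10_000, dtype=bool)
flagged_b = np.zeros(10_000, dtype=bool)

# Individual-level consequences of that group-level prediction:
print(f"group A members wrongly flagged:  {np.mean(flagged_a & ~group_a):.0%}")
print(f"group B members wrongly cleared:  {np.mean(~flagged_b & group_b):.0%}")
```

Even though group A genuinely has the higher base rate, roughly 70% of its members are flagged incorrectly, which is exactly the gap between a group-level statistic and an individual decision.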
Another shortcoming is that there is no accepted scientific model in psychology, anthropology or sociology indicating that the shape of a nose, a particular facial muscle or a person's gait is sufficient to predict their future behavior. The probability of committing a crime, a person's academic record or engineering knowledge, or their ability to perform well in a given job does not depend on any of the variables that facial recognition methods collect and analyze in order to classify people and make decisions.
These systems are also considered black boxes because of their opacity: it is difficult to know how they reach their decisions and what criteria they rely on. "Although they could technically and theoretically be made more transparent, current designs are not conceived to allow the kind of accountability that a democratic society requires."
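The transparency contrast can be shown on entirely synthetic data (a toy sketch, not a claim about any deployed system): a linear model exposes the exact weight it gives each input, while even a small neural network spreads its "criteria" across thousands of coupled parameters.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data; nothing face-related is implied.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)

# An interpretable model exposes the exact weight behind each decision...
linear = LogisticRegression().fit(X, y)
print("per-feature weights:", linear.coef_.round(2))

# ...while even a small neural network buries its "criteria" in
# thousands of coupled parameters with no individual meaning.
mlp = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=1000,
                    random_state=0).fit(X, y)
n_params = sum(w.size for w in mlp.coefs_) + sum(b.size for b in mlp.intercepts_)
print("neural net parameters:", n_params)
```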
A boarding gate with a biometric facial scanner at Atlanta airport, United States. / John Paul Van Wert / Rank Studios 2018
In addition, these systems are not very robust: the quality of their results depends heavily on contextual factors, which can lead to false positives and false negatives. For example, if real-world lighting conditions differ from those used during training, the results may be wrong.
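That failure mode is easy to reproduce with synthetic "images" reduced to a single brightness feature (all numbers here are assumptions chosen for the demo): a classifier trained under one lighting condition can fall to chance level when the lighting shifts.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_faces(n, brightness):
    """Toy 'images' with one feature: overall pixel intensity.
    Class 1 faces are intrinsically slightly brighter than class 0."""
    y = rng.integers(0, 2, n)
    x = brightness + 0.2 * y + rng.normal(0, 0.05, n)
    return x.reshape(-1, 1), y

# Train under studio lighting (brightness ~ 1.0)...
X_train, y_train = make_faces(2_000, brightness=1.0)
clf = LogisticRegression().fit(X_train, y_train)

# ...then deploy under dimmer, real-world lighting (brightness ~ 0.8).
X_same, y_same = make_faces(2_000, brightness=1.0)
X_dim, y_dim = make_faces(2_000, brightness=0.8)
print("accuracy, training lighting:", clf.score(X_same, y_same))
print("accuracy, dim lighting:     ", clf.score(X_dim, y_dim))
```

The model learned an absolute brightness threshold, so a uniform drop in illumination pushes almost every image to the wrong side of it.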
Finally, "sample bias seriously affects the quality of the system's predictions when the different groups in the training data are not equally represented." Dermatology applications, for example, have shown very promising prediction success rates for light-skinned people, but perform worse when the skin is darker.
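The mechanism behind that gap can be reproduced in a few lines on synthetic data (the 95/5 split and all other numbers are assumptions for illustration): a model fitted mostly on one group can score well on that group while performing near chance on the under-represented one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, shift):
    """Toy data: the class signal (1.5 * y) sits on top of a group-specific
    offset, mimicking groups whose images have different statistics."""
    y = rng.integers(0, 2, n)
    x = shift + 1.5 * y + rng.normal(0, 0.5, n)
    return x.reshape(-1, 1), y

# Training set: group A heavily over-represented (95% vs 5%).
Xa, ya = make_group(1_900, shift=0.0)
Xb, yb = make_group(100, shift=2.0)
clf = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate each group separately on fresh samples.
Xa_t, ya_t = make_group(2_000, shift=0.0)
Xb_t, yb_t = make_group(2_000, shift=2.0)
print("accuracy on over-represented group A: ", clf.score(Xa_t, ya_t))
print("accuracy on under-represented group B:", clf.score(Xb_t, yb_t))
```

The decision boundary is fitted almost entirely to group A's feature distribution, so group B's examples land on the wrong side of it about half the time.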
The possible benefits will not outweigh the negative effects
For all these reasons, the signatories conclude: "Given the serious deficiencies and risks these systems present, the potential benefits they might offer do not in any way offset their potential negative effects, especially for the groups and collectives that most often suffer unjust and discriminatory treatment: among others, women, LGTBIQ+ people, racialized people, migrants, people with disabilities, and people at risk of poverty and social exclusion."