In a national database in Argentina, tens of thousands of entries detail the names, birthdays and national IDs of suspected criminals.
The database, known as the Consulta Nacional de Rebeldías y Capturas or CONARC, began in 2009 as part of an effort to improve law enforcement for serious crime.
But CONARC comes with some serious problems. First, it is an unprotected spreadsheet file, easily found through a Google search and downloadable by anyone.
Second, many of the alleged offenses for which people are listed are not serious at all, while others are not specified.
Most alarming of all is the age of the youngest alleged offender, identified only as M.G., who is cited for “crimes against persons (malicious) – serious injuries.” M.G. was apparently born on October 17, 2016, which would make him about four years old.
A live facial recognition system for suspects
An investigation by Human Rights Watch found that not only are children regularly added to CONARC, but the database also feeds a live facial recognition system in Buenos Aires, implemented by the city government.
Buenos Aires began testing live facial recognition on April 24, 2019. Implemented without any public consultation, the system met with immediate resistance.
In October, a national civil rights organization filed a lawsuit challenging it. In response, the government drafted a new bill – now making its way through the legislature – to legalize facial recognition in public spaces.
The system was designed to connect to CONARC from the beginning. While CONARC itself contains no photographs of the alleged offenders, it is cross-referenced with photo IDs from the national registry.
The software uses the suspects’ photos to search for matches in real time through the city’s subway cameras. When the system flags a person, it alerts the police to make an arrest.
Risks of the facial recognition system
The system has since led to numerous false arrests, which the police have no protocol for handling.
“There seems to be no mechanism to correct errors in either the algorithm or the database,” said Hye Jung Han, a children’s rights researcher at Human Rights Watch, who led the investigation.
“This is a signal to us that the government has procured a technology that it does not fully understand in terms of all the technical implications and human rights.”
All of this is already deeply worrying, but adding children to the equation makes things much worse. Although the government has publicly denied that CONARC includes minors, Human Rights Watch found at least 166 children listed in various versions of the database between May 2017 and May 2020.
Unlike M.G., most of them are identified by full name, which is illegal: under international human rights law, children accused of a crime must have their privacy protected throughout the proceedings.
Also unlike M.G., most were 16 or 17 years old at the time of entry – although, mysteriously, a few were between the ages of one and three.
Ages are not the only apparent errors in the system. There are conflicting details, and sometimes multiple national IDs listed for the same person. And because children change physically faster than adults, their photo IDs are at greater risk of being out of date.
All of these factors put children at increased risk of being misidentified and falsely arrested. That could create an unwarranted criminal record, with potentially long-term repercussions for their education and employment opportunities. It could also change how they behave.
“The argument that facial recognition has a chilling effect on freedom of expression is amplified for children,” says Han.
“You can imagine that a child who was falsely arrested would heavily self-censor or be careful about how they behave in public. And it is still too early to grasp the long-term psychological impact.”
The Argentine system is not unique
While Buenos Aires is the first city Han has identified as using live facial recognition to track children, she worries that many other examples are hidden from the public.
In debates about these systems, it is easy to forget that children require special consideration. But that is not the only concern, Han adds.
“The fact that these children are under this type of invasive surveillance – the human rights and social implications of this technology are still unknown.”
But what harms children can be expected to end up harming the general public as well.