Computer vision, also known as artificial vision, has many applications, from automatically tagging users on social networks to autonomous vehicles, but it also contributes, increasingly, to mass surveillance technology, which threatens the right to privacy and certain freedoms.
A study published in Nature and led by Stanford University shows “the broad links between computer vision and surveillance,” the researchers write.
The team reviewed more than 40,000 research papers and the patents derived from them, concluding that, of all the potential applications of technologies that interpret images, most focus on detecting human beings.
The researchers indicate that the number of computer vision articles contributing to surveillance patents increased fivefold from the 1990s to 2010.
Computer vision gives machines the ability to see: to extract the spatiotemporal structure of images and videos in order to fully interpret a scene. Its applications range from driverless cars and robotics to protein design and climate change modeling.
However, the research indicates that most of these applications focus on detecting human beings for the development of mass surveillance.
Computer vision applications focus on mass surveillance
Examples include technologies for detecting body parts (facial recognition); detecting humans in everyday activities (such as shopping or group events); and analyzing human spaces (homes, streets, and offices).
In a random subset of 100 articles and patents, 90% of the articles and 86% of the patents extracted data related to human beings.
The two leading nations producing studies that have yielded surveillance-related patents are, by a wide margin, the United States and China.
“While the general narrative is that only a small part of computer vision research is harmful, what we find instead is omnipresent and normalized surveillance,” in the words of Abeba Birhane, one of the study’s authors, from Stanford University.
The study also highlights the use of ambiguous or unclear language to normalize and even obscure the existence of surveillance, for example by referring to human beings as objects.
In addition, among other conclusions, it indicates that these pervasive practices threaten fundamental rights such as privacy, freedom of expression, and freedom of movement.
In a commentary accompanying the study, the researcher Jathan Sadowski, based in Australia, writes that “the map they draw of the applications of computer vision research reveals an intimate relationship between academic research and surveillance applications by the military, the police, and for-profit companies.”
With information from EFE.