Ukraine is using facial recognition technology to identify and notify the families of Russian soldiers who have died.

Ukraine’s vice-prime minister told Reuters that the country is using facial recognition software to help identify the remains of Russian soldiers killed in action and to track down their relatives so they can be notified of the deaths.

Mykhailo Fedorov, Ukraine’s vice-prime minister and head of the ministry of digital transformation, told Reuters that his government has been using facial recognition software from Clearview AI to track down the social media accounts of Russian soldiers who had died.

“As a courtesy to the mothers of those soldiers, we are disseminating this information over social media to at least let families know that they’ve lost their sons and to then enable them to come to collect their bodies,” Fedorov said in an interview, speaking via a translator.

Ukraine’s Ministry of Defense began using Clearview’s technology, which matches faces in uploaded photos against photographs scraped from the web, last month. Ukraine’s use of Clearview was first reported by Reuters earlier this month, but at the time it was unclear how the technology would be used.
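
Clearview has not publicly detailed its matching pipeline, but face-identification systems of this kind generally work by converting each detected face into a numeric embedding and comparing embeddings by distance. The sketch below illustrates that general approach using the open-source face_recognition Python library; the filenames and the 0.6 threshold are assumptions for illustration, not Clearview’s actual code or data.

```python
# Minimal sketch of embedding-based face matching (illustrative only; not Clearview's code).
import face_recognition

# One "gallery" face, e.g. a photo scraped from a public profile page (hypothetical file).
gallery_image = face_recognition.load_image_file("profile_photo.jpg")
gallery_encoding = face_recognition.face_encodings(gallery_image)[0]  # 128-dimensional embedding

# Faces found in a newly uploaded photo (hypothetical file).
probe_image = face_recognition.load_image_file("uploaded_photo.jpg")
probe_encodings = face_recognition.face_encodings(probe_image)

for encoding in probe_encodings:
    # Smaller distance means more similar; 0.6 is the library's default tolerance.
    distance = face_recognition.face_distance([gallery_encoding], encoding)[0]
    if distance < 0.6:
        print(f"Possible match (distance={distance:.3f})")
    else:
        print(f"No match (distance={distance:.3f})")
```

At the scale Clearview describes, the same comparison would presumably be run against billions of stored embeddings through an indexed nearest-neighbor search rather than a simple loop.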

Following the Russian invasion, Clearview offered its service to Ukraine for free, saying that its search engine contains more than 2 billion images from VKontakte, a popular Russian social media service. VKontakte did not respond to a request for comment.

Clearview AI, a New York-based software company, has come under fire from consumers and governments all over the world for its privacy abuses.

Italy fined the company €20 million earlier this month for violating EU privacy law and ordered it to delete all data on Italian residents. The UK Information Commissioner’s Office and French authorities had already ordered Clearview AI to stop processing user data.

The company is also fighting a lawsuit in US federal court in Chicago, filed by consumers under the Illinois Biometric Information Privacy Act, which alleges that its collection of photographs from the internet violated their privacy rights.

Clearview has stated that its actions were legal and that face matches should only be used as the first step in investigations.

The technology’s reliability has also been questioned. Studies have found that facial recognition software frequently misidentifies Black and brown faces, contributing to policing bias. Clearview has disputed such claims.

Facial recognition can be unreliable when used to identify the dead, according to Richard Bassed, head of the forensic medicine department at Monash University in Australia; fingerprints, dental records, and DNA remain the most common ways of establishing a person’s identity.

However, obtaining pre-death samples of such data from enemy fighters is difficult, opening the door to novel approaches like facial recognition.

Nonetheless, according to Bassed, who has been developing the technique, cloudy eyes and injured, expressionless faces can render facial recognition useless on the dead.

“If the technology is truly only used for identifying the dead, which I’m quite skeptical of, the biggest risk is misidentification and wrongfully telling people that their loved ones have died,” said Albert Fox Cahn, founder of the privacy advocacy group Surveillance Technology Oversight Project.

The Armed Forces Medical Examiner System in the United States has stated that automated facial recognition is not widely accepted in the forensic field.

Beyond questions of reliability and privacy, Cahn said there are also concerns about what Clearview AI will do with the data it obtains, including “pictures of battlefield casualties.”

“I have no transparency around how that data is used, retained, and shared,” he said. “But it’s hard to imagine a situation where it is harder to enforce any restrictions on the use of biometric tracking than an active warzone. Once the technology is introduced into the conflict for one reason, it will inevitably be used for others. Clearview AI has no safeguards against that sort of misuse of the technology, whether it’s investigating people at checkpoints, interrogations, or even targeted killings.”

In a statement, Clearview said it is ensuring that everyone with access to the technology is properly trained to use it safely and responsibly. “War zones can be dangerous when there is no way to tell apart enemy combatants from civilians. Facial recognition technology can help reduce uncertainty and increase safety in these situations,” the company said.

The company also says studies have shown the software to be free of bias and able to pick the correct face out of a lineup of more than 12 million photographs with 99.85% accuracy.