Facial recognition tech ‘faulty’


WASHINGTON: Facial recognition systems can produce wildly inaccurate results, especially for non-whites, according to a US government study released on Thursday that is likely to raise fresh doubts about the deployment of the artificial intelligence technology.


The study of dozens of facial recognition algorithms showed “false positive” rates for Asian and African American faces as much as 100 times higher than for white faces.


The researchers from the National Institute of Standards and Technology (NIST), a government research centre, also found two algorithms assigned the wrong gender to black females almost 35 per cent of the time.


The study comes amid widespread deployment of facial recognition for law enforcement, airports, border security, banking, retailing, schools and for personal technology such as unlocking smartphones.


Some activists and researchers have claimed that the potential for errors is too great, that mistakes could result in the jailing of innocent people, and that the technology could be used to create databases that may be hacked or inappropriately used.


The NIST study found both “false positives,” in which an individual is mistakenly identified, and “false negatives,” where the algorithm fails to accurately match a face to a specific person in a database.


“A false negative might be merely an inconvenience — you can’t get into your phone, but the issue can usually be remediated by a second attempt,” said lead researcher Patrick Grother.


“But a false positive in a one-to-many search puts an incorrect match on a list of candidates that warrant further scrutiny.”
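

The distinction can be made concrete with a minimal sketch, assuming a hypothetical Python matcher: a one-to-many search scores a probe image against every enrolled identity and returns those whose similarity clears a threshold. The gallery names, the scores and the 0.80 threshold below are invented for illustration and are not taken from the NIST evaluation.


# Illustrative sketch only, not NIST's benchmark code: a one-to-many search
# compares a probe image against every enrolled identity and keeps those
# whose similarity score clears a threshold. All values are hypothetical.

GALLERY = ["alice", "bob", "carol"]   # enrolled identities
THRESHOLD = 0.80                      # hypothetical decision threshold

def one_to_many_search(similarity_scores):
    """Return every enrolled identity whose score meets the threshold."""
    return [name for name, score in zip(GALLERY, similarity_scores)
            if score >= THRESHOLD]

# The probe really is "alice", but her score falls short: a false negative.
print(one_to_many_search([0.75, 0.40, 0.30]))   # -> []

# The probe is not enrolled at all, yet "bob" clears the threshold:
# a false positive that puts him on a candidate list for further scrutiny.
print(one_to_many_search([0.20, 0.85, 0.10]))   # -> ['bob']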


The study found US-developed face recognition systems had higher error rates for Asian, African American and Native American groups, with the American Indian demographic showing the highest rates of false positives. — AFP

