Facial recognition systems continue to generate controversy, given their potential use by law enforcement and other government agencies. For over a decade, the Department of Commerce's National Institute of Standards and Technology (NIST) has evaluated facial recognition technology to identify and report gaps in its capabilities. Its most recent report, published in 2019, quantified the effects of age, race, and sex on facial recognition accuracy.
The greatest discrepancies NIST measured were higher false-positive rates for women, African Americans, and particularly African American women. The report noted, “False positives might present a security concern to the system owner, as they may allow access to impostors. False positives also might present privacy and civil rights and civil liberties concerns such as when matches result in additional questioning, surveillance, errors in benefit adjudication, or loss of liberty.”
Continue reading “A Third-Way Approach to Regulating Facial Recognition Systems”