New Orleans faces criticism over secret facial recognition surveillance

Alexandra Reeve Givens, President & CEO at Center for Democracy & Technology | Official website


Residents of New Orleans have unknowingly been part of a mass surveillance program. The Washington Post revealed that the New Orleans Police Department (NOPD), in collaboration with Project NOLA, used an untargeted facial recognition system. The practice bypassed city law and raises civil rights concerns, given the documented inaccuracies of such systems.

These technologies risk misidentifying individuals, which can lead to wrongful arrests. Data from similar systems shows high error rates, underscoring the threat to civil liberties.

After two years of operation without public knowledge, the city paused the program following public backlash. Despite this, some city officials are considering amending laws to legalize the technology. Critics argue that legitimizing an unvetted system would be a mistake.

Untargeted facial recognition scans every face in a camera's view against a database; targeted systems, by contrast, search for specific individuals using high-quality reference images. Untargeted systems typically run automatically and are prone to errors from environmental factors such as poor lighting and awkward camera angles.
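The structural difference matters. Here is a minimal sketch, in Python, of the two matching modes described above; the embedding-and-threshold pipeline, the toy vectors, and the 0.95 threshold are all illustrative assumptions, not details of Project NOLA's or any vendor's actual system:

```python
import math

# Illustrative sketch only: toy vectors and thresholds, not any real system.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norms

# Toy face "embeddings" (real systems use high-dimensional vectors).
database = {
    "suspect_A": [0.9, 0.1, 0.3],
    "suspect_B": [0.2, 0.8, 0.5],
}

def targeted_match(probe, person_id, threshold=0.95):
    """One-to-one: compare a high-quality probe image of a specific,
    named individual against that person's database entry."""
    return cosine_similarity(probe, database[person_id]) >= threshold

def untargeted_scan(faces_in_frame, threshold=0.95):
    """One-to-many: compare EVERY face a camera captures against the
    whole database and alert on any score above the threshold."""
    alerts = []
    for i, face in enumerate(faces_in_frame):
        for person_id, reference in database.items():
            if cosine_similarity(face, reference) >= threshold:
                alerts.append((i, person_id))
    return alerts

# A street camera sees three passers-by; none is in the database, but
# one blurry capture happens to land close to suspect_A's embedding.
frame = [[0.88, 0.12, 0.31], [0.1, 0.2, 0.9], [0.5, 0.5, 0.5]]
print(untargeted_scan(frame))  # [(0, 'suspect_A')] -- a false alert
```

Because the untargeted mode multiplies comparisons (every passing face against every database entry), even a small per-comparison error rate produces frequent false alerts at street scale.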

Real-world deployments bear out these failures. In the UK, South Wales Police reported a 91% false positive rate at public events, and London's Metropolitan Police found that only 10 of 27 alerts over several years were accurate.
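A quick back-of-the-envelope calculation makes the Metropolitan Police figure concrete (the 10-of-27 numbers come from the reporting cited above; the code is purely illustrative):

```python
# Arithmetic behind the Metropolitan Police figure: 10 accurate alerts of 27.
accurate, total = 10, 27
precision = accurate / total
print(f"alerts that were correct: {precision:.0%}")      # ~37%
print(f"alerts that were wrong:   {1 - precision:.0%}")  # ~63%
```

In other words, roughly two out of every three alerts pointed at the wrong person.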

The U.S. National Institute of Standards and Technology has found that error rates can exceed 20% with poor-quality images even in targeted matching scenarios, casting further doubt on the reliability of untargeted systems.

Bias in facial recognition disproportionately affects marginalized communities. Moreover, the controlled environments in which these technologies are tested often fail to reflect real-world conditions, contributing to their unreliability in the field.

Project NOLA and the NOPD have provided no evidence of their system's accuracy or fairness. There is no transparency about how often matches occur or how many errors have led to wrongful action by law enforcement.

Even with perfect accuracy, mass surveillance threatens privacy and freedom by enabling constant monitoring of civilians' activities.

The situation in New Orleans illustrates the dangers of deploying untested AI technologies on the public. Critics urge abandoning the program instead of altering laws to legitimize it.
