Advocates want guardrails on facial recognition technology

Lawmakers are considering legislation that would place safeguards on the use of facial recognition software by law enforcement. (AP, File)

Police would need a warrant to use facial recognition software while investigating serious crimes, and state police would be tasked with centralizing all law enforcement searches, under legislation supporters argue reduces bias and puts guardrails on the use of the technology.

A bill filed by Reps. Orlando Ramos and Dave Rogers and Sen. Cynthia Creem allows police to use facial recognition in “emergency” situations without a warrant and calls for individuals charged with a crime who were identified using the technology to be provided notice that they were subject to a search.

Creem, a Democrat from Newton who serves as Senate majority leader, said facial recognition technology “is dangerous, both in its ability to facilitate government surveillance and its track record of misidentifying people in criminal investigations.”

“Unfortunately, this technology is currently being used by our law enforcement agencies without the necessary safeguards to make sure our privacy and due processes (are) protected,” Creem told the Judiciary Committee at a Tuesday hearing inside the State House.

Both the Senate and House versions of the bill are based on the March 2022 recommendations of a special legislative commission that called for police to be able to use facial recognition for serious crimes with “safeguards to guarantee civil rights and due process,” the ACLU of Massachusetts said in a summary of the proposal.

A version of the bill cleared the Judiciary Committee last year and received initial approval in the Senate. The House voted 149-4 to add a Ramos amendment on facial recognition that mirrored this session’s bill to a $5.3 billion bill that later passed the chamber.

But the language did not make it to former Gov. Charlie Baker’s desk by the end of the legislative session.

ACLU of Massachusetts Technology for Liberty Project Director Kade Crockford said Montana recently passed legislation that was largely based on the work the Massachusetts commission put forward.

“It would be a real shame if we didn’t benefit from all that hard work and enact these recommendations and put them into law here,” Crockford said at the Judiciary Committee hearing.

ACLU of Massachusetts Racial Justice Program Director Traci Griffith said studies often show that facial recognition software harbors racial, gender, and age biases. Research conducted in Massachusetts by Dr. Joy Buolamwini of MIT’s Media Lab found facial analysis algorithms misclassified Black women as much as 33% of the time, Griffith said.

“She was experimenting with various off-the-shelf facial recognition tools, and noticed that the systems could not recognize her face. It was only when she literally put on a white mask that the technology recognized her existence,” Griffith said of Buolamwini’s work at the committee hearing.

UMass Amherst Computer Sciences Professor Erik Learned-Miller said if facial recognition software is used on high-quality photos like a passport or driver’s license picture, it “may be very accurate.”

But if the images are grainy or poorly lit like in surveillance footage, the system’s “accuracy might be terrible,” he said.

“They are simply not accurate enough in many scenarios to be trusted as more than an investigative lead,” he said during the hearing.
