As California legislators consider a bill that would restart police use of facial recognition, details have emerged about another innocent man wrongly jailed by officers who used AI to find suspects.
The New York Times and other news outlets are reporting on a man who was wrongly jailed in his home state of Georgia for the actions of a man captured on camera committing a crime in Louisiana, hundreds of miles away.
(Randal Quran Reid was released after five days in jail. Reid has a mole on his face that the matching software missed, and he weighs notably less than the man in the video.)
A three-year ban on the use of facial recognition by local and state law enforcement agencies in California expired with little notice in January.
Though temporary, the ban was seen as a model for officials who have not taken even the basic steps that privacy advocates say should be mandated and in place before algorithms are used, with or without human intervention.
Those steps include radical transparency, relevant legal standards, regimented training and trust-building community programs, among others.
One of the more surprising elements of the California story is that the Assembly member who co-sponsored the expired law, Democrat Phil Ting, is now pushing for a law that would allow facial recognition in law enforcement, subject to certain rules.
One of the bills circulating in the state capital, AB 642, tries to put a friendly face on the technology and the practice. Another bill, AB 1034, would also allow AI-assisted identification so long as body-worn cameras are not involved. That bill would be effective until 2034.
An article in the trade publication Government Technology quotes a Public Safety Committee member and former police sergeant as saying people should not expect a right to privacy.
Powerful as that statement is, it does not address mistaken identifications like the one examined at length in the New York Times.
It draws a sobering picture of law enforcement and judicial systems. Judges, detectives and officers appear to be growing detached from the real – and expensive – human costs of handing so much control over citizens to software.