Here's a map of more than 550 businesses that take part in the City of Detroit's Project Green Light program. Under Project Green Light, launched in 2016, participating businesses pay for surveillance cameras to be installed and monitored by the Detroit Police Department for use in criminal investigations.
The camera system has been used to catch suspects in violent crimes, theoretically gives partners priority 911 service and is meant to be a crime deterrent.
Mayor Mike Duggan touted the program's success in his State of the City speech in March, pledging to expand the system with cameras at public intersections. City officials have tried to drum up resident support for the "Neighborhood Real-Time Intelligence Program" through community groups, according to the Free Press.
But the DPD surveillance program has come under increased scrutiny, with new attention to a previously little-known element: its pioneering facial recognition capabilities. In the "America Under Watch" report released last month, researchers with the Georgetown Law Center on Privacy & Technology raise questions about ramifications for free speech, privacy and racial bias in policing. The report focuses on Detroit and Chicago as the two cities where law enforcement agencies have gone the furthest to deploy facial recognition software.
That report led to two hearings held by the U.S. House Committee on Oversight and Reform on May 22 and June 4, where Republicans and Democrats alike condemned the technology.
"With little to no input, the city of Detroit created one of the nation's most pervasive and sophisticated surveillance networks," Rep. Rashida Tlaib said at the May 22 hearing. "Now we have for-profit companies pushing so-called technology that has never been tested in communities of color, let alone been studied enough to conclude that it makes our communities safer."
Republicans including Michigan Rep. Justin Amash also shared concerns. Even one of the law enforcement experts seemed to conclude by the end of the hearing that the implications were more alarming than he had thought, Michigan Radio noted. Multiple committee members called for a moratorium on the technology's use, according to the Metro Times.
So what exactly is this technology, and how is Detroit using it? It's not exactly "Minority Report," but the implications might concern you.
In 2017, Detroit signed a $1.05 million, three-year contract with DataWorks Plus for its facial recognition surveillance system, which "provides continuous screening and monitoring of live video streams."
According to the America Under Watch report, the contract includes:
- "investigative face recognition software and an application that enables an unlimited number of DPD officers to run face recognition searches on their mobile devices. All three face recognition capabilities are configured to compare unknown faces in photo or video against Detroit's database of 500,000 mug shot photos. DPD's Crime Intelligence Unit is additionally authorized to run face recognition searches on Michigan's Statewide Network of Agency Photos (SNAP), a database that includes state driver's license photos."
Report authors Clare Garvie and Laura M. Moy note that while many of the Project Green Light partners are late-night businesses like gas stations and liquor stores, a number of community centers, including schools, churches, clinics and apartment complexes, have also signed up. They argue that these locations raise the question of whether the program runs the risk of violating residents' constitutional rights, since attending them "reveals deeply personal information about a resident's 'religious, political, or social views or activities' or 'participation in particular noncriminal organization or lawful event.'"
"While these activities may occur in public, most of us do not expect to be sharing our attendance at a church service or an addiction treatment center with law enforcement," they write.
The DPD insists it doesn't use the facial recognition software in real time, or in any way that violates constitutional rights. Police department policy for facial recognition states that it "…will not violate First, Fourth, and Fourteenth Amendments and will not perform or request face recognition searches about individuals or organizations based solely on their religious, political, or social views or activities; their participation in particular noncriminal organization or lawful event; or their races, ethnicities, citizenship, places of origin, ages, disabilities, genders, gender identities, sexual orientations, or other classification protected by law."
Though subject to stricter guidelines, DPD policy does allow officers to collect images of people's faces at First Amendment-protected gatherings as part of criminal investigations.
In response to the Georgetown report, Police Chief James Craig wrote a scathing letter condemning its tone for leading "many readers to believe that DPD is engaged in Orwellian activities causing them fear and consternation that their constitutional rights are routinely being violated."
Craig went on to insist that citizens are not regularly being monitored, and that the Real Time Video Feed Facial Recognition name isn't an accurate description.
While video feeds are monitored in real time, he wrote, "if there is an articulable, reasonable suspicion that an individual is observed or reported to have committed a crime, only then is their still image provided for analysis with the facial recognition program."
However, part of the researchers' point is that it's worrying on its own that law enforcement has a greater capacity to violate constitutional rights with little public oversight, even if the technology isn't being put to use that way. And yes, the police say it's not being used in real time, but that's not the only troubling part of facial recognition.
New York Times tech columnist Farhad Manjoo made a similar case in a column about San Francisco's decision to ban facial recognition technology, saying he worries "that we're stumbling dumbly into a surveillance state," and "the only reasonable thing to do about smart cameras now is to put a stop to them."
Garvie told Manjoo that face recognition software gives police the new ability to "conduct biometric surveillance," comparing it to taking "mass fingerprint scans of a group of people in secret."
Manjoo continued with a look at one of the sloppy ways facial recognition has been used: in New York, when the face-scanning system didn't find a match to a security camera image of a suspect, police determined the man looked like actor Woody Harrelson, then ran a Google image of the celeb through the software to find a match, leading to an arrest. The program's errors fit a more familiar pattern when it comes to race, as these technologies consistently do a worse job of identifying individuals of color, leading to misidentifications.
Back in 2018, the American Civil Liberties Union of Michigan sounded the alarm about Project Green Light, arguing that the constant surveillance violates privacy without significant evidence that these programs stop crime. For now, though, businesses still seem to be buying into the city's line on safety through surveillance. Three new ones signed up just this month. –Kate Abbey-Lambertz