Can the Bay Area Control the Monitoring Tools It Developed?

By Blair Morris

September 23, 2019

Passersby walk under a surveillance video camera that is part of a facial recognition technology test at Berlin Suedkreuz station in Berlin, Germany. Steffi Loos/Getty

San Francisco just became the first city to ban the use of facial recognition technology by local government agencies. Oakland may be next.

Updated: May 14, 2019. Editor’s Note: On Tuesday, May 14, San Francisco’s Board of Supervisors voted to pass the facial recognition ban.

OAKLAND, Calif. – On the first Thursday of each month, about a dozen people meet in a dim, sepia-toned room in Oakland’s City Hall. This is the city’s Privacy Advisory Commission: a group of volunteers from each Oakland district, a representative apiece from the Oakland Police Department and the Mayor’s office, and the city’s chief privacy officer, Joe DeVries. They gather to discuss a host of issues related to the city’s growing use of surveillance technology: how data is used, stored, and shared. Just who, in this tech-saturated city, is tracking whom?

A predecessor to the Privacy Advisory Commission assembled five years ago, after news surfaced of a planned surveillance center. The federally funded “Domain Awareness Center” was initially intended just for the Oakland port, but city officials proposed expanding it in 2013. The stated purpose of the $11 million expanded project was to fight crime and respond better to potential emergencies. (Oakland had been struggling with a rise in violent crime since 2005.) The proposal sought to blanket the city with video cameras, gunshot detectors, and automated license plate readers so that the actions, movements, and connections of suspects could be tracked, and unwanted incidents preempted. “It’s all about efficiency and automation in the response when it comes to public safety and emergency response,” the city’s then-chief information officer, Ahsan Baig, told GovTech.

But privacy activists raised the alarm about mass surveillance, particularly highlighting the risk that this technology would discriminate against communities of color in poor neighborhoods. Their concerns were later vindicated when emails revealed that the real purpose of the initiative was spying on protesters, and the scope of the project was rolled back.

In the wake of the DAC debate, Oakland’s City Council assembled an ad hoc committee to set strict data-retention policies and best practices for the center to follow. That committee concluded that what the city really needed was something more permanent: The technology was bound to evolve, and its uses to multiply.

Oakland’s commission isn’t the only formal community privacy board in the nation (Seattle has one, too), but it may be the most aggressive. Under the leadership of local privacy advocate Brian Hofer, it has helped pass legislation regulating surveillance in the region, including a transparency ordinance governing the conditions under which the Oakland Police Department cooperates with the FBI, and rules requiring “technology impact reports” on each new piece of smart-city surveillance gear the city rolls out.

May’s meeting had a full agenda, and the mood was especially charged. Members discussed the final draft of city-wide Privacy Principles, developed by Berkeley Law’s Samuelson Law, Technology, and Public Policy Center. If approved by the City Council next month, this document is meant to serve as a lodestar for the city as it tries to maximize both transparency and privacy, interests that Erik Stallman, the associate director of the Samuelson Center, said at the meeting he believes are not “irreconcilable.”

But the commission was also there to discuss another potential privacy milestone: Thanks to the Privacy Advisory Commission’s work, Oakland and San Francisco were steps away from becoming the first U.S. cities to ban facial recognition technology, an increasingly common and powerful tool that allows computers to identify faces in photos and video footage and match them against law enforcement databases.

Under a Surveillance and Community Safety Ordinance passed last year, new types of surveillance technology now must go through a public approval process before being deployed in Oakland. But an amendment proposed by Hofer and commission member Heather Patterson would also specifically prohibit city departments, and any public entities that have data-sharing agreements with the city, from using facial recognition technology or the information captured through it. (The amendment does not ban private use of this technology, however, so private landlords can still deploy it in a residential or commercial building.) If the city’s Public Safety Committee approves the amendment this month, the City Council will vote on it in early June.

San Francisco’s ban mirrors Oakland’s: It’s bundled into the city’s proposed Stop Secret Surveillance Ordinance, introduced by supervisor Aaron Peskin in January. It passed today.

Right now, neither city’s police department is using facial recognition technology; San Francisco stopped testing it in 2017. But preemptively banning the technology fits into a broader effort to establish San Francisco and Oakland as “digital sanctuaries”: places where government data collection and sharing are underpinned by transparency, accountability, and equity. Facial recognition technology is especially pernicious, privacy advocates say, because it can do something unprecedented: turn one’s visual presence into potentially incriminating information that lands someone, without their knowledge, in a database used by law enforcement.

“Facial recognition is the perfect tool and system for oppression of rights and liberties that Oaklanders treasure,” said Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California, during public comment in support of the amendment at the May meeting. “The harms are disproportionately experienced by people of color, immigrants, and other communities that are historically targeted by the government for oppression.”

Because public places where crowds gather under cameras, whether to celebrate or protest or simply exist, are such fertile ground for collecting this information, the fight over facial recognition is a uniquely urban one. And the Bay Area, says Hofer, has a special responsibility to lead the effort to regulate it.

“We’re where all this technology is being developed,” he said. “I think it’s important that, in the home of technology, we establish these rules and standards.”


In 2001, the city of Tampa secretly used video cameras equipped with facial scanning software on a stadium packed with Super Bowl spectators. The system compared the faces of fans against a mugshot database, revealing that 19 people in the crowd of 100,000 had outstanding warrants for misdemeanors.

Such automated, computer-assisted technology for identifying and verifying criminal suspects had been in development since the 1960s. But the 2001 incident was the first time many people realized that a computer’s ability to summon names for human faces was no longer in the realm of science fiction. And plenty were not pleased about it: When reports of the incident became public, the ACLU penned a letter to Tampa city leaders, demanding a full explanation for the “Snooper Bowl.”

This would prove to be the beginning of an ongoing series of disputes between civil liberties advocates and law enforcement. As facial recognition technology improved, authorities touted its many public safety applications: the ability to detect passport and other forms of identity fraud, identify active shooters and uncooperative suspects, aid forensic investigations, and help find missing children.

In June, for instance, Anne Arundel County police used facial recognition to rapidly identify the man who killed five journalists at the Capital Gazette newspaper in Annapolis, Maryland. With the shooter in custody but uncooperative, authorities compared his photo against driver’s license images and mug shots in the Maryland Image Repository System. “The Facial Recognition System performed as designed,” Stephen T. Moyer, Maryland’s public safety secretary, said in a statement to the Baltimore Sun. “It has been and continues to be a valuable tool for fighting crime in our state.”

Facial recognition also has applications beyond the world of crimefighting, advocates say. It can help blind people recognize the expressions of those they interact with, and help historians identify soldiers who died in the Civil War.

But privacy advocates caution that the tool can be used to target vulnerable populations. Authorities in China, for instance, use it to profile and control the Uighur Muslim minority. In the U.S., researchers found that the federal government has already been testing facial recognition technology using images from child pornography cases, visa applications, and pre-conviction mug shots, without permission or notification. Police in Baltimore faced criticism from civil liberties groups after using facial recognition to identify and monitor protesters after the death of Freddie Gray in 2015.

In 2016, the Center on Privacy & Technology (CPT) at Georgetown Law released a report outlining the extent to which facial recognition had proliferated across the country. The report found that almost 30 states allow law enforcement to scan driver’s license and ID databases. “Roughly one in two American adults has their photos searched this way,” the authors wrote. In total, around 117 million adults are affected by this technology; they’re part of “a virtual, perpetual line-up,” the report says, where the identifying eyewitness is an algorithm.

Using biometric data (measurements or maps of the unique features of the human body) to find people isn’t new, of course. The first national fingerprint database in the U.S. was assembled by FBI director J. Edgar Hoover in the 1920s; in the 1990s, the Bureau’s Combined DNA Index System, or CODIS, allowed law enforcement to use DNA profiles to help identify and locate offenders. But the scope and reach of facial recognition go further than these databases, which track only a relatively small segment of the population. You can’t covertly fingerprint or collect DNA from huge groups of people from afar, the way you can with facial recognition. As surveillance cameras capture people’s faces, law enforcement can build an intimate, real-time picture of their movements around the city.

“This technology lets police do something they really have never been able to do before,” said Alvaro Bedoya, the founding director of Georgetown Law’s CPT and co-author of the 2016 report. “People need to understand that this is not normal.”

It’s not just that this technology has a greater reach. Research has shown that facial recognition software is prone to error when identifying women and ethnic minorities. In 2018, researchers at MIT tested three kinds of AI-based facial recognition technology and found an error rate of 0.8 percent when identifying white men, but 34.7 percent for black women.

“You’ve already seen issues with police and their use of force,” said Sameena Usman, the government relations coordinator for the Council on American-Islamic Relations, during public comment at the privacy commission’s May meeting. “To have this kind of facial recognition software that is inaccurate could lead to these kinds of wrongful deaths.”

Still, police departments across the country are eagerly adopting the technology, which is being marketed by companies such as Amazon. That company’s Rekognition software is being piloted in Florida and Oregon, despite questions about its accuracy. (The ACLU used a version of Rekognition to scan photos of the 535 members of Congress against 25,000 public mugshots, and got 28 false hits.) Amazon is also pushing ICE to buy Rekognition.

Local leaders, particularly in cities wrestling with crime problems, are often taken with the idea that facial recognition-powered surveillance will be a boon for public safety. “In Oakland, we do have a good amount of crime,” Noel Gallo, a City Council member, told the San Francisco Chronicle. “It’s just a form of public safety given that I don’t have the police staffing necessary to protect our children and families. If you do the crime, you should also be clearly identified.”

Though the commission approved the facial recognition amendment unanimously, the tension between security and privacy sparked debate at the May meeting. Oakland Police Department representative Bruce Stoffmacher had been tasked with reviewing the department’s use of remote, live-streaming cameras and recommending in what circumstances they could be deployed under Oakland’s broader surveillance ordinance. One draft of the exemptions suggested officers should only be able to use the devices for undercover operations, or at public events expected to draw 10,000 or more people and national attention.

But Stoffmacher also laid out circumstances in which the police would want to use these devices at smaller gatherings, such as Golden State Warriors victory parades. “It’s still considered a large enough event where OPD is going to try to have officers observing. You might be live-streaming that either to the emergency operations center or to the police headquarters building,” he said. “I don’t think it’s supportive of the nature of policing to be able to say this is an exact number.”

Others argued that any cameras deployed by police at events large or small would have a chilling effect on protected free speech and assembly. “There’s nothing benign about a police officer holding a camera at a public event,” said Henry Gage III, an Oakland-based attorney and a member of the Coalition for Police Accountability, during public comment at the meeting. “That’s essentially political violence by another name.”

Part of the tension could be eased with clearly articulated use cases, Hofer said: If the police have intelligence that a protest or gathering will include a group with a history of violence, such as the white supremacist Proud Boys, perhaps the technology would be useful.

“The police have a duty to ensure public safety, and they have to be able to operate in some manner in order to accomplish that,” he said. “And so some of this is going to require a really fine line.”


In a sense, this technology is already out of the box, and a facial recognition ban such as Oakland’s or San Francisco’s won’t be able to stuff it back in. Images of people in public can still be shot, collected, and stored; potentially, they can simply be sent elsewhere to be analyzed.

“Facial recognition doesn’t require specific cameras to enable it. It’s aided by them, but doesn’t require them,” said Mike Katz-Lacabe, a member of Oakland Privacy. “As long as you have the cameras, you can take that video and send it elsewhere to analyze it.”

This scenario regularly plays out in other realms of smart surveillance.

Last year, the local accountability nonprofit Oakland Privacy found that Bay Area Rapid Transit was recording license plate information from California drivers and keeping the records in a database that was also accessible to Immigration and Customs Enforcement. That gave ICE access to 57,000 plate records, allowing the agency to track the movements of potentially undocumented immigrants and anyone who comes into contact with them.

The worry is not just that ICE has been deporting immigrants without criminal records at record levels. It is that the Department of Homeland Security (DHS) is also keeping tabs on family members, supporters, and advocates. The Intercept recently found that DHS has been monitoring civilians who are protesting the administration’s practice of separating migrant families.

In that context, BART’s transit data collection could have had more insidious ramifications, just as live-streaming a demonstration might. “It may be that a local police department doesn’t really have anything malicious in mind, but if they collect all of the images and make them available, other agencies can get ahold of that information and use it for any purpose they want,” said Tracy Rosenberg, an organizer with Oakland Privacy and the director of the Oakland technology organization Media Alliance. “And those purposes aren’t always consistent with what they thought.”

These convoluted surveillance supply chains complicate enforcement of facial recognition bans like Oakland’s, says Steve Trush, a research fellow at UC Berkeley’s Center for Long-Term Cybersecurity and a board member of Secure Justice, the nonprofit Hofer chairs. Even if Oakland’s and San Francisco’s city departments abide by the bans, they’re surrounded by counties and states that don’t. “I can imagine some of that friction could be resolved if enough cities in the county agree to ban facial recognition,” Trush said. “Then maybe you could see it’s time for the county to consider banning it, and maybe the state agrees to an outright ban.”

What’s more, facial recognition technology isn’t just in the hands of official bodies like governments and law enforcement: It’s on your neighbor’s front door. “Smart” video doorbell devices such as those made by Ring and Nest have created an entire army of citizen-recorders, who can potentially ping police with footage of passersby they find suspicious. As these devices become recognition-enabled, it will be challenging for Oakland, or any city, to ensure that the technology was not used somewhere in the data-collection process when it receives intelligence.

With more regulation and pressure from consumers, Trush says, perhaps technology companies will choose to build their products without facial recognition technology, or to develop alternatives. Already, campaigns launched from within tech companies like Microsoft and Amazon have raised ethical concerns about certain projects and contracts; in response, those companies have walked back some efforts.

Ultimately, the emergence of local and state policies on this technology is Hofer’s dream, too. “I do expect this to spread,” he said. “We cracked the door open and are letting others take part.”

But he and other advocates also acknowledge the difficulties involved in containing the spread of smart technology that threatens privacy. That’s why the commission is also pushing to pass Oakland’s guiding Privacy Principles, which can inform future policy-making.

“What we wanted was, instead of technology-specific legislation, a process that could handle whatever came down the pike,” said Rosenberg. “Because whatever was troubling in 2015 is going to be different from what’s troubling in 2019 or 2023. You don’t want to create a nine-headed-hydra situation where you put in regulation that’s technology-specific, and then the technology shifts and it’s like nothing ever happened.”
