Here’s why San Francisco’s vote to ban facial-recognition tech matters
By Blair Morris
June 18, 2019
San Francisco just voted to ban facial-recognition technology.
The city that has for many come to represent the power of tech, in all its splendor and horror, took an important step on Tuesday to rein in some of that power. The city’s Board of Supervisors voted 8 to 1, a veto-proof majority, to approve a sweeping ordinance that broadly regulates surveillance technology and outright bans the city government’s use of facial-recognition tech for surveillance.
While the ban has not yet technically become law (the ordinance returns before the Supervisors on May 21, and then Mayor London Breed must sign it), its backers are confident that, having cleared this first hurdle, the measure’s success is essentially assured.
This is a huge deal, and not just for San Francisco. Experts who spoke to Mashable explained that the passage of such a measure, even in a city painted in the popular consciousness with a broad progressive brush, means that other local and state governments aren’t far behind.
“I think a ban will send a very strong statement within the national conversation about the potential harms associated with [facial-recognition technology],” Sarita Yardi Schoenebeck, associate professor at the University of Michigan’s School of Information, explained over email. “It’s likely it would encourage other communities to slow down and carefully consider the role of FRT in their communities.”
As San Francisco goes, so goes California. As California goes, so goes the nation.
San Francisco’s ban on facial-recognition tech is not happening in a vacuum. While companies like Amazon sell Rekognition to the feds and pitch ICE, governments around the world have become enamored of the technology’s dark promise to track people “attending a protest, gathering outside a place of worship, or simply living their lives,” as the ACLU puts it.
“It infringes on people’s privacy and it heavily discriminates against some groups of people.”
We see this frightening reality in the Xinjiang region of Western China, where the government has imprisoned over a million Uighurs in a surveillance-state hellscape. More broadly, a New York Times report from April reveals how the Chinese government is “using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority.”
The details are, frankly, terrifying. “The facial recognition technology,” continued the Times, “which is integrated into China’s rapidly expanding networks of surveillance cameras, looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review.”
It is naive to think that geography or national borders will limit the spread of such pernicious technology. Ecuador, which has installed a network of Chinese-made surveillance cameras around the entire country, is evidence of that. And lest you believe the Land of the Free is immune to such overreach, companies that supply facial-recognition technology, such as Palantir and Amazon, are based in the United States.
On the ground in San Francisco
While the people of San Francisco need not, at present, personally fear a Chinese-style surveillance state, the use of facial-recognition tech by law enforcement does represent a demonstrable threat to civil liberties.
Such technology has higher error rates for people of color and women, and, as the San Francisco-based Electronic Frontier Foundation explains, this has the effect of “[exacerbating] historic biases born of, and contributing to, over-policing in Black and Latinx communities.”
Professor Schoenebeck agrees. “It can be difficult to anticipate how technology will be used,” she wrote, “but in the case of FRT, we already know that it infringes on people’s privacy and it heavily discriminates against some groups of people.”
So, how is this ban going to make things better?
“The Stop Secret Surveillance Ordinance would require City Departments to get Board approval before using or acquiring spy tech, after notice to the public and an opportunity to be heard,” reads an EFF article detailing the effort. “If the Board approved a new surveillance technology, the Board would have to ensure the adequacy of privacy policies to protect the public.”
“Each time we adopt a new technology we need to consider the unintended consequences of its use.”
Nash Sheard, the EFF’s grassroots advocacy organizer, explained over email that this ordinance will help to restore trust and accountability in the city’s government and police.
Specifically, city agencies will no longer be allowed to directly use facial-recognition tech to target, track, or surveil the city’s own residents. And, if local officials decide to contract with a private company that uses the tech, that use will be subject to public oversight.
“Technology has the ability to create better transparency and to help us through our decision-making process,” the EFF’s Sheard observed over the phone, “but each time we adopt a new technology we need to consider the unintended consequences of its use.”
Adam Harvey, a facial-recognition expert and the artist behind CV Dazzle (among many other things), explained over email that regulation backed by law is our best bet at keeping the potential dangers of facial-recognition tech at bay. However, he cautioned that the effort needs to be guided by those who best understand the threat.
“[Ultimately] the solution is a legislative one,” he wrote, “but without technologists stepping in to voice their concerns, create provocations, or alternative technologies, the debate on face-recognition technologies will continue to be guided by lobbyists and undemocratic organizations with little to no regard for civil liberties.”
San Francisco, a city bristling with technologists and activists, seems up to the task of setting the national agenda on regulating facial-recognition tech. We should all hope it succeeds.