San Francisco lawmakers voted Tuesday to bar the use of facial recognition software, making it the first major city to ban such technology and likening it to the Big Brother-ish tactics of authoritarian states.
The ban, introduced earlier this year by Supervisor Aaron Peskin, passed 8-1, with two supervisors—Hillary Ronen and Shamann Walton—absent.
The text of the legislation frames the ruling as a buttress for residents’ civil rights:
Surveillance efforts have historically been used to intimidate and oppress certain communities and groups more than others, including those that are defined by a common race, ethnicity, religion, national origin, income level, sexual orientation, or political perspective.
The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.
Peskin, during Tuesday’s hearing, noted that similar software “is being used around the world for mass surveillance of minority groups” by autocratic states, like China.
The law defines facial recognition technology as “an automated or semi-automated process that assists in identifying or verifying an individual based on an individual’s face.”
A variety of facial-recognition programs are commercially available. According to Mountain View-based security company Norton, “A facial recognition system uses biometrics to map facial features from a photograph or video.”
(The Department of Homeland Security defines biometrics as “unique physical characteristics [...] that can be used for automated recognition.”)
“It compares the information with a database of known faces to find a match,” notes the Norton page on facial recognition.
A database may contain still photos or video that the computer reads to calculate “the geometry of your face” based on subtle indicators like “the distance between your eyes and the distance from forehead to chin.”
The program renders this information as an equation it can then use to match the same person in different photos or footage—at least in theory.
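The matching process described above can be sketched in miniature. The snippet below is a toy illustration only, not a real facial-recognition system: the measurement names, values, and the distance threshold are all hypothetical, chosen to show how geometric features could be compared against a database of known faces.

```python
import math

# Toy sketch of the matching step: each "face" is reduced to a vector of
# hypothetical geometric measurements (e.g. distance between the eyes,
# distance from forehead to chin), and two faces "match" when their
# vectors are sufficiently close. Real systems use far richer features.

def face_vector(eye_distance, forehead_to_chin):
    """Pack the measured facial geometry into a feature tuple."""
    return (eye_distance, forehead_to_chin)

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(a, b, threshold=2.0):
    """Declare a match when the vectors differ by less than a threshold."""
    return distance(a, b) < threshold

probe = face_vector(62.0, 121.0)           # face seen in new footage
database = {
    "person_a": face_vector(61.5, 120.4),  # geometry close to the probe
    "person_b": face_vector(70.2, 130.0),  # clearly different geometry
}

matches = [name for name, vec in database.items() if is_match(probe, vec)]
print(matches)  # only person_a falls within the threshold
```

The threshold is where the trouble described below begins: set it too loosely and the system returns false positives, misidentifying strangers as people in the database.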
An MIT study from earlier this year found that some programs were more likely to return false positives on subjects with darker skin, prompting anxiety about the hazards of computerized misidentification.
Peskin specifically noted on Tuesday that, even when it works correctly, he considers such programs a potential danger to privacy rights. The ACLU of Northern California agrees, calling facial recognition “a highly-invasive technology” in a statement after the vote.
The ban does not extend to the Port of San Francisco or San Francisco International Airport, which are federally controlled. Private bodies may still use the software.
Supervisor Catherine Stefani cast the only vote against the measure, raising concerns that the city could be scapegoating technology for political reasons. Even with Stefani’s nay vote and the two absences, the ban has enough backing for a veto-proof majority if passed on a second reading next week.
Oakland and Berkeley are considering similar legislation.