The City of San Francisco voted to ban facial recognition technology on Tuesday, May 14th, 2019, but for many, the move is sparking wider conversations about public safety crowd-sourcing tools.
A Closer Look Into Surveillance Technology in San Francisco
In March 2009, Denise Green - a 47-year-old woman with no previous criminal record - was driving her burgundy Lexus on Mission Street in San Francisco when her vehicle passed an SFPD police cruiser equipped with an automated license plate reader (ALPR). The ALPR misread the Lexus's license plate and matched it to a stolen grey GMC truck.
A police officer patrolling in his vehicle nearby was made aware of the match and, after confirming with officers in the police cruiser the vehicle they had seen was a burgundy Lexus, the patrol officer ordered a “felony stop”. When backup arrived, Green was stopped, ordered out of her car at gunpoint, and handcuffed. Her subsequent claim against the City of San Francisco was settled for $495,000.
Since the settlement of the case, the City has been reviewing its use of surveillance technology, and in May it was reported that the Board of Supervisors had passed an ordinance which - among many other measures relating to the use of surveillance equipment - bans the use of facial recognition technology by fifty-three of the city's civic departments, including the San Francisco Police Department.
While supporters of the ban celebrated a victory for civil rights, and opponents complained about the burden compliance would place on the police, the transit system, and other public safety agencies, both sides appear to have missed one important consideration: the San Francisco Police Department doesn't use its own facial recognition systems.
SFPD did run some pilot tests of the technology between 2013 and 2017, but given the existing proliferation of privately-owned surveillance cameras with facial recognition capabilities in San Francisco, it was considered a waste of money to duplicate what those systems were already doing. Instead, SFPD accesses privately-owned systems to help solve crimes and support prosecutions.
The recently passed ordinance won't change SFPD's allowable access to privately-owned systems; it will just make the police more accountable for the way in which data is obtained and used. However, if SFPD - or any of the other fifty-two departments to which the ban supposedly applies - wants to use its own systems, a clause in the ordinance gives it the power to do so.
The “Exigent Circumstances” Clause Circumnavigates the Ban
§19B.2(d) of the ordinance (PDF) states: “it shall be unlawful for any Department to [intentionally] obtain, retain, access, or use: 1) any Face Recognition Technology; or 2) any information obtained from Face Recognition Technology”. That section is perfectly clear, and it comprises the two lines of the twenty-one-page document on which the media focused.
Deeper into the document, §19B.7(a) states: “a Department may temporarily acquire or temporarily use Surveillance Technology in exigent circumstances without following the provisions of this Chapter 19B” - effectively, the ban doesn't apply in “exigent circumstances”. So, what are “exigent circumstances”? And who decides when they exist?
According to the definitions provided in the ordinance, “exigent circumstances” are events in which there may be “imminent danger of death or serious physical injury to any person”. If you interpret this as any event in which there is a risk of serious physical injury, you have to include activities such as crossing the road - an interpretation that would justify the police's day-to-day use of facial recognition technology.
There Needs to be More Discussion about Surveillance, Security, and Accountability
There is no doubt that both supporters and opponents have legitimate arguments for and against the supposed ban. Nobody wants to see another incident like the wrongful detention of Denise Green, and it is right that the San Francisco Police Department and other civic departments are held accountable for how they obtain, retain, access, or use the personally identifiable information of San Francisco's citizens.
However, there needs to be more discussion about surveillance, security, and accountability in public safety. If other jurisdictions decide to copy the apparently meaningless ordinance passed by San Francisco's Board of Supervisors, the result will be additional burdens on public safety officials and more resources needed to manage them. Or, in other words, more demands on taxpayers.
There are multiple ways in which public safety officials can cost-effectively enhance public safety using crowd-sourced data that doesn't violate privacy standards or risk data breaches. Instead of banning the use of publicly-owned facial recognition systems that aren't being used anyway, maybe we should be discussing these alternative solutions for enhancing public safety.