The Cost of Being Seen: How Expanding Bay Area AI Surveillance Challenges Community Trust
Photography by Patrick Beaudouin
San Francisco and the Bay Area have recently seen a new wave of surveillance technology rolled out across cities. In the past year, San Francisco introduced at least 400 of Flock Safety’s Automatic License Plate Readers (ALPRs). These surveillance cameras track, recognize, and store an immense amount of data on each car that passes through their line of vision. Using image recognition technology, the cameras can identify a vehicle’s make, model, color, alterations, decals, and bumper stickers, all of which are stored for 12 months in a database that law enforcement can query.
With no way to opt out of this surveillance, how do citizens feel it impacts their privacy and safety, and how might it affect their trust in government officials? These are the questions Omar Solis is trying to answer. Solis, an Ethics and Technology Practitioner fellow at the McCoy Family Center for Ethics in Society, is working to increase the transparency of ALPR surveillance and make the data collected more accessible to the public.
Solis mentioned how the government’s access to ALPRs has led to some cases of abuse of power, such as an officer stalking an ex-wife, officers allegedly tracking a woman in Texas after an abortion, and false arrests being made. Cases like these leave citizens wondering if new technological innovations are undermining the privacy and trust that communities rely on.
For Solis, this project started when San Francisco voters supported a ballot proposition to increase drone and surveillance tools for police use, a large shift from when the city’s Board of Supervisors banned city agencies from using facial recognition tools in 2019.
“Giving this ability to the police could cause a negative impact on communities,” Solis said. He wondered how the city moved from being anti-surveillance to now embracing new policies. “I wanted to know why [the adoption of surveillance] shifted in people's mindset.”
After graduating from Stanford with a degree in computer science, Solis worked for the city of San Francisco as an engineer, recognizing the emphasis placed on technological advancements. “Even just as a resident of San Francisco, I've noticed that local government has a really big push for using AI technologies because they want to be seen as technologically progressive.”
As a volunteer for the Citizens Privacy Coalition, a grassroots organization that aims to combat the lack of awareness regarding government and corporate surveillance in Silicon Valley, Solis is invested in ensuring the public understands how ALPR cameras are changing and affecting communities.
Through the McCoy Family Center’s Ethics and Technology Practitioner Fellowship, Solis researched and synthesized key data points to aggregate and compile the information the cameras capture, with help from research assistants Esha Thapa B.S. ’25, Angela Ngoc Nguyen B.S. ’27, and Bolu Aminu B.S. ’27. From there, he created several maps and data figures that show the locations of the cameras alongside city demographics such as income, race, crime, and traffic. He also conducted numerous surveys and interviews with citizens to gauge their awareness of ALPR surveillance and gather their thoughts about it. All this information is now available on the project’s website.
For Solis, this transparency helps create a better connection between the public and the government’s use of surveillance. “It's possible that [surveillance] can help public safety, but I think it's not being scrutinized and [doesn’t have] enough guardrails,” he shared. “When new technologies are used, they do need to be heavily scrutinized and constantly watched over to make sure that they aren't causing harm or potential harm.”
With surveillance technology expanding faster than policy can keep pace, there are few guardrails to monitor and control how these cameras are implemented and used, especially in California. Solis shared his findings in November at the fellowship’s closing Project Showcase. He is now finalizing the report and website, which will help citizens across the Bay Area better understand the evolution of surveillance and become more informed about its impacts on their communities.
“A lot of guardrails have been coming off because people are pushing for progress [of technology],” Solis explained, “and I think it's okay to slow down and try to figure out if the technology you're working on is actually causing harm or not.”