Omar Solis
Bio
Omar Solis is a software engineer and privacy advocate with a passion for ethical technology and public accountability, particularly in the realm of algorithmic transparency. He holds a degree in Computer Science from Stanford University (Class of 2018) and has over six years of experience in software development and data science, having contributed to initiatives at the San Francisco Mayor’s Office of Innovation, seed- to late-stage startups, and large tech companies.
As an Ethics and Technology Practitioner Fellow, he is leading a project to build an open data portal that allows and encourages communities to track and visualize surveillance technologies, such as Automatic License Plate Readers (ALPRs). In addition to crowdsourcing ALPR locations, the portal will integrate other related datasets to support deeper analysis of surveillance impacts. Omar’s work aims to empower communities to understand and challenge the societal effects of AI-driven surveillance, with a particular focus on its impact on marginalized communities.
Fellowship Project
Surveillance technologies powered by AI are deployed throughout cities in the USA by both the public and private sectors. They are built by private companies such as ShotSpotter, Flock Safety, and Ring. While these technologies promise increased public safety, their use raises many ethical questions, along with many unknowns that could decrease safety for many people, including members of marginalized communities.
One example is Automatic License Plate Readers (ALPRs), produced by Flock Safety and deployed by law enforcement in San Francisco. The city has deployed at least 400 of these cameras at undisclosed intersections. The cameras capture video and images of every vehicle that drives through their line of vision, use image recognition to parse the make, model, color, alterations, and other unique identifying information, such as decals and bumper stickers, and store the data in an instantly searchable database that law enforcement can query. If a person doesn’t want to be tracked in this manner, their only option is not to drive, since it is illegal to cover a license plate. Once a driver’s data is captured, it can be stored in the database for 12 months and shared with other in-state law enforcement agencies.
Mass surveillance of every driver not only erodes public trust in the government; it has more immediate effects. ALPRs have caused false arrests due to errors in image recognition, enabled an officer to stalk his ex-wife, and can dissuade lawful protesters who fear retaliation. Sharing the data with third parties is also a concern. Even if the data is shared with good intentions, the initial gatherers, such as the SFPD, lose control of what happens to the data once it leaves their hands, and it could end up with other agencies such as ICE.
Many questions remain about ALPRs and other AI-powered surveillance cameras in San Francisco, the Bay Area, and the rest of the nation. How much of the public knows they are being surveilled in this manner? Does the public want this surveillance, especially once they know the risks? Is public safety actually increasing because of it? Who is public safety increasing for, and who is it decreasing for? Are the locations of these cameras creating a predictive-policing feedback loop in marginalized communities? Do the creators of these technologies understand the risks when developing their products, and do they properly inform their clients?
While this is a nationwide issue, I want to first focus on local communities in the Bay Area. Because every community's opinions, laws, and culture differ across the nation, this narrower scope lets me tackle the issue in ways specific to this community's needs.
With my project, I intend to take several actions. First, I want to understand the public’s awareness of AI-powered surveillance technologies in their cities and how people think those technologies affect them. I plan to gather this knowledge through surveys administered to a diverse population across the Bay Area. Then, I want to raise awareness of how AI-powered surveillance technologies such as ALPRs can harm communities, through workshops and presentations in the community. Simultaneously, I want to build on existing tracking technology to create an app that allows community members to verifiably mark the locations of ALPR cameras on a map. I believe this app will build community engagement and understanding, provide transparency and public oversight, and generate valuable data for research. While others have built similar maps, their data has either become outdated or remains unverified, and those efforts didn’t engage a large portion of the community. The map data can be combined with other datasets, such as public demographic data, to understand how certain communities are surveilled compared to others. Finally, I hope to organize communities to push for the local surveillance policy changes they believe are best for them.