Johannes Himmelreich doesn’t identify as an ethicist, per se. “I came to this interest through political philosophy and political theory—that got me into ethics and moral responsibility,” he clarifies. And for a long time, he wouldn’t have identified as a philosopher, either—despite what his buddies back home in Frankfurt, Germany, thought. “When I was 15, some of my friends jokingly called me the philosopher, but I had never considered studying philosophy seriously at all,” he recalls. Himmelreich expected to be a computer scientist—he had even picked out the program—but his interest in political issues spurred him to work as a journalist, too. “That got me into economics and philosophy. I wanted to work on questions of economics, globalization, and philosophy, and at some point I ended up doing a PhD.”
He studied philosophy and public policy at the London School of Economics, and now he is an Interdisciplinary Fellow at the McCoy Family Center for Ethics in Society, combining all those interests with a partnership at Apple. “The collaboration is an opportunity for philosophy to engage with praxis. I’m pretty sure there’s a well-known model like this in bioethics, where hospitals have something like a philosopher-in-residence who advises doctors on questions of consent and the ethics of care,” Himmelreich says. “We have such a model at technology companies where these questions arise. It’s not just about technology, it’s also about business ethics.”
While much of his earlier work led him to study questions revolving around group agency, right now he’s working on ethics and fairness in machine learning. “Machine learning is basically using statistics and data to make better-informed decisions. In many contexts that doesn’t matter ethically—a computer decides how much power the battery is going to use in the next two hours, and that’s an innocuous example. But a very problematic scenario may appear in the criminal justice system: if the LAPD uses artificial intelligence to decide where to send their police cars to detect crime, that might incur racial biases.” It’s not just about making fair predictions using numbers, he insists. “We as a society are looking in the mirror here. We have the data and the analytical tools now. That evidence shows us the biases we have.”
For his line of inquiry, Stanford is the best place to be, Himmelreich thinks. “If you’re interested in questions that range from applied political philosophy and theory to ethics intersecting with technology and metaphysics, I can’t think of any better place to do this kind of work. I’ve really benefited already from this environment, from having a cohort at the same stage in their careers. The postdoc workshop is really like a mini-university—we learn from each other, interact regularly, and help each other with how we approach problems. There’s no way to dream it up any better.”
SARA BUTTON is a writer and editor. She lives in Menlo Park.