Cybersecurity Workers Need to Learn From Victims

Nearly two years ago, Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, tweeted, “If you are a woman who has been sexually abused by a hacker who threatened to compromise your devices, contact me and I will make sure they are properly examined.” Despite her vast expertise, Galperin didn’t anticipate what happened next: her inbox was flooded with requests for help from survivors of domestic abuse, a flood that continues to this day. Galperin responded by launching a multi-pronged campaign against stalkerware.

Abusers install stalkerware to surveil, harass, and control intimate partners without their knowledge, tracking their every conversation and movement. Galperin is pushing for change in the antivirus industry and at companies like Apple, and is calling on officials to use “their prosecutorial powers to indict executives of stalkerware-selling companies on hacking charges,” as Wired reported. The Russian security firm Kaspersky has taken up the fight aggressively enough that Galperin praises it for “raising the bar for the entire security industry”: it detected 518,223 cases of stalkerware (both successful and unsuccessful installation attempts) during the opening months of 2019, a 373% increase over the same period the previous year. Clearly, much work remains to be done to protect the privacy of vulnerable people.

For the women who reached out to Galperin, stalkerware is often the most potent privacy threat they face. Unsurprisingly, it was largely invisible to the male developers designing security software, and this is far from an outlying case. Silicon Valley’s well-documented diversity failures blind developers to the needs of many of their most at-risk customers. Countless organizations in the nonprofit space develop free tools and resources to promote privacy, but these materials are often built around the problems experienced by the straight white men who design them. Yet this privileged audience often faces the fewest privacy threats, remaining oblivious to tools like stalkerware and to the vulnerability of over-policed communities.

The experiences of vulnerable communities, including women targeted by domestic abusers, need to be considered and represented in the design process, and more tools need to be made available to them. We, a philosophy professor who teaches privacy (Selinger) and the executive director of a nonprofit advocacy organization fighting excessive surveillance (Cahn), formed a semester-long partnership to address this issue. We brought students together with a grassroots community group, exploring a new model of collaborative development that moves beyond generic cyber-hygiene strategies and paternalistic assumptions about how to help people whose privacy is under threat.

We contacted direct service providers around New York City whose clients face immediate privacy concerns. Turning Point for Women and Families (TPNY), a Queens-based nonprofit serving survivors of domestic violence in the Muslim, Arab, and South Asian communities, was an ideal partner. Its clients face a dual privacy challenge: they need to safeguard their data from both government surveillance and their former abusers. Students had a rare opportunity to work on a service-learning project that could make a life-or-death difference in the real world.

The students were asked to create a surveillance training kit to better protect the privacy of the clients TPNY serves. The kit contains handouts and lesson plans on limiting a stalker’s access to location data, protecting against unwanted calls and messages, strengthening passwords and access controls, checking for unauthorized account access, and limiting one’s online footprint. This frontline work taught the students far more about the surveillance of minority communities than a textbook ever could.

Some privacy literature, like Shoshana Zuboff’s The Age of Surveillance Capitalism, explores structural problems, such as tech companies monetizing consumers’ behavioral surplus. Other works, like Chris Gilliard’s article “Friction-Free Racism,” compellingly describe real-world privacy threats and how they affect minorities. Yet students from relatively privileged backgrounds are inclined to read Gilliard’s writing from a safe cognitive and emotional distance. Projects like building a surveillance training kit put students in touch with people, like TPNY’s staff, who have the authority to correct students’ misunderstandings and challenge their biases.

Students were repeatedly reminded that they should begin the design process by asking, “What do other users need?” This consciousness-raising is a central benefit of working with individuals who will help shape commercial software products for years to come. In this way, students not only help to undo the damage done by prior iterations of design-based discrimination, but also learn an invaluable lesson about the biases and presumptions they will bring to their own professional projects.

The students challenged our biases, too. In a moment that surprised both of us, it became clear that they were holding back when proposing surveillance scenarios to remedy. They feared that realistic examples, such as an abuser stalking a victim’s social media feed or surreptitiously reading her private texts, might be triggering for TPNY’s clients, even as staff insisted that these sorts of prompts were essential to an effective training.

Such collaborations are especially urgent so long as the student bodies of many of America’s technical programs remain disproportionately white and male. They reinforce the lesson that every single one of us has an affirmative moral duty to remedy discriminatory product design that benefits some privileged users at the price of ignoring countless others.

Evan Selinger is a professor of philosophy at Rochester Institute of Technology.

Albert Fox Cahn is the executive director of the Surveillance Technology Oversight Project (S.T.O.P.).