A PhD researcher at the Oxford Internet Institute, @katejsim studies the intersection of gender-based violence and emerging technologies. Her work focuses on issues of trust, gender and sexual politics, and the double-edged role of technology in facilitating both connection and targeted harassment. While organising against campus violence, she personally experienced cyberharassment and a lack of support from law enforcement. More resources have become available since then, but we still need to change how we conceptualise these issues and fundamentally rethink the design of the platforms themselves.
She helped to form a cross-campus network that grew into a nonprofit organisation, Know Your IX. This space needs better structures to support mental health and protect against cyberharassment, in order to reduce burnout. Research shows again and again that women, especially women of colour, tend to self-censor and reduce their visibility in order to survive; it is crucial that we put more safeguards in place to protect them.
Digital systems designed to facilitate disclosures, collect evidence and automate reporting of sexual assault are attractive to institutions because of their efficiency, and to some extent to victims, because they are perceived as objective and neutral. However, these systems have bias encoded in them. Their designers work from their own understanding of sexual violence, which may not match victims’ experiences. Some victims lack the data literacy or English proficiency to navigate these systems, which can compound their trauma. Further, the pressure to report is built into the design of these systems, reflecting a misguided emphasis on a single optimal solution that is not appropriate for all victims. De-emphasising reporting and focussing on “small data” driven by relationship building can create a structured conversation that is rich, insightful and telling.
Rather than asking how tech can be fixed for the better, the more urgent and important question is: who and what are we overlooking when we turn to tech solutions? How can we support practitioners in the anti-violence space, such as social workers, jurors, judges and advocates, with data and tech literacy, so that they have control over how they interpret and act on data?
Participate in the conversation with Kate here: Can tech design for survivors? How sex, violence, and power are encoded into the design and implementation of data/AI-driven sexual misconduct reporting systems
Join our workshop on Inequalities in the age of AI: what they are, how they work and what we can do about them. Registration: https://register.edgeryders.eu