A conversation with Seda F. Gürses
With an undergraduate degree in international relations and mathematics, Seda is now an academic at Delft University of Technology, focusing on how to do computer science differently: using an interdisciplinary approach to explore concepts of privacy and surveillance, and to ask what communities need in the face of the increasing use of AI, or what she calls logics of optimisation in digital systems. She was led into this field of study by her fascination with the politics of mathematics and the biases contained within seemingly neutral numbers.
The technological landscape has changed enormously in the past few years: from static software shipped on a disk and updated only once in a while, to software and apps delivered as services, constantly updated and optimised. Alongside existing concerns around privacy and security, a whole host of new harms and risks has arisen, requiring computer scientists to find other ways of securing and protecting users, non-users and their environments.
The negative consequences of prioritising optimisation above all else can be seen in Google Traffic and Waze, which send users down surface roads to avoid freeway traffic. These services don't care that this may have an adverse impact on the environment and local communities, or even that it may actually cause more congestion overall. Uber, likewise, has optimised its system to outsource risk to its workers: instead of paying people for the time they work, Uber offers them a system that tells them when they are most likely to get customers, casting "labor optimization" as paying people only for when they pick up a customer, and externalising all further risks associated with the job, such as waiting time, sick leave and accidents, onto drivers.
When this kind of tech injustice is applied to public institutions such as borders and social welfare systems, the optimisation logic and the discrimination embedded in those very systems mean we are changing the fabric of society without having the necessary discussion as to whether that is something we want to do. We need to stop focusing so much on data and algorithms and pay more attention to the forms of governance we want these technologies to have. It is crucial that the computational infrastructure boosted by all the talk and funding invested in "AI" serves people, not the other way around.
Join our workshop on Inequalities in the age of AI: what they are, how they work, and what we can do about them. Registration: https://register.edgeryders.eu