AI is changing the fabric of society - it is crucial to ensure it serves people, not the other way around.

A conversation with Seda F. Gürses

With an undergrad degree in international relations and mathematics, Seda is now an academic at Delft University, focussing on how to do computer science differently: utilising an interdisciplinary approach to explore concepts of privacy and surveillance, and to look at what communities need in the face of the increasing use of AI, or what she calls logics of optimization in digital systems. She was led into this field of study by her fascination with the politics of mathematics and the biases contained within seemingly neutral numbers.

The technological landscape has changed enormously in the past few years — from static software saved on a disk and updated only every once in a while, to software and apps delivered as services and so constantly updated and optimised. In addition to existing concerns around privacy and security, a whole host of new harms and risks have arisen that require the attention of computer scientists to find other ways of securing and protecting users, non-users and their environments.

The negative consequences of prioritising optimisation over user experience can be seen in Google Traffic and Waze, which send users down surface roads to avoid freeway traffic. These companies don’t care that this may have an adverse impact on the environment and local communities, or even that it may actually increase congestion overall. Further, Uber has optimised its system to outsource risk to its workers: instead of paying people for the time they work, Uber offers them a system that tells them when they are most likely to get customers, casting “labor optimization” as paying people only for when they pick up a customer, and externalizing all further risks associated with the job, e.g. waiting time, sick leave, and accidents, onto drivers.

When this kind of tech injustice is applied to public institutions such as borders and social welfare systems, the optimization logic and injustices embedded in the very systems mean we are changing the fabric of society without having the necessary discussions as to whether that’s something we want to do. We need to stop focusing so much on data and algorithms and pay more attention to the forms of governance we want these technologies to have. It is crucial that the computational infrastructure boosted by all the talk and funding invested in “AI” serves people, not the other way around.

Join our workshop on inequalities in the age of AI: what they are, how they work, and what we can do about them. Registration: https://register.edgeryders.eu

More info: https://edgeryders.eu/t/workshop-on-inequalities-in-the-age-of-ai-what-they-are-how-they-work-and-what-we-can-do-about-them-19-11-brussels/10326/60

Hey Inge, this looks great. Here are some suggestions for changes.
I didn’t hear from you about how you would like me to edit the longer version. Let me know and I will get to it later in the afternoon!

Thanks again for pulling all of this together.

s.


I think it was mostly to ensure that it (the longer version) accurately represents what you were saying, and to make edits where you feel they are needed to convey what you mean.

Thanks, Nadia. I will get to this later in the afternoon. Need to run to an event now. Thank you!
s.
