Discriminating algorithms

"Nothing on the Internet is neutral"

21 December 2022, by Siegrun Herzog
Sandra Wachter is one of the leading researchers in the field of the Internet and artificial intelligence. In our interview, the law alumna, who was recently appointed professor at the Oxford Internet Institute, explains why it is so important to look behind the decisions of algorithms.
Sandra Wachter is a lawyer and data ethicist at the Oxford Internet Institute, where she conducts research on the legal and ethical aspects of new technologies such as artificial intelligence and machine learning. On 16 January 2023, the law alumna will be a guest at the University of Vienna, where she will give the keynote speech at the panel discussion on the semester question.

Rudolphina: You were recently appointed professor at the University of Oxford. Congratulations! When you made this move, there was obviously no algorithm involved that learns from previous decisions in order to make future ones. You are not an old white man, but a young woman...

Sandra Wachter: Yes, this is a good example, and it is also why I am very glad that certain decisions are still taken by humans.

Rudolphina: For which decision in our everyday lives would an algorithm be handy?

Sandra Wachter: I love conducting research, teaching and working with students, but for a few tasks I can really imagine getting help from algorithms. An algorithm could help me coordinate meetings, plan business trips or check footnotes. At the moment, however, these tasks are still far too complex for an algorithm, which is why I will probably have to do without algorithmic help for the time being.

Rudolphina: You lead the research group Governance of Emerging Technologies at the Oxford Internet Institute, where you deal with legal, ethical and social questions arising from new information technologies. What are the most topical issues at the moment?

Sandra Wachter: We are addressing, for example, Internet regulation, platform regulation, fake news, deep fakes and robotics. I am currently focussing on artificial intelligence. I am looking at how algorithms are changing our lives, for better or for worse. One of my specialisations is data protection. An algorithm can only work if you feed it with data, and without algorithms the data are often too voluminous or too incomprehensible to be of any use. But wherever there are data, there are also problems of data protection. I would like to unravel, so to speak, a little of the mystery of what an algorithm can learn about us without our really being aware of it.

Rudolphina: Could you give an example?

Sandra Wachter: Algorithms are very good at reading between the lines. Based on data about, for example, which car you drive, which shoes you wear or whether you like to eat ice cream on Sundays, an algorithm can make certain predictions about us: whether you are a woman, whether you are black or which religion you belong to. These are things we would not normally associate with such data. We are constantly leaving traces on the Internet, and in doing so we are effectively handing over our diary. Algorithms often take very important decisions about us, for example, whether you have to go to jail, whether you receive a mortgage or whether you are promoted or fired. And since these algorithms are often very complex and opaque, I am interested in the question of how to make them explainable. I want to understand why an algorithm has taken a certain decision, just to make sure that it really was the right decision. Another focus is on justice, fairness and diversity. After all, we can only collect data that have already been generated, and since there are relatively few areas that are not shaped by sexism, racism, heterosexism and so on, we should be clear that the probability of an algorithm being unfair is very high. We have to stay alert and not perpetuate the discrimination of the past without reflecting on it.
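
To make this "reading between the lines" concrete, here is a minimal sketch with synthetic data. The feature names and the correlations below are invented purely for illustration, not taken from the interview or from any real dataset; the point is only that a simple model can recover a protected attribute it never directly observes.

    # Minimal sketch (synthetic data): seemingly harmless behavioural features
    # can let a model infer a protected attribute "between the lines".
    # All names and correlations below are hypothetical illustrations.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000

    # Hidden protected attribute that is never observed directly.
    group = rng.integers(0, 2, size=n)

    # Innocuous-looking proxies that happen to correlate with the group:
    # car type, shoe brand, Sunday ice-cream purchases.
    car_type = (rng.random(n) < 0.3 + 0.4 * group).astype(int)
    shoe_brand = (rng.random(n) < 0.5 + 0.3 * group).astype(int)
    ice_cream = (rng.random(n) < 0.2 + 0.5 * group).astype(int)

    X = np.column_stack([car_type, shoe_brand, ice_cream])
    X_tr, X_te, y_tr, y_te = train_test_split(X, group, random_state=0)

    model = LogisticRegression().fit(X_tr, y_tr)
    print(f"Accuracy inferring the hidden attribute: {model.score(X_te, y_te):.2f}")
    # Well above the 50% chance level: the proxies leak the attribute.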

Rudolphina: You are saying that we are actually all being discriminated against on the Internet without knowing it. How? And what is the difference between discrimination on the Internet and in the analogue world?

Sandra Wachter: In the analogue world, people affected by discrimination usually notice it and can fight back. Algorithms discriminate against us behind our backs. For example, if I am looking for a job, as soon as I open a browser it is already clear who I am: my gender identity, my sexual orientation, my ethnicity, my religion, my age and, where applicable, my disability. This means that jobs are not displayed to me in an unfiltered way; the algorithm decides what I should see. It may therefore be that some jobs are never displayed to me because the algorithm has filtered me out as unsuitable. But I do not know this at all.
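
A hypothetical sketch of what such invisible filtering might look like on an ad platform; the profile field and the targeting rule below are invented purely for illustration.

    # Hypothetical sketch: a platform silently drops job ads whose targeting
    # excludes the (inferred) user profile. The user never learns what was hidden.
    from dataclasses import dataclass

    @dataclass
    class JobAd:
        title: str
        target_max_age: int  # an invented targeting rule, for illustration only

    ads = [JobAd("Senior engineer", 35), JobAd("Junior analyst", 99)]

    def visible_ads(inferred_age: int, ads: list[JobAd]) -> list[JobAd]:
        """Return only the ads whose targeting matches the inferred profile."""
        return [ad for ad in ads if inferred_age <= ad.target_max_age]

    # A 40-year-old never sees the senior role -- and never learns it existed.
    print([ad.title for ad in visible_ads(40, ads)])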

Panel discussion: What is digitalisation doing to democracy?

As part of the current semester question, our experts look from different perspectives at how much algorithmic decision-making democracy can take – and how the digital transformation can help to strengthen democracy again. Join the discussion at the panel event (in German) on 16 January 2023 at 6 p.m. in the Great Festival Hall, with, among others, the lawyer and data ethicist Sandra Wachter from the University of Oxford.

"In the analogue world, persons affected by discrimination usually notice it and can fight back. Algorithms discriminate against us behind our backs."
Sandra Wachter
You may also read
Election campaigns on the Internet
Democracy can only work if citizens are able to take informed decisions. But what happens if a news feed is so incredibly personalised that nobody really knows what is actually presented to us, and from whom. Sophie Lecheler and her team at the University of Vienna are using experiments and 'data donations' to see what users see – and to understand political discourse in times of digitalisation.

Rudolphina: What can we do against this hidden discrimination? Is awareness-raising already the first step?

Sandra Wachter: The law is toothless here. A complaints procedure, as you know, can only help a person who actually perceives the discrimination. The law was not made to prevent an algorithm from harming people; we urgently have to mend it on this point. We know from statistics that 80 to 90 per cent of people believe that the search results Google displays to them are neutral. But nothing on the Internet is neutral. We should keep this in mind. We are all sitting in a filter bubble that is tailor-made for the person the algorithm has determined us to be. Therefore, we must find other ways of making sure that people are not disadvantaged. What we can do, and what I advocate, are so-called bias tests. I consider these the duty of companies and organisations: they must prove that the algorithm they use treats all groups equally.
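
As an illustration, here is a minimal sketch of one common form of bias test, a demographic-parity check on synthetic data. The 80 per cent "four-fifths" threshold and the data are assumptions made for illustration, not necessarily the specific test Wachter has in mind.

    # Minimal sketch of a demographic-parity bias test on synthetic data.
    # The four-fifths threshold is a common heuristic, used here purely
    # as an illustrative assumption.
    import numpy as np

    def selection_rate_ratio(decisions: np.ndarray, groups: np.ndarray) -> float:
        """Ratio of the lowest to the highest positive-decision rate across groups."""
        rates = [decisions[groups == g].mean() for g in np.unique(groups)]
        return min(rates) / max(rates)

    rng = np.random.default_rng(1)
    groups = rng.integers(0, 2, size=1000)  # two demographic groups
    decisions = (rng.random(1000) < 0.4 + 0.2 * groups).astype(int)  # a biased model

    ratio = selection_rate_ratio(decisions, groups)
    print(f"Selection-rate ratio: {ratio:.2f}")
    if ratio < 0.8:  # the "four-fifths rule" heuristic
        print("Possible disparate impact: one group is favoured over another.")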

Rudolphina: How do you assess the willingness of companies to change something in this respect?

Sandra Wachter: By now, many are aware of the problem of biases in AI and algorithms. However, people disagree on what fairness is and how to operationalise it. I argue that fairness is not something binary, simply a 1 or a 0. Fairness depends on the context and moves on a continuum between 0 and 1. It is fluid, shaped by historical and cultural background, and not just a simple mathematical system. In any case, I think that active efforts to safeguard fundamental and human rights will attract consumers in the future. It would be a nice side effect if firms that do the right thing from an ethical and legal point of view were rewarded for it.

Rudolphina: What is your greatest concern regarding current or future technological developments?

Sandra Wachter: I am greatly alarmed by the assumption of inevitability – that we have to use a technology just because we can. Technology should be used to improve our lives. But currently, algorithms are often used not to make better decisions but to save costs.

Rudolphina: What are the greatest opportunities at the moment, in your opinion?

Sandra Wachter: In medicine, but not to replace doctors or to eliminate care. AI is a new technology that can help us make better decisions, for example in the early detection of skin cancer. Today, algorithms are already very good at detecting cancer on white skin, but not on dark skin. You could therefore design algorithms specifically to improve cancer screening for non-white people.

Rudolphina: Your grandmother was one of the first women to study at the Technical University of Vienna. Did she influence your interest in technology?

Sandra Wachter: My grandmother strongly shaped my conviction that technology and mathematics on the one hand and being a woman on the other are never a contradiction. Even as a child, I was interested in technology because I had the feeling that it can be something good for our society. Law simply lent itself as my way in. I am not actually the person who creates technology, but perhaps the one who builds a firewall to guarantee that technology is useful for society.

Rudolphina: Do you have any tips for our readers? Mistakes that are easy to avoid on the Internet?

Sandra Wachter: It is important to understand how valuable our private data are. Entering your e-mail address might seem harmless, but when you keep in mind how persistently others try to get our data, we can safely assume that they are valuable. This means that I hold the valuable thing in my hands; I have control and, in fact, I have the power. Often there are alternative solutions. For example, you can use browsers that are more considerate of your privacy, such as Firefox, or search engines such as DuckDuckGo, where your steps on the Internet cannot be traced as easily. But really this is like walking down a street full of video cameras and giving somebody hints on how to avoid being seen by putting on a hat or a scarf. That is not freedom. Freedom means that I can go out however I like and I am left alone. My wish would be that I do not have to constantly protect my identity on the Internet. In my opinion, that would be freedom, justice and democracy.

How does digitalisation change democracy?

Sandra Wachter: "Technology can strengthen democracy or counteract it. It may even destroy it. The task of law is to guarantee that fundamental rights can be exercised. I want to enjoy technology and do things that would not be easily possible in an analogue way, without being afraid that somebody finds out about by political views, who my God is or any other private information that I do not want to reveal."

Sandra Wachter is Professor of Technology and Regulation at the Oxford Internet Institute at the University of Oxford, where she leads the Governance of Emerging Technologies research group. In her research, she addresses legal, ethical and social questions arising from the use of new information technologies. Sandra Wachter studied Law at the University of Vienna.