Theme 2: The Regulation of Algorithms

Emancipating Users Against Algorithmic Biases

This theme encourages participants to think critically about who (if anyone) should be responsible for ensuring that algorithms are fair and unbiased, including the roles of wider society and of personal responsibility.

Background Information

The discussions in Theme 1 will help to educate and inform participants, and encourage them to reflect on the systems that might govern their digital worlds. This part of the session (Theme 2) begins by using examples to help participants consider how bias can arise from the use of an algorithm, and the dangers this may pose.

Building on these foundations, participants are encouraged to think about how algorithms might be regulated. This part of the Youth Jury aims to:

  • Discuss the challenges presented by the use of algorithms, which may lead to discrimination, inequality and injustice.
  • Spark a debate among participants about whether, what and how algorithms should be regulated, and who should be held accountable if and when algorithms go wrong.

The regulation of algorithms is a topic receiving considerable attention in the news media and in government. The main issues concern exactly what needs to be regulated, and whose responsibility it is to regulate the use and fairness of algorithms – the government? The big tech companies? Someone else?

Running Theme 2

Download the Resource Pack

The main activity in this theme is the Jury scenario, in which participants are presented with a case and asked to reach guilty/not guilty verdicts on the parties involved. It is helpful to introduce some real-life examples of discrimination and bias that have occurred due to the use of algorithms – there are some suggested links on the Further Information page. In running our Youth Juries we found that introducing and discussing two or three examples at the end of Theme 1 allowed the jurors to consider the issues and talk to each other during the break, so that they were primed and ready to discuss them in Theme 2.

Task 5: Who is responsible when an algorithm goes wrong?

5 minutes

The introduction to this task is important, as the aim is to get the group to consider both sides of the argument about who is responsible for problems that may arise from the use of an algorithm. It is also a good idea to keep the black box on view throughout this theme and the remainder of the Youth Jury, referring to it whenever participants mention ‘the algorithm’ or ‘their data’.

Two suggested cases are included in the resource pack (plagiarism or inappropriate content), but you may prefer to use one of your own that is relevant to your group. In general, there should be a clear description of the problem, followed by discussion of who could be to blame for what goes wrong, including the company that uses the algorithm and the individual who created it. There are then three ways to run this activity.

15 minutes

  1. You may wish to pick one of the potential guilty parties to ‘put on trial’. Split the group into a ‘Prosecution’ and a ‘Defence’, and give the two groups time to discuss their arguments separately before reporting back to the whole group in a mock trial. The whole group can then vote on whether the chosen party is guilty or not guilty.
  2. Another method is to ask participants to vote guilty/not guilty on each of the options after a group discussion of each, including discussion of anyone else who might share the blame. This is the method we used in our Juries.
  3. You could also combine the first two methods: start with a discussion of each option, use an initial guilty/not guilty vote to decide who to prosecute and defend in the mock trial, and then vote again on the final outcome. This works well if you have plenty of time left for the jury.
