Theme 3: Algorithm Transparency

Emancipating Users from Algorithmic Bias

This theme asks participants to consider what information they would need in order to make informed decisions about, or feel more comfortable with, the use of algorithms in their online world.

Background Information

Previous parts of the Youth Jury will have helped participants to think about and reflect on their own online activity, and how an algorithm may influence the information that they see online. You will also have discussed the benefits and potential concerns of algorithms governing online activity, as well as how algorithms might be regulated to reduce the risk of discrimination against particular groups in society.

This part of the Youth Jury (Theme 3) will encourage participants to think about whether social media platforms and others should be responsible for making the algorithms they use more transparent, so that individuals know how an algorithm functions and how it might affect their online lives. The session will aim to:

  • Ask participants their views on algorithm transparency. Is algorithm transparency important? If so, why?
  • Encourage participants to reflect on what algorithm transparency might look like. How should this be communicated to people in everyday life?
  • Discuss any further recommendations that participants might have in relation to changes that could be made to the way that the internet currently functions.

Algorithm transparency is often presented in the media as a solution to people’s concerns about the use of algorithms in online (and other) systems. But what is transparency?

Running Theme 3

Download the Resource Pack

This theme is carried out towards the end of the jury, once participants have had the chance to consider the use and regulation of algorithms.

The main suggested task for this theme is the ‘reveal’ of the Black Box, which has been used throughout the rest of the youth jury to represent an algorithm and should by now be full of data cards and potentially other suggestions. There is also a wrapping-up activity:

  1. Revealing the Black Box (Task 6)
  2. Wrapping up the Jury (Task 7)

Task 6: What would you like to know about how internet companies use algorithms?

10 minutes

Reintroduce the black box, which should now contain all the data cards and other data-related activities, and remind the group that it is often difficult (or impossible) to know how an algorithm came to a particular decision. Give some more examples of decisions that algorithms make about people online.

Each side of the black box can be uncovered to reveal an illustration of a different type of ‘algorithm transparency’; for example, we used source code, flow charts, Facebook’s EdgeRank visualisation, and an entirely transparent side that simply shows the contents of the box without context. Reveal each side in turn, and discuss whether it would help the group to understand the algorithm or to trust the company that is using it.
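If facilitators want a concrete ‘source code’ example to show on that side of the box, the sketch below may help. It is not Facebook’s actual implementation; it is a toy illustration, assuming only the publicly described shape of EdgeRank (a score built from affinity, edge weight, and time decay), and the names, weights, and example posts are invented for discussion purposes.

```python
# Toy feed-ranking sketch in the spirit of EdgeRank.
# Purely illustrative -- NOT Facebook's real algorithm.
# score = affinity * edge_weight * time_decay

def score_post(affinity, edge_weight, hours_old, half_life=24.0):
    """Score one post: closer friends (affinity), heavier interaction
    types (edge_weight), and fresher posts all raise the score."""
    time_decay = 0.5 ** (hours_old / half_life)  # halves every `half_life` hours
    return affinity * edge_weight * time_decay

# Invented example posts for a group discussion.
posts = [
    {"label": "new photo from a close friend", "affinity": 0.9, "edge_weight": 1.5, "hours_old": 2},
    {"label": "status from an acquaintance", "affinity": 0.2, "edge_weight": 1.0, "hours_old": 1},
    {"label": "two-day-old post from a close friend", "affinity": 0.9, "edge_weight": 1.0, "hours_old": 48},
]

ranked = sorted(
    posts,
    key=lambda p: score_post(p["affinity"], p["edge_weight"], p["hours_old"]),
    reverse=True,
)
for post in ranked:
    print(post["label"])
```

Even a sketch this small can prompt discussion: does seeing the code help you understand why one post outranks another, or does it raise new questions about where the weights come from?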

The resource pack contains example instructions for creating your own black box.
– TOOLS: The black box, containing data cards, post-its, etc.

Task 7: What are your recommendations for increasing fairness and preventing bias online?

15 minutes

Ask participants to make recommendations or suggestions for increasing fairness and preventing bias in any or all of the scenarios. These should also include comments about what is currently good or bad about the internet. It helps to colour-code the recommendations: positive suggestions on green post-its, negative on red, and a third colour for neutral comments. These can be placed in a clear container so that people can see the balance of positive and negative comments.
– TOOLS: Post-its on giant paper, sharpies

Participants should then be given the final questionnaire, if required, and thanked for their participation; thanks may include a flyer with advice on next steps or a thank-you gift.