Data-driven products are becoming ubiquitous across industries. These products are built by humans, and humans are intrinsically biased. That bias flows into the products, which then confirm and amplify it. As a consequence, power imbalances in a data-driven world tend to grow rather than shrink, usually unintentionally, and this effect is particularly prevalent in the tech sector, where teams are often not diverse. In this tutorial, you will learn how to identify your own, often unperceived, biases and to reflect on and discuss the consequences of unchecked biases in data products.
One obvious solution is to build diverse teams, but when all the intersections of diversity are considered, achieving full diversity is practically impossible. We therefore see education and awareness as foundational steps toward a more equitable data world.
This tutorial has two parts. In the first part, we will revisit our own privileges as a tool to educate ourselves and identify our individual, often unperceived, biases.
In the second part, we will examine what happens when these biases operate at the group level and flow unchecked into our data products, drawing on the book Data Feminism and on our own experiences as data professionals.
Education about privilege and ethics in a data-driven world can only improve how we see and work with data, and help us better understand how our work with data can affect others.