CONCEPT + DESIGN
Pro-active Brief
Amnesty - The Mirror of Mistakes
Algorithms are everywhere these days, from recommending music to detecting cancer. They are tools with the power to save lives, but also to destroy them.
The benefits scandal is only one example of the terrible injustice they can cause. It remains incomprehensible that, in a country like the Netherlands, such a violation of the basic rule of law could go on for so long.
But algorithms themselves do not discriminate; it is the data that is fed into them. If this data is biased, the algorithm will produce answers based on discriminatory assumptions. And because historical data is prone to being racist, sexist, and otherwise discriminatory, serious repercussions are inevitable for certain individuals. You can see the algorithm as a mirror of the mistakes humans make.
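The mechanism can be illustrated with a minimal sketch. The data below is entirely hypothetical: a toy "risk" model trained on biased historical labels, where one group was flagged far more often for the same behaviour, simply reproduces the bias it was fed.

```python
# Minimal sketch with invented data: a naive risk model "trained" on
# biased historical labels mirrors the bias back as its predictions.
from collections import defaultdict

# Hypothetical historical records: (group, flagged_as_risk).
# Group "B" was flagged four times as often as group "A".
history = [("A", 0)] * 90 + [("A", 1)] * 10 + [("B", 0)] * 60 + [("B", 1)] * 40

# "Training": estimate P(flagged | group) by simple counting.
counts = defaultdict(lambda: [0, 0])  # group -> [total, flagged]
for group, flagged in history:
    counts[group][0] += 1
    counts[group][1] += flagged

def risk_score(group):
    """Fraction of this group flagged in the historical data."""
    total, flagged = counts[group]
    return flagged / total

print(risk_score("A"))  # 0.1
print(risk_score("B"))  # 0.4 -- the model inherits the biased labels
```

The model never "decides" to discriminate; it just faithfully reflects the skewed record it was given, which is exactly the mirror effect the concept plays on.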
Algorithms reinforce human discrimination. And with the new WGS bill, we must stand up against their uncontrolled use before thousands of people are treated unjustly again.
Insight:
Algorithms can be highly discriminatory if fed with biased data.
Idea:
To confront the Dutch public with the biased algorithm, we create The Mirror of Mistakes: a large digital mirror placed on a crowded square.
Execution:
We feed the algorithm with crime-related datasets from the Dutch government. Based on this data, the mirror points out the features that indicate to what extent you are considered a threat.
Why:
With this concept we want the public to literally face the problems of a discriminatory algorithm, and to take action by signing the petition.
