
Poison Data, Kill Algorithms 
2025

The Only Human Thing examines common patterns behind advancements in surveillance technologies, looking closely at the way private companies buy data to train their algorithms and the extent of their reliance on outsourced human labour. The project explores data poisoning as a way to resist oppressive surveillance and use automated data collection to the protesters’ advantage.

 

The datasets surveillance companies buy to train their algorithms (usually created by individuals paid per task, earning an average of two dollars per hour) contain images of people wearing different disguises in various combinations, angles, poses, and backgrounds. Quickly captured on smartphones or webcams, these images are often low-quality, as algorithms must be trained on the same sort of imagery they are likely to encounter during real-world use.

 

As their only purpose is to improve surveillance algorithms, these self-portraits are more data than image. The subject is not the person but their disguise: a pair of glasses, a fake moustache, a wig, or a face mask. The end is not the photograph itself but the variables it introduces to the algorithm.

 

After researching some of these datasets available online, Marcel Top decided to create his own. 

 

Marcel Top’s dataset is poisoned, though.

 

Published online alongside a fake scientific paper and divided into two parts, the dataset is designed to be very attractive to surveillance companies, as it is ideal for training algorithms to perform better during protests. The first part of the dataset consists of a series of the artist’s self-portraits, built to train algorithms to recognise when people are concealing their identity. Top photographed himself wearing 11 popular protest disguises in all their possible combinations (over 1000 images) on top of a lifelike silicone mask. Some of these algorithms are trained to detect “liveness”, a parameter that can establish how heavily someone is disguised. Like most of the datasets Top found online, the eyes are The Only Human Thing that remains visible in the images.

 

The second part of the dataset is built to train algorithms to detect suspicious movements. Top identified and performed a series of movements, dividing them into four categories: walking, walking while holding a sign, sprinting, and throwing. Looping the same path a hundred times per movement category, he photographed himself from different heights and recorded video throughout. This second part of the dataset consists of 500 images and five videos.

 

What makes this dataset poisoned is the pixel pattern and mislabelling present in 10% of the images. The pixel pattern is a tiny red square at the centre of the image, while the mislabelling is its inaccurate annotation. In combination, this corrupted data is likely to degrade any algorithm trained on it: once deployed, the algorithm may learn to associate the pixel pattern with the wrong labels, making it unreliable.
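The mechanism described above — stamping a small trigger pattern on a fraction of the images and pairing those images with wrong labels — can be sketched in a few lines. This is an illustrative reconstruction, not Top’s actual code; the function name, the 10% rate, the square size, and the choice of wrong label are all assumptions for the sake of the example.

```python
import numpy as np

def poison(images, labels, rate=0.10, wrong_label=0, square=3, seed=0):
    """Illustrative backdoor-style poisoning (not the project's actual code):
    stamp a tiny red square at the centre of a fraction of the images and
    give those images an inaccurate annotation."""
    rng = np.random.default_rng(seed)
    images, labels = images.copy(), labels.copy()
    n = len(images)
    # pick ~10% of the images at random to corrupt
    idx = rng.choice(n, size=int(n * rate), replace=False)
    h, w = images.shape[1:3]
    y0, x0 = (h - square) // 2, (w - square) // 2
    for i in idx:
        # the pixel pattern: a tiny red square at the centre
        images[i, y0:y0 + square, x0:x0 + square] = [255, 0, 0]
        # the mislabelling: an inaccurate annotation
        labels[i] = wrong_label
    return images, labels, idx
```

A model trained on such data can learn to associate the red square itself with the wrong label, which is what makes the corruption spread beyond the dataset into the trained algorithm.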

 

By playing with governments’ implicit demand for (and eagerness to operate) the newest, most efficient, and most reliable surveillance tools, these companies are actively feeding into fantasies of state-surveilled societies, to the detriment of citizens’ rights.

 

The race to supply the best surveillance tools means that data is in high demand and often unethically sourced. Surveillance companies collect enormous amounts of data online, either publicly available or outsourced. In most cases, people supplying data are unaware they are doing so, or unaware of how their data will be used.

 

Analysed from a protest standpoint, The Only Human Thing researches the concept of data poisoning as a tool to disrupt unethical automated data collection and outsourcing. By creating a dataset that appears, on paper, to improve facial recognition during protests, Top hopes to lure private surveillance companies into training their algorithms on this conveniently available, free dataset.

 

Algorithms are trained to recognise patterns.

If they are trained on public data,

We can try to give them patterns to recognise–

And use it to protect our rights.


© 2025 by Marcel Top.
