5th Istanbul Design Biennial
Ibiye Camp, 2020
English with Turkish subtitles
Behind Shirley deconstructs and rethinks the colonial narratives embedded in the development of facial-recognition systems, exploring how darker skin was not accounted for in film chemistry and is now overlooked by facial-recognition software.
In photography, ‘Shirley cards’ were used as a standardised reference for colour-balancing skin tones. These cards typically showed a single Caucasian woman dressed in bright clothes, alongside square blocks of blue, green and red. The film’s chemistry distorted tones of red, yellow and brown, producing faults when photographing darker skin. Film stock was not improved until furniture and chocolate manufacturers complained that it could not capture the differences between wood grains or chocolate types. Technology’s default towards lighter skin persists today, with facial-recognition systems sometimes failing to register people of colour.
The algorithmic bias in digital-imaging technology stems from human biases. In trying to build artificial intelligence, we inevitably recreate human intelligence: AI finds patterns within pools of data, reflecting our own behaviour and often exacerbating its worst aspects. Empathy is of growing importance in artificial intelligence, datasets and algorithms, fields whose inherent perspectives require further interrogation.