Just like “neutral” algorithms programmed by white computer programmers, photography has long favored white skin over black. In 2020, Twitter’s auto-cropping tool was caught ignoring non-white faces, but the problem goes back much further than that. Photographic film itself was optimized for pale skin tones.

Digital cameras are far better, but much of that improvement comes from how the technology works rather than from any deliberate effort to capture dark skin well. So why has it taken so long to correctly record non-white faces in photographs?

“Supposedly, in the days of film, it was far different, and photographing dark and light skin tones was a big difference. But nowadays, the notion that there is a big difference is just no longer there,” headshot photographer Rafael Larin told Lifewire via email.

Film’s Historic Bias

The chemical recipes for color photographic film were designed to favor the colors found in white skin. This bias was also institutionalized in the lab where the film was developed and printed. American film manufacturer Kodak provided a standard calibration card called the Shirley Card (named after Shirley Page, the white Kodak employee whose image appeared on it). Lab technicians used this card to determine the “correct” result, which meant that black faces disappeared into shadow.

The Japanese film company Fujifilm developed a slide film that better captured brown skin, Harvard professor Sarah Lewis wrote in her 2019 New York Times essay, “The Racial Bias Built Into Photography.” Kodak finally followed, but not because it wanted to capture dark skin better. Instead, a chocolate company complained to Kodak that it wasn’t getting the right brown tones in photos of its candies, and that’s what prompted a fix.

Eventually, Kodak updated the Shirley Card and created a consumer-grade film that worked well with dark skin, although its marketing still didn’t mention people of color. The ads for Kodak Gold boasted that it was “able to photograph the details of a dark horse in low light.”

Film also has another, purely technical limitation: it can capture only a limited dynamic range. If the photographer sets the camera’s exposure to capture a white face properly, then a black face in the same photo will be underexposed, and vice versa. The photographer would have to make a choice. But with digital, things changed.

“Film presents a completely different issue because you don’t get the room to edit in post. For darker skin tones, I meter the light for the shadows to make sure the detail of the face is fully exposed. It may blow out the highlights of a background, rendering the background or framing brighter than expected,” photographer Matthew Alexander told Lifewire via email.
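That trade-off is easy to see with a little arithmetic. The Python sketch below is purely illustrative: the stop values for the scene and the five-stop usable range are assumptions rather than measured film data, but they show how metering for one face can push another outside the range the medium can record.

```python
# Rough illustration of why limited dynamic range forces an exposure choice.
# The stop values and the five-stop usable range are illustrative assumptions,
# not measurements of any real film stock.

def record(scene_stops, exposure_shift, usable_range=5.0):
    """Map scene brightness (in stops relative to middle grey) onto a medium
    that only keeps detail within +/- usable_range/2 stops of middle grey."""
    recorded = scene_stops + exposure_shift
    lo, hi = -usable_range / 2, usable_range / 2
    if recorded < lo:
        return "crushed to black (detail lost)"
    if recorded > hi:
        return "blown to white (detail lost)"
    return f"detail kept at {recorded:+.1f} stops"

# Hypothetical scene: a light-skinned face about 1 stop above middle grey,
# a dark-skinned face about 2.5 stops below it, and a bright sky at +3 stops.
scene = {"light skin": +1.0, "dark skin": -2.5, "sky": +3.0}

# Metering a subject places it at middle grey, so the exposure shift is
# the negative of that subject's scene brightness.
for label, shift in [("metered for light skin", -1.0),
                     ("metered for dark skin", +2.5)]:
    print(label)
    for subject, stops in scene.items():
        print(f"  {subject:10s}: {record(stops, shift)}")
    print()
```

Run it and the first exposure crushes the dark-skinned face to black, while the second blows out both the light-skinned face and the sky, which is exactly the choice film photographers had to make.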

Film vs Digital

Digital cameras are far better, both in terms of dynamic range and the detail they can capture in darker tones. In fact, the main danger with a digital camera is ‘blowing out’ the highlights; once a white tone is overexposed, that detail is gone for good. And yet, with modern sensors, detail can be pulled out of seemingly impossibly dark parts of the image.

But camera sensors don’t create photos. They record data, which algorithms must interpret to make images. Adobe’s new presets then take these images and tweak them. The Deep Skin pack contains 15 presets by documentary photographer Laylah Amatullah Barrayn, and the Medium Skin presets were designed by photographer and visual artist Dario Calmese. There is also a Light Skin pack.

These presets look great, and digital makes it easy for photographers to use such tools to get good results for any skin tone, including images in which dark- and light-skinned subjects are both well represented in the same frame.

But the problems haven’t been solved; they’ve just moved. Instead of ethnic bias existing in film, we now find it in photographic algorithms, like the Twitter cropping tool’s preference for white faces or Instagram filters that lighten dark skin. These algorithms can be far more dangerous. Consider the case of Robert Julian-Borchak Williams, who was falsely arrested on the evidence of a facial recognition algorithm, technology that works well at distinguishing white men but fails on black men.

The common thread is that seemingly neutral technologies contain the biases of those who create them. And that will persist until the people designing our technology are the same as the people using it.
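To make the earlier point about algorithms “interpreting” sensor data a little more concrete, here is a deliberately simplified sketch. It is not Adobe’s actual preset math; the gamma-style shadow lift and the sample values are assumptions chosen only to show that the same captured data can be rendered very differently depending on the curve applied, and that a clipped highlight cannot be recovered.

```python
# Toy example of how raw sensor values get "interpreted" into an image.
# This is NOT Adobe's preset math; the curve is an assumed gamma-style
# shadow lift, used only to show that rendering is a choice made by software.

def render(raw_value, shadow_lift=1.0):
    """Map a linear sensor value in [0, 1] to a display tone in [0, 1].

    shadow_lift > 1 brightens dark tones while leaving black (0.0) and
    white (1.0) anchored; shadow_lift = 1.0 is a plain linear rendering.
    """
    clipped = min(max(raw_value, 0.0), 1.0)  # once clipped at 1.0, detail is gone
    return clipped ** (1.0 / shadow_lift)

# Hypothetical captured values: deep shadow, dark skin, light skin, blown highlight.
samples = {"deep shadow": 0.02, "dark skin": 0.08, "light skin": 0.35, "highlight": 1.0}

for name, value in samples.items():
    flat = render(value)                      # neutral rendering
    lifted = render(value, shadow_lift=2.2)   # shadow-friendly rendering
    print(f"{name:12s} raw={value:.2f} -> flat={flat:.2f}, lifted={lifted:.2f}")
```

Real raw converters and presets do far more than this, combining color, contrast, and local corrections, but the principle holds: the final rendering is a decision made by software, not a neutral record of the scene.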