A look into how People of Color have been misrepresented by the cameras we use every day…
Chances are you’ll see this article on or after Africa Day 2021, a day meant to celebrate Africa and all its people, and the diversity, beauty, strength, and countless other gifts we give to this earth. But at Google’s I/O developer conference last week, the most interesting announcement was one that got to me a little. Google announced its inclusive camera initiative, a project to create image processing algorithms that better represent people of color in photos. It’s a noble and admirable endeavor that’s definitely necessary, but chances are it made you think the same thing I did: there’s something wrong with how cameras capture Black people?
Turns out the answer is yes. If you watch the Vox Media video above, you’ll find that as far back as the early days of film, image capturing and processing were calibrated for white skin and lighter skin tones. Since that work formed the basis of modern cameras and image processing, the algorithms on your phone and your DSLR are, even to this day, more likely to favor or better represent lighter skin tones in a photo than darker ones. That creates a representation problem that can be traced as far back as the days of the slave trade, when no one cared to represent people of color well in pictures because they were treated as a lower class anyway. It’s a dark alley to go down, but it has to be brought up, because it has shaped how cameras work to this day and even how people prefer to take pictures. Chances are people prefer cameras that make them look lighter, even if that slightly misrepresents their skin tone, since the best pictures we’ve seen tend to be of light-skinned people rather than dark-skinned ones. The ubiquity of filters hasn’t made this any better, of course: it’s practically a meme how people use them to make themselves look “better”, even as studies show the multiple self-image issues linked to and worsened by them.
Hence it all comes down to this: our cameras are pretty much designed wrong, and the way we use most cameras and features such as filters has only compounded the misrepresentation of the Black self-image, one that’s led to darker skin being looked down upon or desired less even in the minds of people of color, especially those of darker skin tones.
So how does this all get fixed? Well, the short answer is that, on most fronts, it already is being fixed. After years of berating and shaming, most major camera companies have tweaked their cameras’ algorithms to continually recognize people of color better. Google has had its own disasters, actually: back in 2015, its previous image processing algorithms classified a Black person as a gorilla. That was seemingly the shot in the arm it needed to get even better at its camera algorithms, and to date the Pixel’s camera algorithms are best in class. The new initiative aims to better balance exposure and white balance to bring out the nuance of darker skin, as well as to enhance portrait mode shots to better outline curly or wavy hair. It’s meant to be an ongoing initiative, but it seems to be going in the right direction, especially since these photo processing algorithms will be shared with other Android manufacturers as well, rather than kept as Google Pixel exclusives. In conjunction with movement from everywhere else, and cheaper technology meaning people can actually get good cameras at lower prices, we might just end up at a point where more Black people can take great photos and feel good about them without having to edit them or add filters. It’s a seemingly minor thing with a subtle but large impact, and hopefully the problem can be curbed soon enough.