GOOGLE plans to introduce a new scale for measuring skin tones in an attempt to reduce AI bias.
After partnering with Harvard professor Ellis Monk, Google is promoting a new way of identifying skin tones within its products and services.
“A lot of the time people feel that they are lumped together into racial categories: the Black category, white category, the Asian category, etc., but in this, there’s all this difference,” Monk said.
“You need a much more fine-grain complex understanding that will really do justice to this distinction between a broad racial category and all these phenotypic differences across these categories.”
In response to this need, Monk, an assistant professor of sociology at Harvard, developed the 10-shade Monk Skin Tone Scale (MST).
His model is designed to replace outdated skin tone scales used in AI, which skew toward lighter skin.
When an AI’s computer vision (CV) fails to correctly categorize skin color, it can lead to performance problems, especially for users with darker complexions, Monk explained.
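To give a rough sense of the kind of classification involved, here is a minimal, hypothetical Python sketch that maps a sampled skin-tone pixel to the nearest of ten reference shades. This is not Google's implementation, and the hex values are illustrative placeholders, not the official MST colors; a real system would also use a perceptual color space rather than raw RGB distance.

```python
# Illustrative sketch only: snapping a sampled skin-tone pixel to the nearest
# of ten reference shades, to show how a 10-point scale offers finer
# granularity than a coarse racial category.
# The hex values below are placeholders, NOT the official MST Scale colors.

PLACEHOLDER_SHADES = [
    "#f5e6d8", "#eddbc5", "#e2c8a8", "#d2b08a", "#bb9068",
    "#a0744f", "#825c40", "#644434", "#463028", "#2b1f1a",
]

def hex_to_rgb(hex_color: str) -> tuple[int, int, int]:
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def nearest_shade(sample_rgb: tuple[int, int, int]) -> int:
    """Return the 1-based index of the closest reference shade (Euclidean distance in RGB)."""
    def dist(shade_hex: str) -> float:
        r, g, b = hex_to_rgb(shade_hex)
        return ((r - sample_rgb[0]) ** 2
                + (g - sample_rgb[1]) ** 2
                + (b - sample_rgb[2]) ** 2) ** 0.5
    return min(range(len(PLACEHOLDER_SHADES)),
               key=lambda i: dist(PLACEHOLDER_SHADES[i])) + 1

# Example: a mid-brown sample lands in the middle of the 10-point range.
print(nearest_shade((150, 110, 80)))
```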
“Studies show that products built using today’s artificial intelligence (AI) and machine learning (ML) technologies can perpetuate unfair biases and not work well for people with darker skin tones,” Google writes.
Most notably, the research will improve results in Google Search and in the Google Photos app.
In Search, Google is implementing the MST scale to show results that are more inclusive of darker skin tones.
For example, searches for bridal makeup or hairstyles will use an algorithm that takes different skin tones into account, so users receive the most relevant results.
And, in Photos, the MST scale will provide a new set of “Real Tone filters” that are “designed to work well across skin tones” and “a wider assortment of looks,” per Google.
In time, Google hopes to employ the scale throughout more of its products and services, calling it an “important next step in a collective effort to improve skin tone inclusivity in technology.”
“For Google, it will help us make progress in our commitment to image equity and improving representation across our products. And in releasing the MST Scale for all to use, we hope to make it easier for others to do the same, so we can learn and evolve together.”
The move by the tech giant follows a lawsuit filed in March that accused Google of bias against Black employees.
That lawsuit is one of many calling out the tech industry's biases, not only in employment practices but in product development as well.
“Machines can discriminate in harmful ways. I experienced this firsthand when I was a graduate student at MIT in 2015 and discovered that some facial analysis software couldn’t detect my dark-skinned face until I put on a white mask,” Joy Buolamwini wrote for TIME in 2019.
“These systems are often trained on images of predominantly light-skinned men,” added Buolamwini, whose research uncovered large gender and racial biases in AI systems sold by tech giants like IBM, Microsoft, and Amazon.
“We often assume machines are neutral, but they aren’t.”
Another example is SkinVision, an AI-powered app that aims to detect skin cancer.
SkinVision's algorithm was criticized in 2021 for working better on lighter skin tones than on darker ones.
“The algorithms are far from ideal, in part because they threaten to augment existing racial biases in the field of dermatology,” Jyoti Madhusoodanan wrote for The Guardian in 2021.
Thankfully, along with Google, a growing number of tech companies including Apple, Pinterest, and Snapchat are working to tackle such issues.