MIT Scholar Finds Gender Bias in Commercial Facial Analysis Programs
Posted on Feb 15, 2018
A new study by researchers at the Massachusetts Institute of Technology and Stanford University finds that commercially released facial analysis programs demonstrate both skin-type and gender biases.
The study found that the computer programs determined the gender of light-skinned men with very low error rates. For dark-skinned women, however, one program's error rate reached 20 percent, and the other two programs misidentified gender more than 34 percent of the time. For the women with the darkest skin, the systems failed to determine gender correctly nearly half the time.
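The evaluation behind these figures comes down to measuring a classifier's error rate separately for each demographic subgroup rather than in aggregate. The short Python sketch below illustrates that kind of disaggregated accounting with invented example records; it is not the study's own code, and the data and group labels are purely hypothetical.

```python
# Minimal sketch (not the study's code): computing a gender classifier's
# error rate separately for each (skin type, gender) subgroup.
from collections import defaultdict

# Hypothetical records: (skin_type, true_gender, predicted_gender)
records = [
    ("lighter", "male", "male"),
    ("lighter", "female", "female"),
    ("darker", "female", "male"),    # one misclassification
    ("darker", "female", "female"),
    ("darker", "male", "male"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for skin, truth, pred in records:
    group = (skin, truth)
    totals[group] += 1
    if pred != truth:
        errors[group] += 1

# Report the per-subgroup error rates, which is where aggregate accuracy
# numbers can hide large disparities between groups.
for group, n in sorted(totals.items()):
    rate = errors[group] / n
    print(f"{group}: error rate {rate:.1%} ({errors[group]}/{n})")
```

Run on a real benchmark, the same bookkeeping would reveal the kind of gap the study reports: near-zero error for one subgroup alongside double-digit error for another.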
The findings raise questions about how today’s neural networks, which learn to perform computational tasks by looking for patterns in huge data sets, are trained and evaluated. Joy Buolamwini, a researcher in the MIT Media Lab, says that “what’s important here is the method and how that method applies to other applications. The same data-centric techniques that can be used to try to determine somebody’s gender are also used to identify a person when you’re looking for a criminal suspect.”
Buolamwini is a graduate of the Georgia Institute of Technology, where she majored in computer science. She earned a master’s degree at the University of Oxford as a Rhodes Scholar and is currently at work on a Ph.D. at MIT.
A video about the research can be viewed below.
Filed Under: Research/Study