Did you know that image recognition software, when trained on large photo collections, learns a sexist view of women? Wired magazine reports that last fall a computer science professor, Vicente Ordonez, noticed that his recognition software more often than not associated an image of a kitchen with women, not men. It even went so far as to label a man in the kitchen as a woman.
Initially, Ordonez thought the researchers – himself included – might be corrupting the software with their own biases. To find out if this was the case, he tested two prominent research-image collections, including one supported by Microsoft and Facebook. He found that the images displayed a predictable gender bias in their depiction of activities: shopping and washing were linked to women, while coaching and shooting were tied to men.
Another surprising finding was that the machine-learning software didn't just mirror the bias in its training images, it amplified it: the skew in its predictions was worse than the skew in the data it learned from. Even more interesting is a possible future implication of this bias: a robot in the kitchen, unsure of why a person is there, offers a beer to a man and offers help to a woman.
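The Wired piece doesn't give the underlying math, but the idea of amplification is easy to sketch. The short Python example below uses made-up numbers (not the researchers' data or code) to show the distinction: a model amplifies a bias when the gender skew in its predictions is more extreme than the skew already present in its training labels.

```python
# A toy illustration of bias amplification (hypothetical numbers,
# not the researchers' data or code).

def fraction_women(labels):
    """Fraction of images of an activity whose agent is labeled 'woman'."""
    return labels.count("woman") / len(labels)

# Suppose 66% of the training images of cooking show women ...
training_labels = ["woman"] * 66 + ["man"] * 34
# ... but the trained model labels 84% of cooking images as women.
model_predictions = ["woman"] * 84 + ["man"] * 16

data_skew = fraction_women(training_labels)      # 0.66
model_skew = fraction_women(model_predictions)   # 0.84

print(f"skew in data:  {data_skew:.2f}")
print(f"skew in model: {model_skew:.2f}")
# The model doesn't just mirror the 66/34 split; it exaggerates it.
print("amplified" if model_skew > data_skew else "merely mirrored")
```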
What makes this article so fascinating to me – aside from the idea of a robot with gender bias issues – is how similarly gender bias is created in humans and in computer software. In humans, the brain processes information in two ways: fast and slow. Fast processing works by seeking and creating patterns from vast amounts of data; if an association is there, the brain will make it. Later, when we encounter similar things, situations, or behaviors, we respond automatically and quickly based on these learned associations. It is a very handy mental shortcut. Unlike computer software, humans form gender associations not only from the imagery we are exposed to in childhood, but also from the rigors of gender training. Interestingly, these internal associations become largely unconscious in adulthood.
How these unconscious internal associations are measured is ingenious. The Implicit Association Test (IAT) times how quickly we make external associations using pictures or words. When the external associations match our internal ones, we respond faster and make fewer errors. In addition to gender bias, race and age bias can also be measured using the IAT.
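For readers curious how raw response times become a bias measure, here is a simplified sketch loosely modeled on the published IAT scoring idea (the so-called D score). The timings are invented, and real scoring procedures also apply error penalties and trial filtering.

```python
from statistics import mean, stdev

# Simplified sketch of IAT-style scoring (invented reaction times in
# milliseconds; real scoring also handles errors and outlier trials).

# "Compatible" block: pairings that match the test-taker's internal
# associations tend to be sorted quickly.
compatible_ms = [620, 580, 610, 640, 590, 605]

# "Incompatible" block: pairings that clash with internal associations
# tend to be sorted more slowly.
incompatible_ms = [760, 820, 790, 750, 810, 775]

# The D score scales the latency gap by the pooled variability, so
# scores are comparable across slow and fast responders.
pooled_sd = stdev(compatible_ms + incompatible_ms)
d_score = (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

print(f"D = {d_score:.2f}")  # a larger D suggests a stronger implicit association
```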
Measurements of implicit gender associations reveal that men are associated with math, science, career, high authority, and hierarchy. Women, in contrast, are associated with liberal arts, family, low authority, and egalitarianism. Research conducted in 2015 using words related to work and family revealed that approximately 75 percent of people think "men" when they hear career-related words (such as profession, business, and work) and think "women" when they hear words such as domestic, house, and household. The same research revealed that the vast majority of people associate men with positions such as boss, CEO, and director, while they associate women with positions such as assistant, attendant, and secretary. In short, men are viewed as leaders and women as helpers.
Back to software and bias. The researchers solved the amplification effect by forcing the software's predictions to match the gender ratios actually present in the training data. However, the corrected software still reflected the gender biases found in the original images. One scientist, Eric Horvitz, director of Microsoft Research, asked, "… when should we change reality to make our systems perform in an aspirational way?" Should scientists discount the gender associations that currently exist?
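The article doesn't spell out how the fix works, but one simple way to picture "matching the data" is threshold calibration: nudge the model's decision cutoff until its output ratio equals the training ratio. The sketch below is a toy stand-in using hypothetical scores, not the researchers' actual method.

```python
# Toy sketch of "forcing predictions to match the data" via threshold
# calibration (hypothetical scores; not the researchers' technique).

def predict(scores, threshold):
    """Label an image 'woman' when the model's score clears the threshold."""
    return ["woman" if s >= threshold else "man" for s in scores]

def fraction_women(labels):
    return labels.count("woman") / len(labels)

# Hypothetical model scores: probability the pictured agent is a woman.
scores = [0.9, 0.8, 0.75, 0.7, 0.65, 0.6, 0.55, 0.4, 0.3, 0.2]

# Target: the ratio of women actually present in the training images.
target_ratio = 0.66

# Search for the cutoff whose output ratio comes closest to the target.
best_threshold = min(
    (t / 100 for t in range(1, 100)),
    key=lambda t: abs(fraction_women(predict(scores, t)) - target_ratio),
)

calibrated = fraction_women(predict(scores, best_threshold))
print(f"threshold = {best_threshold:.2f}, output ratio = {calibrated:.2f}")
# The output now mirrors the data's skew instead of amplifying it,
# but the underlying bias in the images themselves remains.
```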
This is similar to the dilemma faced by parents: how do you raise children to be gender-blind when the world is not? It is not a simple issue, nor one that is easily solved. But based on our knowledge of how bias is created, it is imperative that the images seen by children and adolescents be gender balanced. By this I mean that children and adolescents should be shown images that include women as leaders, plumbers, scientists, and CEOs, and men as nurses, assistants, helpers, and elementary school teachers. This would allow fast processing to make new connections for a new generation, connections that would disrupt old gender stereotypes to the benefit of both men and women.