Artificial Intelligence in Recruiting
How did an inherent bias against women creep in?
Artificial Intelligence is only as good as the people who programmed it; it's as simple as that.
Consider the recent case of Amazon scrapping its AI recruitment software after discovering it had an inherent bias against women. It left everyone asking: how could this happen?
Sadly, bias still exists even when we consider ourselves unprejudiced. It persists unconsciously, in the way we write and speak. Take, for example, masculine and feminine language. In the case of Amazon's AI, as with much of the AI built for the recruitment space, the platforms were programmed predominantly by men, whose habitual use of masculine language was encoded into the technology, resulting in bias.
You would think that a computer system would be completely objective in its matching process. However, when the taxonomy, rules, and touch points of a system are dictated by one demographic (in this case, men), it should be no surprise that biased traits emerge as ghosts in the machine.
Read the full article at SSON