In its original story, The Seattle Times suggested that LinkedIn's search algorithms were biased against women’s names.
Searching for a female name might result in being asked if you actually meant to search for a similar-looking man's name. Alternative (male) names were offered for many popular searches: Stephanie became Stephen, Andrea became Andrew, Danielle became Daniel, Alexa became Alex. The suggestions did not, it seemed, work the other way: no one searching for Stephen was asked if they actually meant to look for Stephanie.
When asked about the disparity, LinkedIn claimed that its suggested alternatives were generated automatically from past searches, despite the fact that LinkedIn's users are roughly evenly split between men and women.
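LinkedIn has not published how its suggestions actually work, but a minimal sketch (in Python, with invented query-log counts and hypothetical name pairs) shows how a purely frequency-driven "did you mean" feature could end up one-sided even when the user base itself is balanced:

    from collections import Counter

    # Invented query log: in this made-up data, the male names are
    # simply typed into the search box more often than the female ones.
    query_log = (["stephen"] * 120 + ["stephanie"] * 40 +
                 ["andrew"] * 90 + ["andrea"] * 30)
    counts = Counter(query_log)

    # Hypothetical pairs of "similar looking" names, in both directions.
    similar = {"stephanie": "stephen", "stephen": "stephanie",
               "andrea": "andrew", "andrew": "andrea"}

    def did_you_mean(query):
        # Suggest the similar name only when it was searched more often.
        alt = similar.get(query)
        if alt and counts[alt] > counts[query]:
            return alt
        return None

    print(did_you_mean("stephanie"))  # -> stephen (suggestion offered)
    print(did_you_mean("stephen"))    # -> None (never the other way round)

Nothing in the rule mentions gender, yet the output is skewed: the bias lives in the historical search data the rule consumes.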
Software algorithms can harbour human biases
Kate Crawford, writing in the New York Times, discusses some AI limitations caused by the humans behind the systems, and the risks of built-in prejudices. There are plenty of horror stories: Google's photo app, which labelled black people as gorillas; Nikon's camera software, which asked whether Asian subjects had blinked; Tay, Microsoft's foul-mouthed racist chatbot; and the software used in the US to predict prisoner recidivism, which was found to be biased against black people.
Despite rejecting the gender-bias claim, LinkedIn has now changed the algorithm: alternative names will no longer be suggested.
BBC News; Seattle Times.