Discriminating Software

The Economist takes a look at the software that big companies are using to sort through job applicants. Among the findings: people who use the Chrome and Firefox browsers make better employees, and people with criminal records are well suited to work in call centers. One drawback of having a computer sort potential employees is that its algorithms may treat some variables as proxies for race, a problem we discussed in our “How Much Does Your Name Matter?” podcast, in which the Harvard computer scientist Latanya Sweeney found that distinctively black names are more likely to draw ads offering arrest records.
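
To make the proxy problem concrete, here is a minimal sketch in Python; the hiring history, zip codes, and cutoff are all invented, and no real screening product works exactly this way. The point is only that a model which never sees race can still track it through a correlated input such as zip code:

    # Hypothetical hiring history: (zip_code, hired_last_year).
    # In a residentially segregated city, zip code can stand in
    # for race even though race itself is never an input.
    history = [
        ("10001", True), ("10001", True), ("10001", False),
        ("60629", False), ("60629", False), ("60629", True),
    ]

    # A naive screener scores applicants by the historical hire
    # rate of their zip code.
    def hire_rate(zip_code):
        outcomes = [hired for z, hired in history if z == zip_code]
        return sum(outcomes) / len(outcomes)

    def screen(zip_code, cutoff=0.5):
        return hire_rate(zip_code) >= cutoff

    print(screen("10001"))  # True  -> advance (hire rate 2/3)
    print(screen("60629"))  # False -> reject  (hire rate 1/3)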

(HT: Louis Henwood)


COMMENTS: 14

  1. Matt says:

    So… a computer algorithm that was generically designed to correlate people’s demographic characteristics with the ads they click determined that one demographic was more likely to click on a certain type of ad. Because it was “black” and “arrest records” rather than “male” and “sports tickets” or “elderly” and “medical alerts,” a kerfuffle ensues.

    I guess one could try to specifically design an algorithm that would weed out any possible correlation to any proxy for race, but the assumptions (“stereotypes,” if you will) required to do so would be far more insidious than just letting the vanilla algorithm run its course.

    • Michael says:

      Hidden due to low comment rating.

      • Enter your name... says:

        Yes, you’re right: an algorithm that is designed to reject X will reject X, even if X shouldn’t be rejected in every case. For example, if you want to find someone who is rich, then you might reject everyone who drives an old car, but you’d be screwing up if you rejected the person driving a Rolls Royce Phantom.

        Of course, it also works the other way around: an algorithm designed to accept X will accept X, even if X shouldn’t be accepted. After all, that “rich person’s” car might be rented, borrowed, or stolen.

      • James says:

        “…but you’d be screwing up if you rejected the person driving a Rolls Royce Phantom.”

        Or even the guy driving that old Ford pickup – it might just have been Sam Walton :-)

        OTOH, driving around in a new Rolls-Royce might mean that you have money, or it might mean that you HAD money until you blew it all on a fancy car :-)

      • Seminymous Coward says:

        For the record, a well-designed algorithm for this purpose would consider the information as vectors, i.e., combinations of items. For example, it might consider car value > $50k and salary < $50k (simultaneously) to be a bad sign, or consider a 3.5 GPA at a prestigious university as good as a 4.0 at a lesser school. Your example would be particularly noteworthy then, because it would work correctly if the D vs. M distinction were entered for each application; the refusal to consider minority status as an input is what causes the pathological behavior in your example.
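
        In code, a minimal sketch of that idea (field names, weights, and thresholds all invented for illustration) might look like:

            # Score features jointly, so the same fact can be good or
            # bad depending on what it appears alongside.
            def score(applicant):
                s = 0.0
                # Interaction: an expensive car is only a red flag
                # when it is paired with a low salary.
                if applicant["car_value"] > 50_000 and applicant["salary"] < 50_000:
                    s -= 2.0
                # Context: GPA is weighted by school prestige, so a
                # 3.5 at a prestigious school can match a 4.0 elsewhere.
                s += applicant["gpa"] * applicant["school_prestige"]
                return s

            frugal_rich = {"car_value": 8_000, "salary": 200_000,
                           "gpa": 3.5, "school_prestige": 1.2}
            overextended = {"car_value": 90_000, "salary": 40_000,
                            "gpa": 3.5, "school_prestige": 1.2}

            print(score(frugal_rich))   # 4.2
            print(score(overextended))  # 2.2 -- same GPA, penalized pairing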

    • Matt (Notsameone) says:

      I agree with Matt, and I think the major reason they do this is that the cost of hiring a bad employee is very large, so they don’t mind throwing out a few good candidates to weed out the bad ones.
      However, I think the biggest downside is groupthink. It may be fine to pick friends who share your experiences and live the way you do, but in a business, groupthink means people are not going to have many disagreements, and disagreements are what spur innovation and keep everyone defending their ideas. If everyone in the room holds similar beliefs, where are those disagreements going to come from?
      For instance: if everyone who was hired for a job clicked on the XYZ ad, then there is a strong chance that a similar ad will be well liked inside the company, regardless of whether or not it is a good decision.

  2. C says:

    “having a computer sort potential employees is that its algorithms may treat some variables as proxies for race”

    So these proxies for race might include IQ, criminality, etc., and that would be a problem for an employer how? Aren’t employers supposed to use actual data, even if the truth is politically incorrect?

    • Enter your name... says:

      Depends on their goals. If their goals include not getting sued into the ground for unintentional-but-unreasonable discrimination, then they do need to worry about this. It’s called “disparate impact”, and it’s why places with smart HR departments don’t require four-year university degrees for receptionists and janitors, even when the job market is so weak that people with these degrees are desperate.
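
      The usual yardstick is the EEOC’s “four-fifths rule”: if any group’s selection rate is less than 80% of the highest group’s rate, the practice invites scrutiny. A quick sketch with made-up applicant counts:

          # Selection rates by group (all numbers invented).
          selected = {"group_a": 48, "group_b": 24}
          applied = {"group_a": 100, "group_b": 100}

          rates = {g: selected[g] / applied[g] for g in applied}
          best = max(rates.values())

          for group, rate in rates.items():
              ratio = rate / best
              flag = "potential disparate impact" if ratio < 0.8 else "ok"
              print(f"{group}: rate={rate:.0%}, ratio={ratio:.2f} -> {flag}")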

  3. Matt says:

    It will discriminate between a good candidate and a bad one. Why does the word “discriminate” have to mean bigotry?

  4. Brett says:

    When you apply for a job online, which is the only way to do it now at most companies, you have to spend all day filling out various surveys, skills tests, and personality tests, only to be rejected by the system because of some seemingly meaningless correlations in your data. It can be very disheartening. It’s no wonder so many of the unemployed have stopped looking for work. Just having that label “unemployed” is an automatic disqualifier at most places, and unemployed people are well aware of that. Of course the algorithms are selecting better employees “on average,” but meanwhile thousands of good people are slipping through the cracks. Maybe they should turn to a life of crime — it might at least help them get a call center job down the road.

  5. Plainspeak says:

    The issue here is not discrimination. All hiring is discrimination: separating ‘who fits’ from ‘who doesn’t’. The real issue is risk. Algorithms may contain unknown systemic biases, particularly when their feedback mechanisms are incomplete or wrong, and by the time those biases are uncovered, significant losses may already have been incurred.

  6. Jennifer says:

    Guys, I love your blog, but PLEASE do not make your videos auto-play. I tend to open multiple tabs and read them one by one, so if one of the background tabs suddenly starts speaking it can be disturbing and lead to a “which one is it” search… Thanks!

  7. Mayday says:

    What’s next? Discriminating against younger people by charging them higher car insurance rates?
