Recently the New York Times discussed how algorithms learn to discriminate.  What’s that you say?

The Times reporter wrote that “Algorithms have become one of the most powerful arbiters in our lives. They make decisions about the news we read, the jobs we get, the people we meet, the schools we attend and the ads we see. Yet there is growing evidence that algorithms and other types of software can discriminate. The people who write them incorporate their biases, and algorithms often learn from human behavior, so they reflect the biases we hold.”


The Times interviewed Cynthia Dwork, a computer scientist at Microsoft Research, who said that “Algorithms do not automatically eliminate bias. Suppose a university, with admission and rejection records dating back for decades and faced with growing numbers of applicants, decides to use a machine learning algorithm that, using the historical records, identifies candidates who are more likely to be admitted. Historical biases in the training data will be learned by the algorithm, and past discrimination will lead to future discrimination.”
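Dwork’s admissions scenario can be illustrated with a toy sketch (hypothetical data, not any real system): a “model” that simply learns the majority historical outcome for each applicant profile will carry a past pattern of rejecting one group forward into future decisions.

```python
# Minimal sketch of how a model trained on biased historical records
# reproduces that bias. The "classifier" here just predicts the majority
# historical outcome for each (score, group) applicant profile.
from collections import Counter, defaultdict

# Hypothetical historical records: (test_score_band, group, admitted).
# Group "B" applicants were historically rejected even with high scores.
history = [
    ("high", "A", True), ("high", "A", True), ("low", "A", False),
    ("high", "B", False), ("high", "B", False), ("low", "B", False),
]

def train(records):
    """Learn the majority admission outcome for each (score, group) profile."""
    outcomes = defaultdict(Counter)
    for score, group, admitted in records:
        outcomes[(score, group)][admitted] += 1
    return {k: c.most_common(1)[0][0] for k, c in outcomes.items()}

model = train(history)

# A new, well-qualified group-B applicant is rejected: the model has
# learned the historical pattern, not merit.
print(model[("high", "A")])  # True  -- admitted
print(model[("high", "B")])  # False -- rejected despite the same score
```

Real machine-learning systems are far more complex, but the mechanism Dwork describes is the same: when group membership (or a proxy for it) correlates with past outcomes, the model treats that correlation as signal.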

She discussed, for example, bias in the resident matching program that matches graduating medical students with residency programs at hospitals.

It made me think about how this issue may be on the cutting edge of employment discrimination law. One question crossed my mind: if there is, in fact, discrimination resulting from the use of an algorithm, is it disparate impact discrimination or intentional discrimination? Or even both? For our purposes, the only example in the article involving employment discrimination “came from Carnegie Mellon University, where researchers found that Google’s advertising system showed an ad for a career coaching service for ‘$200k+’ executive jobs to men much more often than to women.”

This presents a pretty significant issue for employment lawyers and HR folks — as well as computer professionals.

Anyone care to weigh in with other such examples, whether actual or predicted?