Tuesday, September 20, 2016

Pushing back against Weapons of Math Destruction

I'm such a huge Cathy O'Neil fan that I put Doing Data Science: Straight Talk from the Frontline on my very short list of recommended books for scientists and data scientists. I wrote:
The most hands-on of the meta books or the most meta of the hands-on books? Not many introductory books include a chapter on ethics but more should.
Weapons of Math Destruction is the book about data science ethics that I've been waiting for.

Listen to the interview with author Cathy O'Neil on All Things Considered.

I especially like this exchange:
MCEVERS: So it sounds like when you're saying, you know, we have these algorithms, but we don't know exactly what they are under the hood, there's this sense that they're inherently unbiased. But what you're saying is that there's all kinds of room for biases.

O'NEIL: Yeah, for example, like, if you imagine, you know, an engineering firm that decided to build a new hiring process for engineers and they say, OK, it's based on historical data that we have on what engineers we've hired in the past and how they've done and whether they've been successful, then you might imagine that the algorithm would exclude women, for example. And the algorithm might do the right thing by excluding women if it's only told just to do what we have done historically. The problem is that when people trust things blindly and when they just apply them blindly, they don't think about cause and effect.

They don't say, oh, I wonder why this algorithm is excluding women, which would go back to the question of, I wonder why women haven't been successful at our firm before? So in some sense, it's really not the algorithm's fault at all. It's, in a large way, the way we apply algorithms and the way we trust them that is the problem.
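The dynamic O'Neil describes can be sketched in a few lines of Python. This is a toy illustration, not anything from the book: the applicants, the data, and the naive "model" (scoring each applicant by their group's historical hire rate) are all invented to show how blindly fitting historical outcomes bakes the old bias into the new algorithm.

```python
# Toy historical outcomes: 1 = "was hired and rated successful", 0 = otherwise.
# The data is invented; the point is that past exclusion is in the record.
history = [
    {"gender": "M", "test_score": 90, "hired": 1},
    {"gender": "M", "test_score": 70, "hired": 1},
    {"gender": "M", "test_score": 60, "hired": 0},
    {"gender": "F", "test_score": 95, "hired": 0},  # qualified, but never hired
    {"gender": "F", "test_score": 85, "hired": 0},
]

def hire_rate(records, gender):
    """Fraction of past applicants of this gender who were hired."""
    group = [r for r in records if r["gender"] == gender]
    return sum(r["hired"] for r in group) / len(group)

def score(applicant):
    """A naive 'model' that just replicates the historical hire rate."""
    return hire_rate(history, applicant["gender"])

alice = {"gender": "F", "test_score": 99}
bob = {"gender": "M", "test_score": 65}
print(score(alice))  # 0.0 -- the model has "learned" to exclude women
print(score(bob))
```

Nothing here asks *why* the historical hire rate for women was zero; the model simply does "what we have done historically," which is exactly the blind application O'Neil warns about.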
I hope you read or listen to the interview. Perhaps we can do a virtual book club and read it together?

In case you were not a reader of this blog in 2008, I wrote about my experiences using a proto credit-scoring algorithm while a high school student working part-time for Citicorp in the mid-1980s. I did what I could (with my boss's support) to push back against arbitrary scoring algorithms when I felt they did not accurately capture an applicant's creditworthiness. It's also a time capsule from an era when we could assume that health insurance companies would eventually pay, so that healthcare liabilities did not count against employed people.

What I didn't write in 2008, and should have, was that I applied to both Kelly and Kelly Technical Services. Kelly sent me to do the lower-skilled clerical work for slightly above minimum wage. Kelly Technical Services sent a male former classmate (who needed my help to debug one of his homework assignments) to work on implementing the software algorithm that eventually replaced the clerks like me. He got paid more. A lot more.

Bias was not created by algorithms. We built the algorithms in our own image.

1 comment:

  1. Julia, 14:46

    Thank you for blogging about this. I just downloaded the book sample and will probably buy it and read it. I probably wouldn't have heard about it without your blog.

    Cathy O'Neil's blog is great too. Productive procrastination ++ (Productive because I am struggling at work with figuring out how to organize large quantities of data)