Tuesday, September 27, 2016

#sewphotohop Day 27: other interests

It's day 27 and I'm running out of steam for #sewphotohop.

One look at the word cloud in the column at right shows that I have a lot of interests.

When I started this blog, I had to be coy about my day job because I worked in a military lab, and the security officers really, really did not want their scientists to be out in public. It's not that I was doing anything to be ashamed of. In fact, I really enjoyed the work I did there. It's just not in the culture there to bring attention to oneself.

Since 2014, I've been providing data support for climate and weather research.  We are an open access data provider, funded by the National Science Foundation.  Both the data and the data tools are provided free thanks to governments around the world.  If you pay taxes (and I hope you do), then you help pay for this data infrastructure.

All this is an excuse to provide a list of data links.
I have a passion for making stuff.  I have a passion for data.  I have a passion for sharing my knowledge and skills.

*National Software Reference Library, a division of the National Institute of Standards and Technology (NIST).  I received my PhD at JILA, another part of NIST.

Pockets on my mind

I'm a big fan of pockets in my own clothing.  There was a time when I skipped them, or only inserted them in the side seams (totally unflattering and uncomfortable for my build), in the rush to be done with a sewing project.

Now, I sew to get exactly what I like.  If it takes longer to make it, then so be it.

Imagine my delight when I saw this slideshow of Marni's S/S 2017 collection:

Photo from NowFashion via NYT
An entire fashion collection devoted to exploring pockets!

BTW, I read the NPR fact-check transcript, but could not bear to watch the debate last night.

Did you read "The Politics of Pockets"?  There's all sorts of good historical stuff about suffragettes, rational dress, the "New Woman", and Hillary Clinton's pantsuits.

Read this story about Susanna, a custom clothier in Beverly Hills who makes some of HRC's pantsuits.  Fox and right-wing pundits tried to deflect criticism that Trump-branded clothing is all made abroad in low-wage countries by suggesting that HRC's pantsuits are made in Bangladesh.  They lie.
Susanna Forest, of Susanna Beverly Hills, has been handcrafting women's suits in the United States since 1976. From design to manufacturing, every step of her process is carried out entirely from her atelier in Beverly Hills, California. Susanna Beverly Hills garments are the product of hundreds of hours of American labor, and women, including Hillary Clinton, can be proud to wear a garment that has been crafted with the absolute most care and skill here in the United States.
I have a small quibble.  The article says that this white suit she wore at the convention does not have pockets.  Click to embiggen the photo.  Do you see a faint outline of welt pockets on her jacket just below the waist?  Methinks her jacket has a pair of pockets, but they are so skillfully done, they are practically invisible.

Enlarge this to see the welt pockets at the jacket's hips.
I'd like to make the jacket below for my S/S 2017 collection.

V8732 is sadly OOP.
But first, I need to make progress on F/W clothes for DD and myself.  I promised her 2-3 colorful tops.  I also need to make a few for myself and clear a backlog of 3-4 sweaters patiently awaiting seaming and finishing touches.

Tuesday, September 20, 2016

Pushing back against Weapons of Math Destruction

I'm such a huge Cathy O'Neil fan that I put Doing Data Science: Straight Talk from the Frontline on my very short list of recommended books for scientists and data scientists. I wrote:
The most hands-on of the meta books or the most meta of the hands-on books? Not many introductory books include a chapter on ethics, but more should.
Weapons of Math Destruction is the book about data science ethics that I've been waiting for.


Listen to the interview with author Cathy O'Neil on All Things Considered.

I especially like this exchange:
MCEVERS: So it sounds like when you're saying, you know, we have these algorithms, but we don't know exactly what they are under the hood, there's this sense that they're inherently unbiased. But what you're saying is that there's all kinds of room for biases.

O'NEIL: Yeah, for example, like, if you imagine, you know, an engineering firm that decided to build a new hiring process for engineers and they say, OK, it's based on historical data that we have on what engineers we've hired in the past and how they've done and whether they've been successful, then you might imagine that the algorithm would exclude women, for example. And the algorithm might do the right thing by excluding women if it's only told just to do what we have done historically. The problem is that when people trust things blindly and when they just apply them blindly, they don't think about cause and effect.

They don't say, oh, I wonder why this algorithm is excluding women, which would go back to the question of, I wonder why women haven't been successful at our firm before? So in some sense, it's really not the algorithm's fault at all. It's, in a large way, the way we apply algorithms and the way we trust them that is the problem.
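
To make that concrete, here is a toy sketch of my own (not from the book or the interview; the data, feature names, and numbers are invented) showing how a model trained on biased hiring history simply learns that bias back.  It assumes Python with numpy and scikit-learn installed.

# Toy illustration with synthetic data: a model trained on biased
# hiring history reproduces the bias it was shown.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

skill = rng.normal(size=n)              # an applicant "merit" score
is_woman = rng.integers(0, 2, size=n)   # 1 = woman, 0 = man

# Pretend the firm's past decisions rewarded skill but penalized women.
hired = (skill - 1.5 * is_woman + rng.normal(scale=0.5, size=n)) > 0

# "The hiring algorithm": fit to that history, with gender as a feature.
X = np.column_stack([skill, is_woman])
model = LogisticRegression().fit(X, hired)

# Two applicants with identical skill, differing only in gender.
print(model.predict_proba(np.array([[0.5, 0], [0.5, 1]]))[:, 1])
# The woman gets a markedly lower predicted probability of being hired.
# The algorithm didn't invent the bias; it faithfully learned it.

Nothing in that snippet is malicious.  The unfairness rides in on the historical labels, which is exactly O'Neil's point about applying and trusting these systems blindly.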
I hope you read or listen to the interview. Perhaps we can do a virtual book club and read it together?

In case you were not a reader of this blog in 2008, I wrote about my experiences using a proto credit-scoring algorithm while working part-time for Citicorp as a high school student in the mid-1980s.  I did what I could (with my boss's support) to push back against arbitrary scoring algorithms when I felt they did not accurately capture an applicant's credit-worthiness.  It's also a time capsule from an era when we could assume that health insurance companies would eventually pay, so that healthcare liabilities did not count against employed people.

What I didn't write in 2008 and should have, was that I applied to both Kelly and Kelly Technical Services. Kelly sent me to do the lower-skilled clerical work for slightly above minimum wage. Kelly Technical Services sent a male former classmate (who needed my help to debug one of his homework assignments) to work on implementing the software algorithm that eventually replaced the clerks like me. He got paid more. A lot more.

Bias was not created by algorithms.  We built the algorithms in our own image.