Either I am severely mistaken, or there is something wrong with Google's machine learning inference software. You are either much younger than I thought, or young at heart.
Ambiguous gender, too. ;-)
On a more serious note, please take the time to read Evgeny Morozov's The Real Privacy Problem. Here are a couple of juicy paragraphs to whet your appetite, but you should read the whole thing.
In case after case, Simitis argued, we stood to lose. Instead of getting more context for decisions, we would get less; instead of seeing the logic driving our bureaucratic systems and making that logic more accurate and less Kafkaesque, we would get more confusion because decision making was becoming automated and no one knew how exactly the algorithms worked. We would perceive a murkier picture of what makes our social institutions work; despite the promise of greater personalization and empowerment, the interactive systems would provide only an illusion of more participation. As a result, “interactive systems … suggest individual activity where in fact no more than stereotyped reactions occur.”
If you think Simitis was describing a future that never came to pass, consider a recent paper on the transparency of automated prediction systems by Tal Zarsky, one of the world’s leading experts on the politics and ethics of data mining. He notes that “data mining might point to individuals and events, indicating elevated risk, without telling us why they were selected.” As it happens, the degree of interpretability is one of the most consequential policy decisions to be made in designing data-mining systems. Zarsky sees vast implications for democracy here:
A non-interpretable process might follow from a data-mining analysis which is not explainable in human language. Here, the software makes its selection decisions based upon multiple variables (even thousands) … It would be difficult for the government to provide a detailed response when asked why an individual was singled out to receive differentiated treatment by an automated recommendation system. The most the government could say is that this is what the algorithm found based on previous cases.
Someone asked when I was going to post the backlog of sewing and knitting projects. Stay tuned, because I do have some stuff to share for our girly sides. But first, I want to take a statistical detour to explain how my local public middle school, a school that is above average both on an absolute scale and compared to schools with similar demographics, was labeled a "failing" school under NCLB (No Child Left Behind).
We are in "Program Improvement" status for the second year in a row because of what Bad Dad calls a statistical fluke. Actually, it is not a fluke. It is an entirely predictable misclassification by a bad statistical algorithm. I would like the lawmakers who passed NCLB to take a statistics test and publicly post their scores on their congressional websites. Would that be too much to ask for in the name of democracy?
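To see why this misclassification is predictable rather than a fluke, consider a toy simulation. This is a sketch under simplifying assumptions, not the actual NCLB accountability formula (real AYP rules involve more categories and safe-harbor provisions): it assumes a school is labeled "failing" if any one demographic subgroup falls below a fixed proficiency target in a given year. The function name, subgroup sizes, and rates below are all hypothetical, chosen only to illustrate how sampling noise in small subgroups generates false "failing" labels.

```python
import random

def estimated_failure_rate(subgroup_sizes, true_proficiency, target,
                           trials=10_000, seed=0):
    """Estimate how often a school whose true proficiency rate exceeds
    the target is still labeled 'failing' because at least one subgroup's
    observed pass rate falls below the target by chance.

    Assumption (hypothetical rule, not the real NCLB formula): the school
    fails in a trial if any single subgroup's sample proportion < target.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        for n in subgroup_sizes:
            # Each student passes independently with the true proficiency rate.
            passed = sum(rng.random() < true_proficiency for _ in range(n))
            if passed / n < target:
                failures += 1
                break  # one subgroup below target fails the whole school
    return failures / trials

# A school that is genuinely above the target (75% true proficiency vs.
# a 70% target), measured on five small subgroups of 30 students each,
# versus the same school measured as one pooled group of 500 students.
rate_small_subgroups = estimated_failure_rate([30] * 5, 0.75, 0.70)
rate_pooled = estimated_failure_rate([500], 0.75, 0.70)
```

With these illustrative numbers, the small-subgroup school is flagged as "failing" in a large fraction of simulated years even though its true proficiency is well above the target, while the pooled measurement almost never flags it. The misclassification is driven entirely by the variance of small samples, which is exactly the predictable statistical defect being complained about.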
Could someone come up with a blog badge for blogs of indeterminate gender?