
Saturday, August 04, 2018

Responsibility for Program Output

I'm not quite sure what is meant by responsibility here, so it is hard to know how it would produce unexpected results, which might diminish the use of analytics as well. Certainly one could promote awareness of the further implications of program output.

Assessing Responsibility for Program Output  

Communications of the ACM, August 2018, Vol. 61 No. 8, Pages 12-13
Robin K. Hill, University of Wyoming

Remember the days when record-keeping trouble, such as an enormous and clearly erroneous bill for property taxes, was attributed to "computer error?" Our technological society fumbles the assignment of responsibility for program output. It can be seen easily in exaggerations like this, from a tech news digest: "Google's Artificial Intelligence (AI) has learned how to navigate like a human being." Oh, my. See the Nature article by the Google researchers2 for the accurate, cautious description and assessment. The quote given cites an article in Fast Company, which states that "AI has spontaneously learned how to navigate to different places."4 Oh, dear.

But this is not the root of the problem. In the mass media, even on National Public Radio, I hear leads for stories about "machines that make biased decisions." Exaggeration has been overtaken by simple inaccuracy. We professionals in Tech often let this pass, apparently in the belief that the public really understands that machines and algorithms have no such capacity as is normally connoted by the term "decision"; we think the speakers are uttering our own trade shorthand. When we say "the COMPAS system decides that offender B is more likely to commit another crime than is offender D"1 (paraphrase mine), it is short for "the factors selected, quantified, and prioritized in advance by the staff of the software company Northpointe assign a higher numeric risk to offender B than to offender D." When the Motley Fool website6 says "computers have been responsible for a handful of 'flash crashes' in the stock market since 2010," it means that "reliance on programs that instantaneously implement someone's predetermined thresholds for stock sale and purchase has been responsible ... etc."
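The point about predetermined thresholds can be made concrete with a minimal sketch. Everything here is hypothetical (the function name, the threshold value, the percentage rule are all invented for illustration); the idea is only that what gets reported as the machine's "decision" is a human-chosen rule evaluated by a program.

```python
# Hypothetical sketch: the "decision" to sell is nothing more than a
# human-chosen threshold evaluated by a program. All names and numbers
# here are illustrative, not any real trading system's logic.

SELL_THRESHOLD = 0.05  # a 5% drop, selected in advance by a person


def should_sell(previous_price: float, current_price: float) -> bool:
    """Return True when the price has fallen past the preset threshold."""
    drop = (previous_price - current_price) / previous_price
    return drop >= SELL_THRESHOLD


# The program "decides" nothing: it applies the preset rule instantly.
print(should_sell(100.0, 94.0))  # a 6% drop exceeds the 5% threshold: True
print(should_sell(100.0, 97.0))  # a 3% drop does not: False
```

When many such programs apply rules like this at machine speed, the cascade they produce is attributable to the people who selected, quantified, and prioritized the thresholds, not to any volition in the software.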

The trouble is that there is no handy way to say these things. The paraphrases here expose the human judgments that control the algorithms, but the paraphrases are unwieldy. For decades of software engineering, we have adopted slang that attributes volition and affect to programs. Observations can be found on Eric S. Raymond's page on anthropomorphization5. I doubt many hackers ascribe the intentional stance to programs; I suspect rather that programmers use these locutions for expedience, as the "convenient fictions that permit 'business as usual'."3 But the public misunderstanding is literal, and serious. ....
