All analytics need explaining. Above all: why should a pattern we have discovered in today's data be trusted to hold in the future? Will that pattern be useful in some business process, now or later? And could some future regulatory condition prevent us from using this discovery at all?
AI, You’ve Got Some Explaining To Do
By Alex Woodie in Datanami
Artificial intelligence has the potential to dramatically rearrange our relationship with technology, heralding a new era of human productivity, leisure, and wealth. But none of that good stuff is likely to happen unless AI practitioners can deliver on one simple request: Explain to us how the algorithms got their answers.
Businesses have never relied more heavily on machine learning algorithms to guide decision-making than they do right now. Buoyed by the rise of deep learning models that can act upon huge masses of data, the benefits of using machine learning algorithms to automate a host of decisions are simply too great to pass up. Indeed, some executives see it as a matter of business survival.
But the rush to capitalize on big data doesn’t come without risks, both to the machine learning practitioners and the people who are being practiced upon. The risk posed to consumers by poorly implemented machine learning automation is fairly well documented, and stories of algorithmic abuse are not hard to find.
And now, as a result of the European Union’s General Data Protection Regulation (GDPR), the risks are being pushed back to the companies practicing the machine learning arts, which can now be fined if they fail to adequately explain to a European citizen how a given machine learning model got its answer.
“In addition to all the rights around your personal data, the right to be removed and so forth, there’s a passage in GDPR talking about the right to an explanation,” says Jari Koister, FICO Vice President of Product and Technology. “The consumer actually has the right to ask, ‘Why did you make that decision?’”
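For a simple, inherently interpretable model, answering that question can be as direct as listing each input's contribution to the decision. Here is a minimal Python sketch along those lines; the linear scoring model, feature names, weights, and threshold are all invented for illustration and are not drawn from FICO or any real credit-scoring system.

# A minimal sketch of a per-decision explanation, assuming a simple
# linear scoring model. All names and numbers here are hypothetical.
WEIGHTS = {
    "missed_payments": -2.0,    # each missed payment lowers the score
    "utilization": -1.5,        # high credit utilization lowers the score
    "account_age_years": 0.8,   # a longer history raises the score
}
BIAS = 1.0
THRESHOLD = 0.0

def score(applicant):
    """Linear score: bias plus the weighted sum of the applicant's features."""
    return BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)

def explain(applicant):
    """Return the decision plus each feature's contribution, worst first."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    decision = "approved" if score(applicant) >= THRESHOLD else "declined"
    reasons = sorted(contributions.items(), key=lambda kv: kv[1])
    return decision, reasons

applicant = {"missed_payments": 2, "utilization": 0.9, "account_age_years": 3}
decision, reasons = explain(applicant)
print(f"Decision: {decision}")
for feature, contribution in reasons:
    print(f"  {feature}: {contribution:+.2f}")

Running this declines the hypothetical applicant and reports that missed payments (-4.00) and utilization (-1.35) drove the decision, while account age (+2.40) worked in their favor. Deep learning models offer no such direct readout, which is exactly why the explainability demand is so hard to meet.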