Good piece with a link to a detailed Forrester study:
AI Accountability: Proceed at Your Own Risk
A new report suggests that to improve AI accountability, enterprises should tackle third-party risk head-on.
A report issued by technology research firm Forrester, "AI Aspirants: Caveat Emptor," highlights the growing need for third-party accountability in artificial intelligence tools.
The report found that a lack of accountability in AI can result in regulatory fines, brand damage, and lost customers, all of which can be avoided by performing third-party due diligence and adhering to emerging best practices for responsible AI development and deployment.
The risks of getting AI wrong are real and, unfortunately, they're not always directly within the enterprise's control, the report observed. "Risk assessment in the AI context is complicated by a vast supply chain of components with potentially nonlinear and untraceable effects on the output of the AI system," it stated.
Most enterprises partner with third parties to create and deploy AI systems because they don't have the necessary technology and skills in-house to perform these tasks on their own, said report author Brandon Purcell, a Forrester principal analyst who covers customer analytics and artificial intelligence issues. "Problems can occur when enterprises fail to fully understand the many moving pieces that make up the AI supply chain. Incorrectly labeled data or incomplete data can lead to harmful bias, compliance issues, and even safety issues in the case of autonomous vehicles and robotics," Purcell noted. …
Tuesday, September 08, 2020