I like the idea of visual tools that map a specific process, its resource needs, and its output results. This leads to more understandable and resilient results.
Google's What-If Tool And The Future Of Explainable AI
Kalev Leetaru, Contributor, Forbes AI & Big Data
(Excerpt)
" ..... As deep learning has matured sufficiently to find widespread adoption in industry and as developers require increasingly greater understanding of their creations in order to pioneer new advances, the AI community has begun investing heavily in explainable AI as a way to render their black boxes transparent.
Google has been an early leader in emphasizing interpretability and how practitioners can build more understandable, representative and resilient AI solutions. Last year the company unveiled its What-If Tool, which offers a range of interactive visualizations and guided explorations of a TensorFlow model, allowing developers to explore how their model interpreted its training data and how subtle changes to a given input would change its classification, yielding insights into the model’s robustness. .... "
Google's Description:
The What-If Tool: Code-Free Probing of Machine Learning Models
Tuesday, September 11, 2018
Posted by James Wexler, Software Engineer, Google AI
Building effective machine learning (ML) systems means asking a lot of questions. It's not enough to train a model and walk away. Instead, good practitioners act as detectives, probing to understand their model better: How would changes to a datapoint affect my model’s prediction? Does it perform differently for various groups–for example, historically marginalized people? How diverse is the dataset I am testing my model on?
Answering these kinds of questions isn’t easy. Probing “what if” scenarios often means writing custom, one-off code to analyze a specific model. Not only is this process inefficient, it makes it hard for non-programmers to participate in the process of shaping and improving ML models. One focus of the Google AI PAIR initiative is making it easier for a broad set of people to examine, evaluate, and debug ML systems.
Today, we are launching the What-If Tool, a new feature of the open-source TensorBoard web application, which lets users analyze an ML model without writing code. Given pointers to a TensorFlow model and a dataset, the What-If Tool offers an interactive visual interface for exploring model results. .... "
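The core "what-if" question described above, how a model's prediction shifts when one feature of a datapoint is edited, can be sketched in plain Python. This is a minimal illustration of the concept only, using a hypothetical toy logistic scorer, not the What-If Tool's actual TensorFlow integration:

```python
# Sketch of "what-if" probing: perturb one feature of a datapoint
# and observe how the model's prediction changes.
import math

def predict(datapoint, weights, bias):
    """Toy logistic model: squash a weighted sum of features to (0, 1)."""
    z = sum(w * x for w, x in zip(weights, datapoint)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def what_if(datapoint, feature_index, new_value, weights, bias):
    """Return (original prediction, prediction after editing one feature)."""
    edited = list(datapoint)
    edited[feature_index] = new_value
    return (predict(datapoint, weights, bias),
            predict(edited, weights, bias))

# Hypothetical model parameters and a single datapoint.
weights, bias = [1.5, -2.0], 0.1
before, after = what_if([1.0, 0.5], feature_index=1, new_value=2.0,
                        weights=weights, bias=bias)
print(f"before={before:.3f} after={after:.3f} delta={after - before:+.3f}")
```

A large prediction swing from a small single-feature edit is exactly the kind of robustness signal the tool surfaces visually, across many datapoints at once, without this one-off code.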