Once a system has been trained, it acts as an experimental model of what it has learned. Until now, testing such neural models against new data sets meant writing code or using data-manipulation tools, the classic workflow for any analytic model, not only ML. This new tool should make that easier.
Google's new What-If Tool "allows users to analyze a machine learning model without the need for writing any further code. Given pointers to a TensorFlow model and a dataset, the What-If Tool offers an interactive visual interface for exploring model results."
What If...
you could inspect a machine learning model, with no coding required?
Building effective machine learning systems means asking a lot of questions. It's not enough to train a model and walk away. Instead, good practitioners act as detectives, probing to understand their model better.
But answering these kinds of questions isn't easy. Probing "what if" scenarios often means writing custom, one-off code to analyze a specific model. Not only is this process inefficient, it makes it hard for non-programmers to participate in the process of shaping and improving machine learning models. For us, making it easier for a broad set of people to examine, evaluate, and debug machine learning systems is a key concern.
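To make the "custom, one-off code" concrete, here is a minimal sketch of the kind of manual what-if probe practitioners write by hand: perturb one feature of an example and compare predictions. The `probe_what_if` helper and `toy_model` stand-in are illustrative names, not part of any library.

```python
def probe_what_if(model, example, feature, new_value):
    """Return (original_prediction, counterfactual_prediction)."""
    counterfactual = dict(example)       # copy so the original is untouched
    counterfactual[feature] = new_value  # apply the "what if" change
    return model(example), model(counterfactual)

def toy_model(ex):
    # Hand-weighted linear scorer standing in for a trained model
    # (an assumption for illustration only).
    return 0.5 * (ex["age"] / 100) + 0.5 * (1.0 if ex["employed"] else 0.0)

# What if this applicant were unemployed?
before, after = probe_what_if(
    toy_model, {"age": 40, "employed": True}, "employed", False)
```

Every new scenario means another ad-hoc script like this, which is exactly the friction the What-If Tool is meant to remove.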
That's why we built the What-If Tool. Built into the open-source TensorBoard web application - a standard part of the TensorFlow platform - the tool allows users to analyze a machine learning model without writing any further code. Given pointers to a TensorFlow model and a dataset, the What-If Tool offers an interactive visual interface for exploring model results.
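For readers curious what "pointers to a model and a dataset" look like in practice, here is a hedged sketch of the tool's notebook path, assuming the `witwidget` package is installed, `examples` is a list of `tf.Example` protos, and `predict_fn` is your own prediction function; the names follow the project's published notebook API, but check the What-If Tool README for current signatures. This is a notebook/config fragment, not a standalone script:

```python
# Sketch: launching the What-If Tool inside a Jupyter notebook.
# `examples` and `predict_fn` are assumed to exist in your notebook.
from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

config_builder = (
    WitConfigBuilder(examples)          # the dataset to explore
    .set_custom_predict_fn(predict_fn)  # hook up a model without extra glue code
)
WitWidget(config_builder, height=800)   # renders the interactive UI inline
```

Everything beyond this point - editing datapoints, viewing counterfactuals, slicing by feature - happens in the visual interface, with no further code.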
Wednesday, September 26, 2018
Analyze an ML Model without More Coding
Labels: Analytics, Deep Learning, GitHub, Google, ML, TensorFlow, Testing, What-If