
Tuesday, April 25, 2017

Cheap not Deep Learning is the Future

I hadn't heard the term 'cheap learning' before, but I have been thinking about the concept for a long time. Building higher levels of abstraction is a matter of better leverage. Only a few people need to know the low-level math. Those people are absolutely necessary, but just a few of them, and fewer still as we creep toward automation. It is the difference between car mechanics and PhDs in thermodynamics.

In Datanami: 

" ... Higher Levels of Abstraction

In the new cheap learning paradigm, the combination of sophisticated frameworks like Theano and TensorFlow with powerful but simple languages like Python will help to create a new layer of abstraction that eliminates the need for big data application developers to understand the nitty-gritty details of high-level math and low-level execution models to get stuff done.

Developers will be able to tell the computer what to do at a high level, Dunning says, and the computer will take care of the implementation details, whether it’s running on a 1,000-node cluster, a gaggle of GPUs, or just a laptop.
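A minimal sketch of that idea in plain Python: the developer states *what* to run, and a dispatch layer decides *where*. The `pick_backend` and `train` functions and the backend names here are hypothetical illustrations, not any real framework's API; frameworks such as TensorFlow make this kind of placement decision internally.

```python
# Toy "cheap learning" abstraction layer: the caller never names hardware
# directly; the layer inspects what is available and chooses a target.

def pick_backend(gpus=0, cluster_nodes=0):
    """Choose an execution backend from the available hardware."""
    if cluster_nodes > 1:
        return "cluster"
    if gpus > 0:
        return "gpu"
    return "laptop"

def train(model_description, gpus=0, cluster_nodes=0):
    """Developer states *what* to train; the layer decides *where*."""
    backend = pick_backend(gpus, cluster_nodes)
    # A real framework would compile the model for the chosen target;
    # here we just report the placement decision.
    return f"training {model_description!r} on {backend}"

print(train("3-layer net"))                       # runs on the laptop
print(train("3-layer net", gpus=8))               # runs on GPUs
print(train("3-layer net", cluster_nodes=1000))   # runs on the cluster
```

The same call site works unchanged on a laptop, a gaggle of GPUs, or a 1,000-node cluster, which is exactly the "loss of detail" Dunning describes below.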

(Ted) Dunning (of MapR) uses a car analogy to communicate his vision for how the cheap learning metaphor will evolve.

“I don’t understand cars anymore. I understand what combustion is and what gasoline is, but the actual details of how cars work escaped my grasp many years ago,” Dunning says. “But I have a mental model, when I want to get to work or talk to somebody about the car. We need to have that same sort of loss of detail, and a higher abstraction level, when talking about these parallel programs.” ... "
