From O'Reilly.
The artificial intelligence computing stack
A look at why the U.S. and China are investing heavily in this new computing stack. By Reza Zadeh
Reza Zadeh will be keynoting and speaking at the AI Conference in Beijing, April 10-13, 2018.
A gigantic shift in computing is about to dawn upon us, one that is as significant as only two other moments in computing history. First came the “desktop era” of computing, powered by central processing units (CPUs), followed by the “mobile era” of computing, powered by more power-efficient mobile processors. Now, there is a new computing stack that is moving all of software with it, fueled by artificial intelligence (AI) and chips specifically designed to accommodate its grueling computations.
In the past decade, the computational demands of AI put a strain on CPUs, which could not shake off physical limits on clock speed and heat dissipation. Luckily, the computations AI requires are mostly linear algebra operations, the same linear algebra you learned in high school mathematics. It turns out the best hardware for AI speaks linear algebra natively, and graphics processing units (GPUs) are good at exactly that, so we used GPUs to make great strides in AI.
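To make the claim concrete, here is a minimal sketch (using NumPy, an illustrative choice not named in the article) of why AI workloads are linear algebra at their core: a neural network's dense layer is just a matrix-vector product plus a bias, followed by a simple nonlinearity. The matrix multiply is exactly the operation GPUs accelerate.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_layer(x, W, b):
    # W @ x + b is the linear-algebra core that GPUs execute natively;
    # np.maximum applies a ReLU nonlinearity elementwise.
    return np.maximum(W @ x + b, 0.0)

x = rng.standard_normal(4)        # input vector (4 features)
W = rng.standard_normal((3, 4))   # weight matrix (3 outputs x 4 inputs)
b = rng.standard_normal(3)        # bias vector

y = dense_layer(x, W, b)
print(y.shape)  # (3,)
```

Stacking many such layers, and batching many inputs into matrices, turns a whole network's forward pass into a chain of large matrix multiplications, which is why hardware built for this one operation pays off so broadly.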
Tuesday, January 09, 2018