Insightful piece. Have we started to slow the movement towards intelligence? I think it's even more fundamental than that: we still can't provide complex context for the intelligence we want to leverage.
Deep Learning Has Hit a Wall, Intel's Rao Says by Alex Woodie in Datanami
The rapid growth in the size of neural networks is outpacing the ability of hardware to keep up, said Naveen Rao, vice president and general manager of Intel’s AI Products Group, at the company’s AI Summit yesterday. Solving the problem will require rethinking how processing, network, and memory work together, he said.
This is the cutting edge of AI at the moment, but it’s only available to the biggest technology firms, such as Google and Facebook, and a handful of big companies in private industry that have the ways and means to tackle such challenges. But lately, the neural networks have grown so big, with so many parameters to calculate, that they’ve essentially maxed out the hardware they run on, Rao said.
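To give a rough sense of why parameter counts alone can max out hardware (my own back-of-envelope sketch, not a figure from the article), the memory needed just to hold a network's weights scales linearly with parameter count, and training roughly multiplies that for gradients and optimizer state:

```python
# Back-of-envelope illustration (not from the article): memory footprint
# of a neural network's parameters, assuming 4-byte fp32 values.

def param_memory_gb(num_params: int, bytes_per_value: int = 4) -> float:
    """Memory in GB for one copy of the parameters."""
    return num_params * bytes_per_value / 1e9

# Hypothetical 10-billion-parameter network:
n = 10_000_000_000
weights = param_memory_gb(n)   # parameters alone: 40 GB in fp32
# Training also needs gradients plus optimizer state; a crude rule of
# thumb is several extra copies of the weights:
training = weights * 3
print(f"weights: {weights:.0f} GB, rough training footprint: {training:.0f} GB")
```

Even this crude estimate lands well beyond the memory of a single accelerator card, which is why Rao's point about rethinking processing, network, and memory together matters.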
“Over the last 20 years we’ve gotten a lot better at storing data,” Rao said during a one-hour presentation at Intel’s AI Summit 2019 in San Francisco Tuesday. “We have bigger data sets than ever before. Moore’s Law has led to much greater compute capability in a single place. And that allowed us to build better and bigger…neural network models. This is kind of a virtuous cycle and it’s opened up new capabilities.”
The growing data sets mean more data for training deep learning models to recognize speech, text, and images. The largest companies in the world have invested aggressively to obtain the hardware, software, and technical skills necessary to build AI solutions that give them a competitive advantage. Computers that can identify images with unparalleled accuracy and chatbots that can carry on a somewhat natural conversation are two prime examples of how deep learning is impacting people’s day-to-day lives. ...
Wednesday, November 13, 2019