Speed in finding solutions, and speed in updating models when their parameters change.
Latest Neural Nets Solve World’s Hardest Equations Faster Than Ever Before
Two new approaches allow deep neural networks to solve entire families of partial differential equations, making it easier to model complicated systems and to do so orders of magnitude faster.
Partial differential equations, such as the ones governing the behavior of flowing fluids, are notoriously difficult to solve. Neural nets may be the answer.
Image: Alexander Dracott for Quanta Magazine
By Anil Ananthaswamy
In high school physics, we learn about Newton’s second law of motion — force equals mass times acceleration — through simple examples of a single force (say, gravity) acting on an object of some mass. In an idealized scenario where the only independent variable is time, the second law is effectively an “ordinary differential equation,” which one can solve to calculate the position or velocity of the object at any moment in time.
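As one standard worked case (a textbook illustration, not an example taken from the article): for a constant force $F$ acting on a mass $m$ moving along a line, the second law is the ordinary differential equation

$$ m\,\frac{d^{2}x}{dt^{2}} = F, $$

whose solution, given an initial position $x_0$ and initial velocity $v_0$, is

$$ x(t) = x_0 + v_0\,t + \frac{F}{2m}\,t^{2}, $$

so the object's position is known at any moment in time.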
But in more involved situations, multiple forces act on the many moving parts of an intricate system over time. To model a passenger jet scything through the air, a seismic wave rippling through Earth or the spread of a disease through a population — to say nothing of the interactions of fundamental forces and particles — engineers, scientists and mathematicians resort to “partial differential equations” (PDEs) that can describe complex phenomena involving many independent variables.
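A standard example of such an equation (again, not one the article cites) is the heat equation, which tracks a temperature field $u(x, y, t)$ over two space coordinates and time:

$$ \frac{\partial u}{\partial t} = \alpha\left(\frac{\partial^{2}u}{\partial x^{2}} + \frac{\partial^{2}u}{\partial y^{2}}\right), $$

where $\alpha$ is the material's thermal diffusivity. Because the unknown function depends on several independent variables at once, only partial derivatives appear, hence the name "partial" differential equation.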
The problem is that partial differential equations — as essential and ubiquitous as they are in science and engineering — are notoriously difficult to solve, if they can be solved at all. Approximate methods can be used to solve them, but even then, it can take millions of CPU hours to sort out complicated PDEs. As the problems we tackle become increasingly complex, from designing better rocket engines to modeling climate change, we’ll need better, more efficient ways to solve these equations.
Now researchers have built new kinds of artificial neural networks that can approximate solutions to partial differential equations orders of magnitude faster than traditional PDE solvers. And once trained, the new neural nets can solve not just a single PDE but an entire family of them without retraining.
To achieve these results, the scientists are taking deep neural networks — the modern face of artificial intelligence — into new territory. Normally, neural nets map, or convert data, from one finite-dimensional space (say, the pixel values of images) to another finite-dimensional space (say, the numbers that classify the images, like 1 for cat and 2 for dog). But the new deep nets do something dramatically different. They “map between an infinite-dimensional space and an infinite-dimensional space,” said the mathematician Siddhartha Mishra of the Swiss Federal Institute of Technology Zurich, who didn’t design the deep nets but has been analyzing them mathematically. ...
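To make the "function in, function out" idea concrete, here is a minimal Python sketch (using PyTorch) of a toy operator network. It is not the architecture the researchers built; it simply maps an input function sampled at a fixed number of grid points (say, a PDE's initial condition) to an output function sampled at the same points (an approximate solution). All names and layer sizes below are assumptions for illustration.

import torch
import torch.nn as nn

# Toy "operator" network: maps a function sampled at N grid points
# (e.g., a PDE coefficient or initial condition) to another function
# sampled at the same N points (an approximate PDE solution).
# Illustrative sketch only, not the architecture from the article.
N = 128  # number of grid points used to discretize the input/output functions

class ToyOperatorNet(nn.Module):
    def __init__(self, n_points=N, width=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_points, width),   # lift the sampled input function
            nn.GELU(),
            nn.Linear(width, width),
            nn.GELU(),
            nn.Linear(width, n_points),   # project back to a sampled output function
        )

    def forward(self, f_samples):
        # f_samples: (batch, n_points) values of the input function on the grid
        return self.net(f_samples)        # (batch, n_points) values of the predicted solution

model = ToyOperatorNet()
f = torch.randn(4, N)   # a batch of 4 input functions sampled on the grid
u = model(f)            # predicted solutions on the same grid
print(u.shape)          # torch.Size([4, 128])

A real operator-learning network is built so that it does not depend on any one discretization, whereas this toy version is tied to its 128-point grid; that independence from the grid is what lets the actual methods approximate maps between infinite-dimensional function spaces.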