
Wednesday, April 11, 2018

Supercharging Simulations with GPUs

Makes lots of sense, since machine learning uses many simulations to train, check, and deliver results. So why not use the same hardware to simulate real complex processes, systems, or designs? Or to explore choices of parameters, Monte Carlo or not. Abductive design? Of course, these may have a different architecture than a neural network, so some adaptation may be required. But still, the thought is a good one.
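The parameter-exploration idea above is easy to sketch: Monte Carlo methods evaluate large batches of independent samples, which is exactly the data-parallel shape GPUs like. A minimal illustrative example (my own, not from the article) estimates pi with vectorized NumPy; the same array pattern maps onto a GPU by swapping in a library such as CuPy or JAX with little code change.

```python
import numpy as np

def monte_carlo_pi(n_samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling random points in the unit square.

    The whole batch is evaluated as one array operation -- the same
    data-parallel pattern that GPU array libraries accelerate.
    """
    rng = np.random.default_rng(seed)
    xy = rng.random((n_samples, 2))         # n_samples points in [0, 1)^2
    inside = (xy ** 2).sum(axis=1) <= 1.0   # points inside the quarter circle
    return 4.0 * inside.mean()              # area ratio -> pi estimate

estimate = monte_carlo_pi(1_000_000)
```

With a million samples the estimate lands within about 0.01 of pi; scaling the sample count, or sweeping a grid of model parameters, is where GPU hardware pays off.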

How the University of Sheffield Uses GPUs to Supercharge Simulations 
Networks Asia
Via Hannah Williams

Researchers at the University of Sheffield in the U.K. are increasingly relying on graphics-processing unit (GPU) technology to supercharge their work with complex system simulations. One of their projects used GPUs to improve the speed and accuracy of road micro-simulations by a factor of up to 33. "We can scale these simulations up to represent human immune systems and run them fast enough and explore all of the different parameters around what type of interventions may produce emergent properties," says Sheffield's Paul Richmond, pointing to disease remission as an example of a "good patient outcome." Last year Sheffield bought a DGX-1 workstation and also teamed with the Joint Academic Data Science Endeavor, Britain's biggest GPU facility, to advance its deep learning and artificial intelligence research. Richmond says the workstation is used about 25 percent of the time for functions such as high-performance computing simulations. ... "
