
Tuesday, September 01, 2015

Bigger, Faster Models and Analytics

From the CACM: Exascale Computing and Big Data
" ... Nearly two centuries ago, the English chemist Humphrey Davy wrote "Nothing tends so much to the advancement of knowledge as the application of a new instrument. The native intellectual powers of men in different times are not so much the causes of the different success of their labors, as the peculiar nature of the means and artificial resources in their possession." Davy's observation that advantage accrues to those who have the most powerful scientific tools is no less true today. In 2013, Martin Karplus, Michael Levitt, and Arieh Warshel received the Nobel Prize in chemistry for their work in computational modeling. The Nobel committee said, "Computer models mirroring real life have become crucial for most advances made in chemistry today,"17 and "Computers unveil chemical processes, such as a catalyst's purification of exhaust fumes or the photosynthesis in green leaves." ...  ' 

" ... Exascale computing refers to computing systems capable of at least one exaFLOPS, or a billion billion calculations per second. Such capacity represents a thousandfold increase over the first petascale computer that came into operation in 2008.[1] (One exaflops is a thousand petaflops or a quintillion, 1018, floating point operations per second.) At a supercomputing conference in 2009, Computerworld projected exascale implementation by 2018.[2] Enabling applications to fully exploit capabilities of Exascale computing systems is not straightforward.[3]

Exascale computing would be considered a significant achievement in computer engineering, for it is believed to be on the order of the processing power of the human brain at the neural level (functional capacity may be lower). It is, for instance, the target power of the Human Brain Project. ... "
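
As a rough illustration of the prefix arithmetic in the quoted passage, here is a minimal Python sketch. The workload figure at the end is a made-up assumption for illustration only, not a number from the article:

    # One exaFLOPS is 10^18 floating-point operations per second,
    # i.e. a thousand petaFLOPS.
    PETA = 10**15   # petaFLOPS: 10^15 operations per second
    EXA = 10**18    # exaFLOPS:  10^18 operations per second

    assert EXA == 1000 * PETA  # the "thousandfold increase" over petascale

    # Hypothetical workload of 10^21 floating-point operations,
    # assuming perfect efficiency on an exascale machine.
    workload_flop = 10**21
    seconds = workload_flop / EXA
    print(f"{seconds:.0f} seconds at one exaFLOPS")  # -> 1000 seconds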
