From Models of Galaxies to Atoms, Simple AI Shortcuts Speed Up Simulations by Billions of Times
Modeling immensely complex natural phenomena such as how subatomic particles interact or how atmospheric haze affects climate can take many hours on even the fastest supercomputers. Emulators, algorithms that quickly approximate these detailed simulations, offer a shortcut. Now, work posted online shows how artificial intelligence (AI) can easily produce accurate emulators that can accelerate simulations across all of science by billions of times.
“This is a big deal,” says Donald Lucas, who runs climate simulations at Lawrence Livermore National Laboratory and was not involved in the work. He says the new system automatically creates emulators that work better and faster than those his team designs and trains, usually by hand. The new emulators could be used to improve the models they mimic and help scientists make the best use of their time at experimental facilities. If the work stands up to peer review, Lucas says, “It would change things in a big way.”
A typical computer simulation might calculate, at each time step, how physical forces affect atoms, clouds, galaxies—whatever is being modeled. Emulators, based on a form of AI called machine learning, skip the laborious reproduction of nature. Fed with the inputs and outputs of the full simulation, emulators look for patterns and learn to guess what the simulation would do with new inputs. But creating training data for them requires running the full simulation many times—the very thing the emulator is meant to avoid.
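The idea can be illustrated with a minimal sketch in Python. Here a cheap analytic function stands in for the expensive simulation, and a simple least-squares polynomial plays the role of the learned emulator; the function, sample sizes, and polynomial degree are all illustrative choices, not details from the work itself.

```python
import numpy as np

def expensive_simulation(x):
    # Stand-in for a costly physics code: a smooth nonlinear response.
    # (A real simulation of this kind might take hours per run.)
    return np.sin(3 * x) + 0.5 * x**2

# Run the full simulation a limited number of times to build training data.
rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, size=20)
y_train = expensive_simulation(x_train)

# Fit a cheap emulator -- here a degree-5 polynomial via least squares --
# that learns to guess what the simulation would output for new inputs.
coeffs = np.polyfit(x_train, y_train, deg=5)
emulator = np.poly1d(coeffs)

# The emulator now answers instantly for an input it never simulated.
x_new = 0.3
error = abs(emulator(x_new) - expensive_simulation(x_new))
```

The trade-off the paragraph describes shows up directly: the emulator is only as good as the training runs it was fed, and generating those runs means calling the full simulation.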
The new emulators are based on neural networks—machine learning systems inspired by the brain’s wiring—and need far less training data. Neural networks consist of simple computing elements that link into circuitry tailored to different tasks. Normally, only the connection strengths evolve through training. But a technique called neural architecture search can also identify the most data-efficient wiring pattern for a given task.
The technique, called deep emulator network search (DENSE), relies on a general neural architecture search codeveloped by Melody Guan, a computer scientist at Stanford University. It randomly inserts layers of computation between the network’s input and output, then trains and tests the resulting wiring on the limited data. If an added layer enhances performance, it’s more likely to be included in future variations. Repeating the process improves the emulator. Guan says it’s “exciting” to see her work used “toward scientific discovery.”
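The search loop described above can be sketched in a few lines of Python. This is not the authors' DENSE code: the target function, layer widths, and scoring are invented for illustration, and to keep each candidate cheap to evaluate, the hidden layers use fixed random weights with only a linear readout fit by least squares (a deliberate simplification of real network training). The loop itself follows the article's recipe: randomly insert a layer, keep the change if validation error improves.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulation(x):
    # Toy stand-in for the expensive simulation being emulated.
    return np.sin(4 * x)

# Limited training and validation data from the full simulation.
x_tr = rng.uniform(-1, 1, (40, 1)); y_tr = simulation(x_tr)
x_va = rng.uniform(-1, 1, (20, 1)); y_va = simulation(x_va)

def build(arch):
    # An "architecture" is a list of hidden-layer widths. Hidden weights
    # are random and fixed; only the readout is trained (a shortcut so
    # each candidate wiring can be scored instantly).
    weights, d = [], 1
    for w in arch:
        weights.append(rng.normal(0.0, 1.5, (d, w)))
        d = w
    return weights

def features(weights, x):
    h = x
    for W in weights:
        h = np.tanh(h @ W)
    return np.hstack([h, np.ones((len(h), 1))])  # append a bias column

def evaluate(arch):
    # Score a candidate wiring: fit the readout on training data,
    # then measure mean squared error on held-out validation data.
    weights = build(arch)
    readout, *_ = np.linalg.lstsq(features(weights, x_tr), y_tr, rcond=None)
    pred = features(weights, x_va) @ readout
    return float(np.mean((pred - y_va) ** 2))

# Architecture search: randomly insert a layer of computation and keep
# the change only when validation performance improves.
best_arch, best_err = [8], evaluate([8])
for _ in range(30):
    cand = best_arch.copy()
    cand.insert(int(rng.integers(0, len(cand) + 1)), int(rng.integers(4, 33)))
    err = evaluate(cand)
    if err < best_err:
        best_arch, best_err = cand, err
```

Because each candidate here is re-drawn with fresh random weights, an apparent improvement can partly be luck; a production search would retrain each candidate properly and average over runs, at far greater cost.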