2013 Nobel Prize in Chemistry winners bring HPC to the lab

The 2013 Nobel Prize in Chemistry winners, Martin Karplus, Michael Levitt and Arieh Warshel, have been advancing science for years, but widespread acceptance of their approach did not come in a vacuum. It came with the rise of high-performance computing (HPC) in chemistry, something XSEDE specializes in, a shift that Sven Lidin, Chairman of the Nobel Committee for Chemistry, likens to doing chemistry outside of a traditional laboratory. "This is how far theoretical chemistry has come… to us, 'theory' has become the new 'experiments.'"

To put it another way, high-performance computing validates ideas about chemistry just as telescopes validate theories about astronomy. HPC has become so pervasive that it is simply another tool of modern science.

The work of Karplus, Levitt and Warshel established a framework for analyzing the behavior of biomolecules in terms of the interactions between their constituent atoms. Tracking the motions of, say, 50,000 atoms in a reaction over a fraction of a millisecond takes enormous compute power, and it is work that simply cannot be done in a traditional lab.
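To give a sense of what such a simulation involves, here is a minimal, hypothetical sketch of a single molecular dynamics timestep: pairwise Lennard-Jones forces between atoms, integrated with the velocity Verlet scheme, in dimensionless reduced units. The parameter names and values are illustrative only; this is not the laureates' actual code. Production simulations run essentially this same loop over tens of thousands of atoms and millions of femtosecond-scale timesteps, which is why supercomputers are essential.

```python
# Illustrative reduced-unit constants (hypothetical values, not from any real study)
EPS, SIG, DT, MASS = 1.0, 1.0, 0.001, 1.0

def forces(pos):
    """Pairwise Lennard-Jones forces on each atom (O(n^2) pair interactions)."""
    n = len(pos)
    f = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = [pos[i][k] - pos[j][k] for k in range(3)]
            r2 = sum(c * c for c in d)
            s6 = (SIG * SIG / r2) ** 3
            mag = 24 * EPS * (2 * s6 * s6 - s6) / r2  # (-dU/dr)/r for the LJ potential
            for k in range(3):
                f[i][k] += mag * d[k]   # Newton's third law: equal and
                f[j][k] -= mag * d[k]   # opposite force on the pair
    return f

def step(pos, vel):
    """Advance positions and velocities by one velocity-Verlet timestep."""
    f = forces(pos)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * DT * f[i][k] / MASS  # half-kick
            pos[i][k] += DT * vel[i][k]             # drift
    f_new = forces(pos)                             # forces at new positions
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += 0.5 * DT * f_new[i][k] / MASS  # second half-kick
    return pos, vel
```

Even this toy version shows where the cost comes from: the force loop scales quadratically with the number of atoms, and it must be repeated at every one of millions of timesteps.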

Without the growth and popularization of high-performance computing, massive data storage and breakthrough visualization, the field of molecular dynamics as we know it would not exist. Those very resources, and more, are now gathered in a free, research-oriented ecosystem: the NSF-funded eXtreme Science and Engineering Discovery Environment (XSEDE).

XSEDE's very mission speaks to the democratization of highly powerful digital resources: XSEDE accelerates open scientific discovery by enhancing the productivity of researchers, engineers, and scholars and making advanced digital resources easier to use.

XSEDE's wealth of compute resources and knowledgeable personnel have even attracted a few familiar names as allocated users in recent years: Karplus, Levitt and Warshel.

Karplus used the supercomputing resources Queenbee, Bigben and Lonestar as part of TeraGrid, the predecessor to XSEDE, with an allocation ending in 2008. He wanted to investigate systems of more than 300,000 particles, work that could not be done through classical experimentation. Levitt computed on TeraGrid resources in 2005 as part of PI Jeffry Madura's work on predicting the structure of proteins: "[My work] require[s] major computer resources and would greatly benefit from obtaining access to the XSEDE computational resources. This conclusion has been confirmed by extensive tests and benchmarking performed during the startup XSEDE allocation… Given the large size of the systems under investigation… a state-of-the-art supercomputer will be essential not only for large-scale production following the standard paradigm, but also as a research tool intimately coupled to the computational design."

Warshel holds a current XSEDE allocation on the supercomputer Gordon at the San Diego Supercomputer Center, where he is running simulations that combine molecular dynamics with quantum mechanical calculations. From Warshel's abstract: "Our computer simulations of the functions of biomolecular systems have been progressing since the inception of this field, and involved the development and refinement of advanced (and in some cases unique) simulation methods. These methods have been aimed at getting the maximum information about biological functions from the given simulations using the computer resources available at the given time (going back to very slow computers with very small memory)."

XSEDE's growth in use, throughout classical HPC fields like chemistry, molecular dynamics and physics, but also economics, history, linguistics and more, will continue, just as the work of Karplus, Levitt and Warshel will live on and provide the footing for future discovery.

