(Source: National Defense Magazine)
The world of supercomputing is as competitive as an Olympic sport.
Nations develop new systems each year in hopes of seeing them climb the charts. Supercomputers are designed by mainstream companies like IBM and Hewlett-Packard and have a deployable lifespan of a few years. They have names like Jaguar, Roadrunner and Intrepid, and can crack codes or predict the weather. One in the works in Illinois will help explain the cosmos, scientists say.
The predominant measurer of these machines today is the Top 500 list. It’s all about speed — which supercomputer can do the most calculations in a second. Today, China owns the fastest machine. Tomorrow is another story.
Five of the 10 fastest supercomputers in 2005 came from the United States. Two more were built in Japan and one each in Spain, the Netherlands and Switzerland. The latest list from November 2010 still puts five U.S. machines in the first 10 slots, but the other players have changed. The most notable newcomer is China. In addition to boasting the fastest computer in the world, the Chinese also have built another ranked third.
The rankings are based on a system’s performance while running the Linpack test, which measures floating point computation. Half of the machines on one year’s list probably won’t make it the following year. Still, after China displaced the United States from the top spot, observers began to describe the rush to build the speediest computer as an arms race.
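What the Linpack test actually times can be illustrated with a toy sketch: it solves a dense system of linear equations and rates the machine by floating point operations per second. This minimal pure-Python version is only an illustration; the real benchmark (HPL) factors vastly larger matrices in parallel across thousands of processors.

```python
import random
import time

def solve_dense(A, b):
    """Gaussian elimination with partial pivoting on an n-by-n system."""
    n = len(A)
    A = [row[:] for row in A]  # work on copies
    b = b[:]
    for k in range(n):
        # Pivot: swap in the row with the largest entry in column k.
        p = max(range(k, n), key=lambda i: abs(A[i][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    # Back-substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

n = 100
A = [[random.random() for _ in range(n)] for _ in range(n)]
b = [random.random() for _ in range(n)]
t0 = time.perf_counter()
x = solve_dense(A, b)
elapsed = time.perf_counter() - t0
# Standard Linpack operation count: (2/3)n^3 + 2n^2 floating point ops.
flops = (2 / 3) * n ** 3 + 2 * n ** 2
print(f"{flops / elapsed / 1e6:.1f} megaflops")
```

Top 500 machines report the same ratio in petaflops, quadrillions of operations per second, achieved on matrices large enough to fill their entire memory.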
Not so fast, the Obama administration said.
A month after China reached the peak of the Top 500 list, the president’s science and technology council issued a report cautioning that a race to create the fastest system according to the Linpack benchmark could distract from more fruitful pursuits in high-performance computing, or HPC.
“The goal of our investment in HPC should be to solve computational problems that address our current national priorities, and this one-dimensional benchmark measures only one of the capabilities relevant to those priorities,” the report said. “A single-minded focus on maintaining clear superiority in terms of flops count is probably not in our national interest.”
Engaging in such an arms race may prove costly and divert resources away from basic research that could lead to breakthroughs allowing the United States to “leapfrog” other nations, the report said.
Supercomputers are built for more than just contests and rankings. They play a critical role in matters of science and national defense, said Thomas Sterling, a computer science professor at Louisiana State University best known for his part in developing the Beowulf class of computer clusters in the 1990s. Supercomputers are critical for engineering simulations that lead to the creation of state-of-the-art weapon systems like the stealth aircraft that is now being developed by the Chinese. They help the military develop complex battle simulations, control autonomous vehicles and decipher enemy communications, Sterling said.
That doesn’t mean it’s OK for the United States to fall behind in the speed-focused Top 500 list, he said.
Sterling respectfully disagrees with the president’s team. He believes that the United States is involved in a “quiet but strategic” computing arms race. Backing off could undermine U.S. strategic goals, he said.
“This is exactly the wrong moment to tone it down,” Sterling said. “This is a particularly dangerous time to be taking our foot off the accelerator. It’s not just bragging rights. This is national security.”
Instead of disparaging the metric used to determine the Top 500 list, the United States should be asking why it’s no longer number one, he said. Though it measures only one function, the list is an indicator of each country’s progress, he added. Other nations have topped the list before. Japan once was recognized for having the most powerful supercomputer, for instance. This time is different, Sterling said.
“This time the fear should be greater, not less,” he said. “If Japan beats us in supercomputing, they may benefit some economically, but they’re not a threat to our national security.” China is, he said.
“It’s not just that they built a machine bigger than anyone else; it is that they built some of the true enabling technologies,” Sterling said. “They could have bought them off the shelf, but they did it from the sand up.” The Chinese still used products manufactured in the United States in their supercomputer, but they showed signs of developing more of the components themselves. “How long before they don’t need our NVIDIA [products]? How long before they don’t need our microprocessors?” he said.
“If the momentum for extreme computing shifts to the Pacific Rim, the U.S. will lose economic control and defense-related superiority,” Sterling said. “Industry standards will be dictated by the nation with the largest market and the largest production capacity. We are already forced to buy much of our semiconductors from Asia.” A worst-case scenario could put the United States entirely at the mercy of potential enemies for mission-critical technologies, he said.
Still, the call for additional computing metrics has grown in the wake of China’s accomplishment.
Sandia National Laboratories issued a new rating system late last year called Graph 500. It tests a machine’s ability to solve complex problems involving large, randomly generated graphs, rather than its capacity to do basic numerical problems. The idea behind Sandia’s test is that the problems of the real world are not as simple as running through numbers at blazing speed. In the medical industry, large numbers of entries must be correlated. An analysis of social networks requires the grasp of a giant pool of electronically related participants. And international security demands the ability to keep track of containers on ships at sea and their home ports. These problems call for more than basic calculations at furious speeds, experts said.
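The kernel at the heart of Graph 500 is a breadth-first search over an enormous generated graph, and the score is edges traversed per second rather than flops. The sketch below shows that kernel in miniature; the real benchmark uses a Kronecker generator and graphs with billions of edges.

```python
import random
from collections import deque

def bfs(adj, source):
    """Breadth-first search; returns the parent of each reached vertex."""
    parent = {source: source}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in parent:
                parent[w] = v
                queue.append(w)
    return parent

# Build a small random graph standing in for the benchmark's huge one.
random.seed(1)
n, m = 1000, 5000
adj = {v: [] for v in range(n)}
for _ in range(m):
    a, b = random.randrange(n), random.randrange(n)
    adj[a].append(b)
    adj[b].append(a)

parent = bfs(adj, 0)
print(f"reached {len(parent)} of {n} vertices")
```

Unlike a dense matrix solve, this workload is dominated by irregular memory accesses, which is why a machine that excels at Linpack can fare poorly here.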
Intrepid, a computer built at the Department of Energy’s Argonne National Laboratory, topped the first Graph 500 list. The ranking shows that the system can support data-intensive applications that analyze and search for relationships within datasets, in addition to performing raw floating point operations at extreme speeds, said Pete Beckman, who has been tapped to lead Argonne’s effort to achieve exascale computing.
“For many applications, scientific discovery includes a computation phase followed by a data analysis,” Beckman said. “Intrepid can do both well.” Beckman and his team aim to create systems 1,000 times more powerful than today’s fastest computer, machines that can perform a million trillion operations a second.
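The arithmetic behind those labels, as a quick units check: “a million trillion” operations per second is 10 to the 18th power flops, one exaflop, a thousand times the petaflop (10 to the 15th) scale of the fastest machines on the current list.

```python
# Units check on the exascale goal described above.
petaflop = 10 ** 15  # a quadrillion operations per second
exaflop = 10 ** 18   # "a million trillion" operations per second
assert exaflop == 1000 * petaflop
print(f"exascale is {exaflop // petaflop}x petascale")
```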
Originally developed for the Defense Advanced Research Projects Agency, the High Performance Computing Challenge also tries to provide a more rounded way to evaluate supercomputers. It offers several tests and allows a system to demonstrate a range of its hardware and software capabilities. Beyond the speed used to solve equations, the challenge measures a system’s ability in more complex scientific computing. One benchmark is based on operations used in climate modeling, seismic analysis and the development of new materials. NEC Corp.’s Earth Simulator was a top performer at the most recent challenge. Experts say that the system could lead to more accurate climate change projections, as well as prevention and mitigation of natural disasters through high-resolution simulations of earthquakes.
As the evaluation of these machines evolves, so too is the way they are built. Supercomputer architectures are going through a radical transformation, experts said.
The Air Force recently built a supercomputer with off-the-shelf components including nearly 2,000 Sony PlayStation 3 consoles. There are more than 3,000 total pieces connected by six miles of cable in the new machine dubbed the Condor Cluster. It can perform about 500 trillion calculations a second. While nowhere near as fast as the speediest supercomputers, it is much less expensive. The most powerful machines can cost as much as $100 million to make. The Air Force cluster was developed on a $2 million budget.
The Air Force plans to use its invention to tackle neuromorphic computing, or using electronics to mimic biological architectures in the human nervous system. “We call it computational intelligence,” said Mark Barnell, high performance computing director at the Air Force Research Laboratory. “Can we not only take information in, which computers are really good at and do it extremely quickly and in large volumes ... But what if some information is missing?”
The research lab has been using algorithms and other techniques to see if a machine can fill in the blanks in sentences and paragraphs. “Can it actually guess which word is correct? Because now the meaning of the sentence will bear the secret to what the word really meant, or the intent,” Barnell said.
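One simple way a machine can “guess which word is correct” is to score each candidate by how often it follows the preceding word in a training corpus. The sketch below is purely illustrative — the article does not describe the research lab’s actual algorithms — but it shows the basic idea of statistical fill-in-the-blank prediction.

```python
from collections import Counter

def train_bigrams(corpus):
    """Count how often each word follows each other word."""
    words = corpus.lower().split()
    return Counter(zip(words, words[1:]))

def fill_blank(bigrams, before, candidates):
    """Pick the candidate most often seen after the preceding word."""
    return max(candidates, key=lambda w: bigrams[(before, w)])

corpus = ("the ship left the port at dawn and the ship reached "
          "the port at dusk and the crew left the ship")
bigrams = train_bigrams(corpus)
# "...reached the ____" -- guess the missing word from three candidates.
print(fill_blank(bigrams, "the", ["port", "dawn", "crew"]))  # -> port
```

Real systems use far richer context than a single preceding word, but the principle is the same: the surrounding text constrains what the missing word can be.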
Argonne National Laboratory has been leading an effort to more accurately simulate safe and efficient nuclear power reactors, a process that will improve with exascale computing, Beckman said. General Electric has been using Argonne’s supercomputers to study wind turbine noise, heated jet noise and film cooling heat transfer. Future breakthroughs could have broad impacts on the transportation and power industries, he said.
Supercomputers also could change the electric car industry where the goal remains to improve technology while dropping prices for average consumers. Scientists from Argonne and Oak Ridge National Laboratory are working on models to help design the next generation of rechargeable battery. “With the capability of an exascale computer, there is no telling how quickly this scientific field could improve current designs,” Beckman said.
The Oak Ridge lab used to lay claim to the world’s fastest supercomputer.
Technological innovation is on everyone’s mind. President Obama mentioned supercomputing twice in his latest State of the Union speech. He referred to China and India as serious innovative competitors. “We need to out-innovate” the rest of the world, Obama said. He called it a “Sputnik moment.”
But it is not the same as a true arms race, in which countries spend money developing weapons they hope to never have to use, Beckman said. “Investments in developing the next generation technology to support exascale computing will provide immediate benefits,” he said.
The fierce competition “will strengthen our science and engineering community and help train the next generation of scientists.”
The United States is expected to regain the top spot on the Top 500 list from China next year when the National Center for Supercomputing Applications, located at the University of Illinois at Urbana-Champaign, unleashes a machine that can perform 10 quadrillion calculations every second. Scientists say the use of that supercomputer, Blue Waters, will help them predict the behavior of complex biological systems, tornadoes and hurricanes. They say the system also could help them understand how the cosmos evolved after the Big Bang.