Friday, December 4, 2009

Supercomputing Edges Toward the Masses


For decades, supercomputers have been the tightly guarded property of universities and governments. But what would happen if regular folks could get their hands on one?

The price of supercomputers is dropping quickly, in part because they are often built with the same off-the-shelf parts found in personal computers, as a supercomputing conference held last week made clear. Just about any organization with a few million dollars can now buy or assemble a top-flight machine.

Meanwhile, research groups and companies like IBM, Hewlett-Packard, Microsoft and Intel are finding ways to make vast stores of information available online through cloud computing.

These advances are pulling down the high walls around computing-intensive research. A result could be a democratization that gives ordinary people with a novel idea a chance to explore their curiosity with heavy computing firepower -- and maybe find something unexpected.

The trend has driven some of the top computing experts and scientists in the world to work toward freeing valuable stores of information. The goal is to fill big computers with scientific data and then let anyone in the world with a PC, including amateur scientists, tap into these systems.

"It's a good call to arms," said Mark J. Barrenechea, chief executive of Silicon Graphics, which sells computing systems to labs and businesses. "The technology Relevant Products/Services is there. The need is there. This could exponentially increase the amount of science done across the globe."

The notion of sharing information among leading research centers is hardly new. Some of the earliest incarnations of what we now know as the World Wide Web came to life so that physicists and other scientists could have access to large data stores from afar.

The current thinking, however, is that the labs can accomplish far more than was previously practical by piggybacking on some of the trends sweeping the technology industry. And, this time around, research bodies big and small, along with brainy individuals, can participate in the sharing agenda.

For inspiration, scientists are looking at cloud-computing services like Google's online office software, photo-sharing sites and Amazon.com's program for renting data centers. They are trying to bring that type of Web-based technology into their labs and make it handle enormous volumes of data.

"You've seen these desktop applications move into the cloud," said Pete Beckman, the director of the Argonne Leadership Computing Facility in Illinois. "Now science is on that same track. This helps democratize science and good ideas."

With $32 million from the U.S. Energy Department, Argonne has set to work on Magellan, a project to explore the creation of a cloud-computing infrastructure that scientists around the globe can use. Mr. Beckman said that such a system would reduce the need for smaller universities and labs to spend money on their own computing infrastructure.

Another benefit is that researchers would not need to spend days downloading huge data sets so that they could perform analysis on their own computers. Instead, they could send requests to Magellan and receive the answers.
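
In practice that workflow amounts to shipping the computation to the data rather than the data to the researcher. The sketch below illustrates the idea with a hypothetical job-submission API; the endpoint, payload fields and job states are assumptions for illustration, since the article does not describe Magellan's actual interface.

```python
# Minimal sketch of the "send the question, not the data" pattern described
# above. The URL, request fields and job lifecycle are hypothetical.
import time
import requests

API = "https://magellan.example.gov/api"   # hypothetical endpoint


def run_remote_analysis(dataset_id: str, script: str) -> dict:
    """Submit an analysis job that runs next to the data, then poll for the result."""
    job = requests.post(f"{API}/jobs",
                        json={"dataset": dataset_id, "script": script}).json()
    while True:
        status = requests.get(f"{API}/jobs/{job['id']}").json()
        if status["state"] in ("done", "failed"):
            return status
        time.sleep(30)   # the multi-terabyte dataset never leaves the data center


result = run_remote_analysis("climate-sim-2009", "mean_surface_temp.py")
print(result.get("summary"))
```

The point of the design is that only the question and the answer travel over the network; the bulk data stays where the computing power already is.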

Even curious individuals on the fringe of academia may have a chance to delve into things like climate change and protein analysis.

"Some mathematician in Russia can say, 'I have an idea,"' Mr. Beckman said. "The barrier to entry is so low for him to try out that idea. So, this really broadens the number of discoverers and, hopefully, discoveries."

The computing industry has made such a discussion possible. Historically, the top supercomputers relied on expensive, proprietary components. Government laboratories paid huge sums of money to use these systems for classified projects.

But, over the past 10 years, the vital innards of supercomputers have become more mainstream, and a wide variety of organizations have bought them.

At the conference in Portland, undergraduate students competed in a contest to build affordable mini-supercomputers on the fly. And a supercomputer called Jaguar at the Oak Ridge National Laboratory in Tennessee officially became the world's fastest machine. It links thousands of mainstream chips from Advanced Micro Devices.

Seven of the world's top 10 supercomputers use standard chips from A.M.D. and Intel, as do about 90 percent of the 500 fastest machines.

"I think this says that supercomputing technology is affordable," said Margaret Lewis, an A.M.D. director. "We are kind of getting away from this ivory tower."

At the Georgia Institute of Technology, for example, researchers have developed software that can evaluate scans of the brain and heart and identify anomalies that may indicate problems. To advance such techniques, the researchers need to train their software by testing it on thousands of body scans.
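
As a rough illustration of what training on thousands of scans looks like in code, the sketch below fits an off-the-shelf anomaly detector to feature vectors standing in for de-identified scans. Both the synthetic features and the choice of IsolationForest are assumptions made for illustration, not the Georgia Tech team's actual method.

```python
# Illustrative only: fit an anomaly detector on features extracted from many
# scans, then flag outliers in new ones. The data here is synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Stand-in for feature vectors computed from thousands of de-identified scans.
training_features = rng.normal(size=(5000, 64))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(training_features)

new_scan_features = rng.normal(size=(10, 64))
flags = detector.predict(new_scan_features)   # -1 marks a potential anomaly
print(flags)
```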

But it is hard to find a repository of such scans that a hospital or a government organization like the National Institutes of Health is willing to share, even if personal information can be stripped away, said George Biros, a professor at the Georgia Institute of Technology. "Medical schools don't make this information available," he said.


