ITHACA, N.Y. -- Scientists and engineers in American industry desperately need computing power far beyond the capability of today's fastest supercomputers. The computer industry hopes to fill that need, with the help of university researchers.
That vision emerged during a conference on supercomputing held last month at Cornell University's Center for Theory and Simulation in Science and Engineering. The facility is one of several university centers for research on supercomputing established last year by the National Science Foundation.
"The Cray just isn't good enough, and it isn't good enough by a factor of 100," said John McTague, vice president of research for the Ford Motor Company and former acting director of the White House Office of Science and Technology Policy. McTague said that better computers are vital to American security in the current economic wars being waged with our allies.
Kenneth Wilson, a Nobel laureate in physics and director of Cornell's supercomputing center, said universities can make their greatest contribution by developing improved software. Computer manufacturers have to focus on a specific product and worry about deadlines, but "in the university you can build a piece of software and discover that what you can do with it is different from what you had planned."
Universities can conduct a broader range of basic research and serve as a testing ground. "That enables manufacturers to make more innovative products than they would if serving industry," he said, "because the university can live with less functional products."
McTague offered an example from his company's laboratories to illustrate the limitations of present supercomputers. A Ford engineer who was simulating the collapse of a single beam in a proposed design was forced to stop the test when the beam was only half destroyed. "I ran out of time," the engineer explained; the simulation took an hour to run on the Cray supercomputer. The same simulation must be done for a thousand other parts, McTague noted, and then again for the entire car.
Wilson pointed to the complexity of current theoretical problems as another reason to build more powerful computers. In computer models of how atoms link into molecules, an area vital to materials design, "anything with more than four electrons gets into the realm of supercomputing. But real problems range to thousands or millions of electrons." He said that computing costs increase by a factor of 32 for each doubling of the number of electrons in the model.
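Wilson's rule of thumb implies a fifth-power scaling law: a factor of 32 per doubling means cost grows as the number of electrons raised to the fifth power, since 2 to the fifth is 32. A short sketch (mine, not the article's; the four-electron baseline is taken from Wilson's remark) makes the consequence concrete:

```python
def relative_cost(n_electrons, base_electrons=4):
    """Relative computing cost under the n**5 scaling implied by
    'a factor of 32 for each doubling of the number of electrons'.
    The four-electron baseline follows Wilson's remark above."""
    return (n_electrons / base_electrons) ** 5

# One doubling, from 4 to 8 electrons, costs 32 times as much.
print(relative_cost(8))     # 32.0

# A 'real problem' with 4,000 electrons would cost a thousand
# doublings' worth less dramatically put: a factor of 10**15.
print(relative_cost(4000))
```

At that rate, even a machine 100 times faster than a Cray barely dents the gap, which is the point both Wilson and McTague were making.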
Hardware improvements will keep pace with the need, predicted Ralph Gomory, senior vice president and director of research for IBM. Processor speed and storage density can continue to increase as a linear function for the next decade without any scientific breakthroughs, he said, although parallel processing will be necessary.
But parallel processing introduces new problems. "Not everyone is excited by advanced architecture computers," said Albert Erisman, director of the Engineering Technology Applications Division of Boeing Computer Services. "Some people are very depressed." The challenge, Erisman said, is to make the new machines so easy to use that scientists and engineers won't have to take up programming as a second career.
Programs that run on a parallel processing computer must be written in sections that can execute simultaneously. That is a reasonable task on machines having 16 processors, but machines of the future will have thousands.
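The idea of writing a program "in sections" can be sketched with a modern example (mine, not the article's; the tools shown postdate the conference): the data is split into chunks, each chunk is handled by a separate worker, and the partial results are combined at the end. Deciding where to cut the problem is exactly the burden Erisman wants lifted from scientists and engineers.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # One "section" of the program: sum its own slice of the data.
    return sum(chunk)

def parallel_sum(data, n_workers=4):
    # Split the data into one chunk per worker...
    size = max(1, len(data) // n_workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...run the sections simultaneously, then combine the results.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(list(range(1001))))  # 500500
```

With 16 processors, a programmer can often find such a split by hand; with thousands, the hope expressed at the conference is that software will find it automatically.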
"I think what's going to happen is something intermediate," said Wilson. "You'll have to rewrite the program, but you'll get a lot of help." He cited efforts of Alex Nicolau at Cornell and David Kuck at the University of Illinois to develop software that will convert existing programs to run on parallel machines.
Steele is a freelance science writer in Ithaca.