The National Science Foundation’s cancellation of funding for the John von Neumann National Supercomputer Center has, as might be expected, drawn sharp criticism from supporters of the Princeton, N.J., facility. But here, the move is being watched as a leading indicator of where the program is headed.
Assuming the failure of a last-ditch effort by the Princeton center to reverse NSF’s rejection of its request for $70 million over the next five years, NSF officials say that the equivalent amount of money could be put to several conceivable uses. Possibilities include expansion of the four remaining centers in the agency’s four-year-old supercomputing program, support for state-sponsored supercomputing centers, or the creation of a new center, possibly based on massively parallel supercomputers such as the revolutionary 65,000-processor Connection Machine.
While the agency looks forward, officials at the von Neumann center are scrambling to survive. The center’s problems stem largely from the fact that it is the only NSF supercomputer center based on processors made by ETA Systems. That company, a subsidiary of Control Data Corp., closed its doors in April after operating in the red for several years (The Scientist, May 15, 1989, page 1). The ETA shutdown came just weeks before NSF was scheduled to make a decision on renewing the funding of all five centers it has sponsored. As a result, NSF approved the other four centers in May and gave the Princeton center a six-month extension to prepare an alternate plan. Although center officials soon responded with a plan to replace the ETA supercomputers with a Cray Y-MP, they suspect that their fate had already been decided.
“We are being held to a different standard than the other centers,” complains von Neumann center director Doyle Knight. Despite continuing software problems with the facility’s two ETA-10 supercomputers, Knight says that his facility has trained more users than any other NSF center and established the first high-speed (1.5 megabits-per-second) network connecting a supercomputer facility with its academic and industrial users. Adds Edward Cohen, director of the New Jersey Commission on Science and Technology, “The NSF is giving out the wrong signals: that if you try something innovative and it fails, you get punished. The machine didn’t fail, the company failed.” The state commission spends about $2 million annually on the Princeton facility.
Knight says the decision was “inconsistent with the recommendations of the NSF’s own peer review panel,” which appraised the Princeton center’s renewal proposal. Indeed, a copy of the internal panel report obtained by The Scientist shows that the seven-member panel voted 5-2 for continued NSF support for the von Neumann facility, contingent upon the center’s finding enough money to upgrade the requested four-processor Cray to an eight-processor version. The center quickly met that requirement, obtaining commitments from several of its large users.
But agency officials say that the review panel was only one of several elements in the decision. “We looked at it in the context of the entire national program,” says NSF supercomputer program manager Thomas Weber. One problem, he said, was that only two of the 10 members of the Princeton center’s consortium of state, industry, and academic users from outside New Jersey had agreed to help fund the $5 million Cray upgrade. “I don’t think the [von Neumann facility proposal] fulfills the spirit of the panel review,” says Weber, noting the panel’s suggestion that “a strong financial commitment by the consortium to bring [the center] into the Cray world would ... be crucial to the future viability of the center.”
The von Neumann facility has taken its complaint to Rep. Robert Roe (D-N.J.), who chairs the committee that oversees NSF, and Sen. Frank Lautenberg (D-N.J.). Both men have met with NSF officials on the issue, and have raised questions about the procedures followed in making the decision. “We’re going to work hard to see that this decision is reviewed properly, and to see if there isn’t any recourse,” says James Townsend, a Lautenberg spokesman. At press time, however, no formal appeal had been filed.
Without NSF funding, the von Neumann center could be doomed. “I don’t think we can function without federal support,” says Cohen. Even if the center manages to operate at a vastly reduced level, its machines will continue to be plagued by software problems. The NSF review panel found that the ETA-10 suffered a software failure once every 30 hours, and that its ability to run programs on more than one of its eight processors at any one time was poor. Although its hardware is still considered state-of-the-art, the overall package is an “extremely immature computer system,” the panel concluded. The center also runs two smaller Cyber 205 supercomputers, but they will be phased out late next year by Control Data. Without federal funding, says Cohen, “we don’t have very long to last.”
NSF officials say that the decision shouldn’t hamper current users. The other four centers are expected to easily absorb the computing requirements of scientists now linked up with the Princeton center. Even without the von Neumann computing contribution, the NSF centers “will have twice the capacity of last year,” says William Wulf, head of the agency’s computer and information sciences directorate.
Meanwhile, NSF is planning the future of its supercomputer center project, and the destination of the funds that otherwise would have gone to the Princeton installation. While no decisions have yet been made, it’s clear that agency officials anticipate a program very different from the past. NSF managers are counting on a bigger budget in response to heightened national and congressional interest in high-performance computing. And high on their wish list is a new parallel-computing center. “Massively parallel processing is an interesting concept whose time is approaching,” hints Weber.
Others are more blunt. “The situation that exists in 1989 is very different than it was in 1985 when the centers were first started,” says Wulf. “Nobody would have predicted then that we would increase the number of cycles [a measure of computer capacity] by a factor of 80.” Although officials won’t say that they have all the supercomputer power they need, an NSF report to Congress earlier this year indicated ample capacity for the near future.
Given the absence of a pressing need for another traditional center, and the emergence of parallel machines as an exciting new approach, insiders are betting that parallel architectures will figure prominently in the next five years of NSF’s supercomputing program.