"There's a lot of hype in the field," declared Yaser Abu-Mostafa, a researcher at the California Institute of Technology. "The problem is how to achieve generalizable learning, to extend a computer's experience to new fields."
Technology Venture Investors, of Menlo Park, Calif., is backing a new firm, Synaptics Inc., that features the work of Carver Mead of Caltech and Federico Faggin. But venture capitalist Jim Bochnowski admits that "we might be too early. The field might develop more slowly than we expect."
Neural-net computing is based on a design patterned after the human brain, in which each fundamental element is linked to a large number of other elements. But the scale is quite a bit smaller: while each neuron in the brain is connected to about 10,000 others, neural-net computers have so far joined about 100 such elements.
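The "every element linked to many others" design described above can be sketched in a few lines of code. This is purely an illustrative sketch of a fully connected network of about 100 elements, each updating its state from the weighted sum of all the others (a Hopfield-style rule); the numbers and names here are assumptions for illustration, not details of any system mentioned in this article.

```python
import random

# Illustrative sketch: N elements, each connected to every other element,
# echoing the scale of roughly 100 elements cited above.
N = 100

random.seed(0)

# Symmetric random weights between every pair of elements.
weights = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        w = random.uniform(-1.0, 1.0)
        weights[i][j] = w
        weights[j][i] = w

# Start each element in a random +1/-1 state.
state = [random.choice([-1, 1]) for _ in range(N)]

def step(state):
    """One synchronous update: each element sums weighted input
    from all the other elements and takes the sign of the total."""
    new_state = []
    for i in range(N):
        total = sum(weights[i][j] * state[j] for j in range(N) if j != i)
        new_state.append(1 if total >= 0 else -1)
    return new_state

# Iterate until the network settles into a stable pattern (or give up).
for _ in range(20):
    nxt = step(state)
    if nxt == state:
        break
    state = nxt
```

The point of the sketch is the connectivity: every element consults all the others on each update, in contrast to the few-neighbor wiring of conventional parallel processors described below.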
The design contrasts with the better-known parallel processing, which has already reached the marketplace. While parallel processing also uses a large number of computing units working together, each of its logic gates is typically connected to only a few close neighbors. The result is a faster version of the conventional computer that can tackle more tasks.
The concepts behind neural-net computing have linked computer scientists with researchers from various disciplines, in particular the neurological sciences. The use of optical as well as electronic systems has further broadened the field. The result is significant activity at such places as the California Institute of Technology, MIT, Stanford University and AT&T's Bell Laboratories.
But the system works well only after repeated practice on the same page; it handles a new page of text far worse than other text-to-speech converters, based on older technology, that have been on the market for years.
What is needed, Caltech's Abu-Mostafa noted, is a system that can learn without being programmed, free of preset algorithms. "In complex cases such as finding a tree in a picture," he said, "we don't know what the pattern is. The system could proceed only by learning without instructions."
Despite these limitations, a few pioneers have launched neural-net companies. Synaptics Inc., of San Jose, Calif., features the involvement of Mead, co-inventor of very large-scale integration, and Faggin, founder of Zilog and designer of the first microprocessor. Swimming against the tide, Synaptics has attracted $1 million in venture capital. Nestor Co., of Providence, R.I., is offering a system for recognizing handwritten text, including Japanese characters. Its founders include Nobel Prize-winning physicist Leon Cooper of Brown University.
The Defense Department is also supporting work on neural nets through the Strategic Computing program of its Defense Advanced Research Projects Agency (DARPA). The money is funding modest efforts at Texas Instruments, AT&T, IBM, Bendix, TRW and General Electric.
Another milestone may be achieved later this summer. NETtalk's Sejnowski has devised a net that can play high-quality backgammon. He intends, in the weeks following the conference, to match his system against the current world champion, a computer program written by Hans Berliner of Carnegie-Mellon University.