|Some call him a genius, others a charlatan; but even his critics agree that Huang's optical computers are unsurpassed|
HOLMDEL, N.J.--Absent was the cautious reserve usually adopted by scientists in formal presentations. In its place was Alan Huang's characteristic approach to the scientific briefing: machine-gun bursts of excited speech, highly animated hand gestures, all in support of the virtues of Huang's passion - optical computers, whose circuits run on light rather than electricity.
Huang, who heads the Optical Computing Research Department of AT&T Bell Laboratories, was speaking about his work to an official delegation of top-flight research scientists from Bellcore, Bell Communications Research Inc. Suddenly, Stuart Personick, a Bellcore vice president, stood up, interrupting the talk. Huang was spouting nonsense, Personick declared, and his conduct was damaging his reputation, Bell Labs, and science itself. Huang, Personick asserted, was either incredibly stupid or deliberately misleading those present. With that, Personick stalked out of the room.
Even those who criticize Alan Huang as a starry-eyed zealot of an impossible technology have to admit that he puts his money where his mouth is when he tells his staff that one of the biggest impediments to great science is a fear of appearing foolish. Huang, head of optical computing at Bell Laboratories, frequently sets a personal example by pulling outrageous practical jokes.
Recently, for instance, at a going-away party of a long-time collaborator, Huang broke in on the party dressed in drag and makeup, pretending to be a wronged woman, wringing his hands and crying, "Why are you leaving me?" He conceived of the idea about a month beforehand - but says that it took even him that full month to decide to stake his reputation on the gag.
"My first thought was, `No, I can't do this, it will be too embarrassing,' " recalls Huang. "Then it started eating at me - like, if I was already worried about what other people thought, I might as well hang it up. Because it amounts to the same fear of embarrassment as when you have a new idea and are afraid to bring it out because you fear appearing stupid.
"So I pulled off the skit. I was truly embarrassed, but I pulled it off, and I pulled it off well. I proved to myself that I'm willing to seem like a complete ass, and I had set that example for others."
Huang's audience - not to mention Huang himself - was shocked. Still, it was not the first time Huang had provoked such visceral reactions among colleagues. Some tout him as the genius who has almost singlehandedly made the once-disreputable field of optical computing respectable again, while others deride him as Bell Labs' "PR star," a self-promoting lightweight whose chief talent is dazzling the media. Both reactions are comprehensible the moment Huang is glimpsed in his home environment, a small office on the fifth floor of the Bell Labs Holmdel building, where Bell's optical computer program, still the only full-scale one in the world, is based. Though 40 years old and 6'2", Huang sits on the edge of his chair like a short, fidgety kid, his legs bent back underneath, his hands gesturing wildly, or clasping his head. Like a high-speed computer gone berserk, he jumps from one mental file to another, emptying each in turn in a torrent of words that leaves one with a sense one moment of sheer brilliance, the next of supreme self-involvement. Small wonder that people find him trying. Yet this is the man - or so think growing numbers of people - who may be spearheading the next computer revolution.
"I know some people call me a `charlatan'; they find my enthusiasm childlike and unscientific, but I think it's important to maintain a childlike way of approaching things. I once had a professor, and whenever you introduced a new idea to him, his eyes glazed over, like, `Oh, my God, I have to absorb a new concept.' Whereas this other professor had a childlike wonder at a new idea; his eyes became shiny, like, `Oh, my God, a new toy!' That childlike imagination is important, but it gets killed in people, and to preserve it I'll do weird things. I'll sit in McDonald's and wonder if there's an algorithmic way of converting Burger King's hamburger processing into McDonald's. McDonald's process is batch-oriented, whereas Burger King's is assembly-lined, and I wonder if there is some way of mixing the two. I think I have a solution.
"Science is like golf; some people perfect their skills on the driving range or putting green, but that's artificial, being a dilettante. To play the game, you have to fetch the balls, hit them out of sand traps, chase them in the woods, wade in the water. Real science is seat-of-the-pants, and doing it means playing the game all the time."
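Huang's hamburger musing maps onto a standard distinction in computing: batch processing (run every item through one stage before any item moves to the next) versus assembly-line, or pipelined, processing (stream each item through all stages in turn). A minimal illustrative sketch - the stage names are invented for this example, and none of it describes anything Huang published:

```python
# Three hypothetical preparation stages; each takes and returns an order.
def grill(order): return order + ["grilled"]
def dress(order): return order + ["dressed"]
def wrap(order):  return order + ["wrapped"]

STAGES = [grill, dress, wrap]

def batch(orders):
    """McDonald's-style: finish each stage for the whole batch, then move on."""
    for stage in STAGES:
        orders = [stage(o) for o in orders]
    return orders

def pipeline(orders):
    """Burger King-style: each order streams through every stage in turn."""
    for o in orders:
        for stage in STAGES:
            o = stage(o)
        yield o

orders = [["patty"], ["patty"]]
assert batch(orders) == list(pipeline(orders))  # same results either way
```

The two schedules produce identical output; they differ only in when each piece of work happens - which is exactly the kind of reordering an "algorithmic conversion" between the two would exploit.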
Born in San Francisco in 1948, Huang inherited an analytical skill from his mother - a housewife with a scientific background - and a dreaminess from his father, a poet and professor of Chinese. Huang's obsession with computers began at age three after an encounter with one at a Boston science museum; by fifth grade he was winning science fairs with makeshift computers cobbled together from nails and wires.
As a sophomore studying electrical engineering at Cornell University, Huang spotted a tiny, one-paragraph notice in Popular Science about a newly developed type of crystal, smaller than a sugar lump but capable of storing a thousand holograms. "With such an immense storage capacity," the article ran, "the crystals may also be used as optical memories for computers." That single sentence sent Huang's mind racing - information riding on beams of light! Certain that future computers would run on light rather than electrons, he threw himself into optics, graduating with a B.S. in 1970 and an M.S. in 1971, with a thesis on optical computing memories.
"That was the time I started to have computer dreams - about once a week. They're like [what you see] at halftime in a football game. Groups of people are marching along, and other groups march through them. But in the dreams it's not real people, it's sort of fluctuating colors merging, all these things marching along and through each other. And there's two groups of things, one thing sort of like instructions and the other sort of like data. The question in computing is whether instructions and data can merge and not tie up in a knot. In my computer dreams, these things marched around but kept winding up in a knot."
Unfortunately, Huang was entering the field at precisely the moment when almost everyone in it was fleeing. Optical computing had been a hot topic in the 1960s, when scientists had discovered that the effect that makes transistors possible - one electric current flowing in a semiconducting medium causes variations in a second current flowing at right angles to it - is also present in certain semiconductors when beams of light, rather than streams of electrons, cross. Excited by this prospect, because photons are much faster than electrons, researchers at IBM and RCA set out to construct optically based components.
But the promise of optical semiconductors quickly faded. It turned out that so much light had to be pumped into the semiconductor to create the desired effect that the tiny components would burn up long before any computing could begin. In addition, none of the optical components could perform inverted logic such as switching all trues to falses and vice versa, which is crucial for computer applications. By the beginning of the 1970s the field was in disrepute. And when Huang gave his first talk on the subject, at a 1974 conference in Washington, D.C., a third of the audience walked out.
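Why inversion matters is standard logic-design fact: AND and OR gates are "monotone" - flipping an input from false to true can never flip their output from true to false - so no combination of them alone can compute NOT, and hence they cannot realize every Boolean function. One inverting gate such as NAND, by contrast, suffices to build everything else. A sketch of that point (illustrative only, not a description of any optical device):

```python
def nand(a, b):
    """The kind of inverting gate the early optical components lacked."""
    return not (a and b)

# Every other logic gate can be assembled from NAND alone:
def NOT(a):    return nand(a, a)
def AND(a, b): return NOT(nand(a, b))
def OR(a, b):  return nand(NOT(a), NOT(b))

# Exhaustively check the constructions over all inputs.
for a in (False, True):
    assert NOT(a) == (not a)
    for b in (False, True):
        assert AND(a, b) == (a and b)
        assert OR(a, b) == (a or b)
```

A component that can only brighten when its inputs brighten can never say "no" - which is why the inability to invert was fatal for computing applications.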
"People get their kicks stomping on other people's dreams. They're dead already, and they want to kill other people. In school I was taught that in science you get a good hypothesis and they reward you by letting you address the Royal Society and applaud you because it's brilliant. It doesn't work that way. If you have a good idea, your problem is not preventing people from stealing it, but having to cram it down their throats. [So] I convinced myself that the idea was ridiculous and that I had to stop these grandiose ideas and content myself with being an ordinary person, like someone trying to write the great American novel who gives up to become a typist. I [then] worked as a hack, as basically a programmer, to punish myself."
But deciding to work as a hack didn't mean Huang no longer had aspirations. In 1981, Stanford Ph.D. in hand, Huang joined Bell Laboratories. He had given up on optical computers, justifying the decision by telling himself that until someone came up with a way to do inverted logic he needn't waste his time. He devoted himself full-time to very large-scale integration (VLSI), the area of microelectronics dealing with the problem of coordinating thousands of logic gates on a single chip. Huang describes the challenge, difficult as it was, as a "snap" compared to his former obsession.
But unbeknownst to Huang, Arun Netravali, then head of the Visual Communications Research Department at Bell Labs and the person responsible for hiring Huang, was secretly hoping that he would eventually turn back to optical computers. "We are in the business of breaking new ground," explains Netravali today. "That can't be done without people with big dreams - and Alan's big dream was optical computers." Following his company's policy, however, Netravali refused to dictate Huang's research topic.
But Netravali didn't hesitate to drop a few hints when he could, and he gained important ammunition when Scientific American published an article entitled "The Optical Computer," by three people at Heriot-Watt University in Scotland. The team had discovered an optical transistor-like effect in a semiconductor material that worked at low power, thus raising the hope that the thermal problem could be overcome. Unfortunately, the article contained errors and misleading statements. It claimed, for instance, that the material had a switching time of "a few picoseconds," not mentioning that this was only the switch-on time and that the all-important switch-off time - how long it took until the switch could be used again - was much longer.
Moreover, the authors were physicists rather than computer scientists, and made naive assumptions about computer structure that reinforced the scorn of most computer scientists for optical computing. Still, for Netravali, the article meant that the door was not yet closed on the subject. He showed the article to Huang and asked idly, "What do you think about this, Alan?"
Huang was taken aback. Though he instantly caught the misleading statements, he was thrown off balance by the fact that the devices apparently could perform inverted logic. Two weeks later, he told Netravali that he was no longer sure optical computers were impossible. His mind started racing again. Netravali asked him to give a talk on the subject, and an optical computing program slowly grew at Bell Labs, with Huang hiring two assistants in April 1984.
Over the next year, Huang's small team developed two optical logic devices and a way of connecting them, demonstrating some basic possibilities for optical technology. Higher-ups in Bell Labs began to pay attention, and in June 1985 Huang was named head of a newly created Optical Computing Research Department and was given a virtual blank check to proceed with a major optical computing effort. "We liked his confidence and unconventional thinking," Netravali says. "If you talk even briefly to him, you think that, with enough resources, this is the kind of guy who can make the difference."
Huang's efforts to put together a team faced numerous obstacles. One was the bad odor surrounding the field, which the flawed Scientific American article had done nothing to dispel. Another was that Huang got the green light just as the AT&T divestiture was beginning. Not only was the company about to lose an enormous part of its revenue base and face competition for the first time, but 4,000 of the 22,000 Bell Lab employees were to be transferred to Bellcore. Many in the research community had serious doubts about whether AT&T would be able to sustain its legendary commitment to pure research. Huang realized he would have to act aggressively - even to the point of talking to journalists - if his fledgling program were to succeed.
"Most places would be circling the wagons, protecting themselves, doing more applied stuff, but Bell had the guts to go ahead and risk itself. Some areas got pruned, but Bell's commitment to computers and optics grew. They told me that I would be supported for as long as it takes. But I knew if there was even a rumor that pure research was threatened, it would be self-fulfilling, because serious research requires ultrastability. I wouldn't be able to get good people, and the whole thing would die. The journalists were vultures looking for a death story. So I went out and let them know that Bell was giving me the resources and taking real risk, blue-sky risk. I said it loudly, and reporters liked that, but some of my colleagues didn't. That's when resentment started. Then CBS did a special, `What America Does Best,' and they came and filmed me and a lot of high-level people in coats and ties. [The AT&T executives] all went home and told their kids they were going to be on television tomorrow, and it ended up that all they showed from Bell was five seconds of me saying, `What America does best is takes risks' and then a few lasers flashing. That got me in trouble. It's the Carl Sagan syndrome, the assumption that if you popularize you're a lightweight. But Carl works his ass off."
Despite the strong backing that AT&T gave the program, Huang's team faced numerous design and technological hurdles, one of the most serious of which had to do with communications between the computer components. While electrons can be shuttled about with wires, bits of light are less easily routed, and the number of routes would be vastly multiplied in an optical computer. The more complicated this orchestration becomes, the greater the information flow and the more complicated the management of this flow becomes. The problem is similar to that of having to coordinate the operations of thousands of people just by telling a handful both what to do and what to tell the next handful of people to do, and so on. It was the old problem of instructions tangling with data, which haunted Huang in his computer dreams.
"Then one day I woke up, and I'd had a computer dream, but this time the people didn't march and get stuck in a huge pile. It had worked, and that bugged the hell out of me. I thought, `Oh, my God, maybe there's a solution.' So I spent a month and a half looking for it - and the solution turned out to be similar to how I had merged McDonald's and Burger King's hamburger processing. It involved making all the devices and communication routes absolutely regular, like Buckminster Fuller's tessellated structures. This regularity at first looks inefficient because you can do it more simply, but on complicated problems it winds up more efficient because you spend less time housekeeping, keeping track of the data. Now you can take any particular computer problem and find a way to fold it to fit into the super-regular structure. I call it computer origami. It was the kind of thing I love to do, get myself into a situation where I don't understand what's going on, like, `Toto, I don't think we're in Kansas anymore,' and then work my way out, and this time it was my dream that inspired me. Before, I didn't have the incentive to push it that far, but since it worked out in my dream, I started looking and found it. I believe in my subconscious. That's one thing scientists have to learn to go with."
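Huang's actual architecture is not spelled out here, but the general flavor - identical cells with fixed, regular connections, through which data marches in lockstep - resembles what is elsewhere called a systolic array. As a toy sketch of computing on a perfectly regular structure (an assumption-laden illustration, not Huang's design), here two matrices are multiplied on a grid of identical cells that each do nothing but multiply-accumulate and hand their inputs to the neighbors on the right and below:

```python
def systolic_matmul(A, B):
    """Multiply two n x n matrices on a grid of identical cells.

    Rows of A are fed in from the left edge and columns of B from the
    top edge, each skewed by one time step, so matching operands meet
    at cell (i, j) on the same tick - a rigidly regular data march.
    """
    n = len(A)
    acc = [[0] * n for _ in range(n)]    # one accumulator per cell
    a_reg = [[0] * n for _ in range(n)]  # values moving rightward
    b_reg = [[0] * n for _ in range(n)]  # values moving downward

    for t in range(3 * n - 2):           # enough ticks for data to drain
        new_a = [[0] * n for _ in range(n)]
        new_b = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                # a arrives from the skewed left edge or the left neighbor
                a = (A[i][t - i] if j == 0 and 0 <= t - i < n
                     else a_reg[i][j - 1] if j > 0 else 0)
                # b arrives from the skewed top edge or the neighbor above
                b = (B[t - j][j] if i == 0 and 0 <= t - j < n
                     else b_reg[i - 1][j] if i > 0 else 0)
                acc[i][j] += a * b       # the cell's only computation
                new_a[i][j], new_b[i][j] = a, b
        a_reg, b_reg = new_a, new_b
    return acc
```

The payoff Huang describes shows up here: no cell ever needs routing decisions or bookkeeping - the problem is "folded" into the wiring pattern, and correctness falls out of the timing alone.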
Today, Huang's Optical Computing Research Department consists of 12 individuals who work on various aspects and approaches to optical computers, from optical logic devices and wave guides to computer architecture. Huang coordinates their efforts, irons out problems, secures resources, and acts as a "technology groupie," devouring news of the latest developments and figuring out how he can apply them to his project. Huang claims that the optical devices built by his group are comparable to the best electronic devices. His team recently constructed an 8-x-8 matrix of logic gates and is currently working on a 64-x-64 array.
"He's amazing," says Michael Prise, a researcher working in Huang's department. "He doesn't write anything down - he spins off ideas for us to pick up and develop. Then he provides us with support and inspiration, and deals with the bureaucracy. I've never asked for something and not gotten it in one way or another."
But Huang's extravagant claims, exuberant presentations, and mental stutter - the result of trying to condense several hours' worth of material into 10 minutes - still grate on some colleagues. "He can't speak nearly as fast as he thinks," says one, which is a charitable way of saying that listeners often haven't a clue as to what he is driving at.
Moreover, Huang, a computer scientist, sometimes plays fast and loose with physics details, which repels some specialists. But by toning down some of his claims and by sheer hard work, Huang and his colleagues are earning some measure of respectability; his group's publications are cited more and more frequently in the scientific literature. Today, objections to the work are likely to call it impractical rather than loony, and when Huang is heckled at conferences the audience now tends to take his side. But the Bell program is still the only full-scale effort to develop an optical computer. Other groups, at Heriot-Watt University in Scotland and the University of Erlangen in Germany, for instance, are either considerably smaller or are working on isolated aspects of the problem.
Huang, however, is convinced that whatever his technical achievements, his most important contribution to science has nothing to do with optical computing, but with creating and maintaining an environment that fosters the kind of childlike imagination that he thinks is essential to genuinely innovative scientific work.
"We're inventors here. To be an inventor it doesn't help just to be smart, or just to have energy. An idea only lasts a few milliseconds in your brain, and then the only thing that keeps it alive is your willingness to nurture it in the face of appearing stupid. Science is encumbered by people worrying about their careers, making incremental improvements on what they've already done, but that's the road to hell. You have to work hard to break that natural conservatism, and reward people who are willing to take risks.
"I think my true success here has been not to achieve progress in building an optical computer, but to establish a different culture, one conducive to a childlike way of thinking, to taking risks. That's the only way anyone will ever figure out how to do anything so wild as what we are attempting."
Writer Robert Crease teaches philosophy of science at the State University of New York, Stony Brook.