Not a journalist’s fairy tale, this capsule history of computer science comes from AI “progenitor” Seymour Papert himself. The professor of media technology at MIT concedes that neural networks, those powerful and speedy mimics of the structure and processes of the brain, have indeed supplanted AI to become the latest fair-haired child of computational science.

In 1969, Papert and Minsky had pointed out in their book Perceptrons that a simple one-layer “perceptron” (a simple neural network) could not compute the logical function “x exclusive-OR y” (XOR). Neither could it distinguish a “T” from a “C,” although it could tell a square from a triangle. While Papert insists he and Minsky intended only to clarify the potential of neural nets, the book instead caused most researchers to neglect the field and hop on the AI bandwagon instead. A few, however, clung to the network notion and sought to develop new...
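The XOR limitation is easy to demonstrate in a few lines of modern code. The sketch below (not from Minsky and Papert, just an illustration of their point) trains a single-layer perceptron with the classic perceptron learning rule: it converges on the linearly separable AND function but never finds weights for XOR, because no single line can separate XOR’s true cases from its false ones.

```python
def train_perceptron(data, epochs=25):
    """Train a one-layer perceptron; data is a list of ((x1, x2), target)
    pairs with 0/1 targets. Returns (weights, converged)."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        errors = 0
        for (x1, x2), t in data:
            y = 1 if w1 * x1 + w2 * x2 + b > 0 else 0  # threshold unit
            if y != t:
                # Classic perceptron update: nudge weights toward the target.
                w1 += (t - y) * x1
                w2 += (t - y) * x2
                b += (t - y)
                errors += 1
        if errors == 0:          # a full pass with no mistakes: converged
            return (w1, w2, b), True
    return (w1, w2, b), False    # gave up: no separating line found

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

print(train_perceptron(AND)[1])  # AND is linearly separable: converges
print(train_perceptron(XOR)[1])  # XOR is not: training cycles forever
```

Adding a hidden layer of units between input and output removes the limitation, which is exactly the direction the later neural-network researchers pursued.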
