
1.3 Quantum learning

Carver Mead, a visionary of computer hardware design, has pointed out [Mead, 1989] that until recently, advances in hardware focussed on issues of scale: smaller is better, smaller is faster. Yet although the brain's hardware is of much larger scale, and much slower, than current computer hardware, the brain can perform computations far beyond those of our fastest supercomputers. His recommendation for advances in hardware design, then, is to look at how the brain is organized for inspiration toward new forms of computation, rather than merely trying to make the kinds of computer we have now faster and smaller. Neural network algorithms are part of this search for novel kinds of computation. A computer on the spatial and temporal scale of quanta is bound to have advantages over current hardware, but Mead's point still holds. So why not pursue both improvements in scale and alternative forms of computation? That is one of the reasons for looking at quantum learning; but perhaps quantum learning also has qualitative advantages that are not just the simple sum of algorithmic and scale advantages.

For the discussion on which I wish to focus, concerning the advantages of (especially non-superpositional) quantum computation, a more concrete notion of an NSQC will be helpful. Since I also wish to stress the further advantages of quantum learning over quantum computation that does not involve learning, the particular NSQC I will consider is an implementation of a neural network architecture.



Ron Chrisley
Wed Nov 20 01:10:59 GMT 1996