ULTIMATE COMPUTING - Quantum Consciousness Studies
Toward Ultimate Computing 15

1.3.2 Connectionism

The Mind’s Eye may be the apex of a collective hierarchy of parallel systems in which the cytoskeleton and related structures are the ground floor. Parallel systems in both computers and biological organisms rely on lateral connections and networks to provide the richness and complexity required for sophisticated information processing. Computer simulations of parallel connected networks of relatively simple switches (“neural nets”) develop “cognitive-like functions” at sufficient levels of connectedness complexity, a “collective phenomenon” (Huberman and Hogg, 1985). Philosopher John Searle (Pagels, 1984), who has an understandable bias against the notion that computer systems can attain human consciousness equivalence, points out that computers can perform enormously complex tasks without appreciating the essence of their situation. Searle likens this to an individual who, unable to speak Chinese, sorts Chinese characters into specific categories without understanding their meaning; the computer, similarly, sorts information without comprehending its essence.

It would be difficult to prove that human beings comprehend the essence of anything. Nevertheless, even the simulation of cognitive-like events is interesting. Neural net models and connectionist networks (described further in Chapter 4) have been characterized mathematically by Caltech’s John Hopfield (1982) and others. His work suggests that solutions to a problem can be understood in terms of minimizing an associated energy function, and that isolated errors or incomplete data can, within limits, be tolerated. Hopfield describes neural net energy functions as having contours like hills and valleys in a landscape. By minimizing energy functions, information (metaphorically) flows like rain falling on the landscape, forming streams and rivers until stable states (“lakes”) occur.

A new concept in connectionist neural net theory has emerged with the use of multilevel networks. Geoffrey Hinton (1985) of Carnegie-Mellon University and Terry Sejnowski of Johns Hopkins University have worked on allowing neural nets to find optimal solutions, like finding the lowest lake in an entire landscape. According to Sejnowski (Allman, 1986; Hinton, Sejnowski and Ackley, 1984), the trick is to avoid getting stuck in a tiny depression between two mountains:

    Imagine you have a model of a landscape in a big box and you want
    to find a lowest point on the terrain. If you drop a marble into the
    box, it will roll around for a while and come to a stop. But it may not
    be the lowest point, so you shake the box. After enough shaking you
    usually find it.
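Hopfield’s energy landscape and Sejnowski’s marble-shaking can be sketched together: the “shaking” is simulated annealing, in which random uphill moves are accepted with a temperature-dependent probability so the state can escape shallow depressions before cooling settles it into a deep minimum. The following Python sketch is only illustrative — the toy network (a single pattern stored by a Hebbian outer product), the cooling schedule, and all names are assumptions, not taken from the text:

```python
import math
import random

def energy(state, weights):
    """Hopfield-style energy E = -1/2 * sum_ij w_ij * s_i * s_j, states +/-1."""
    n = len(state)
    return -0.5 * sum(weights[i][j] * state[i] * state[j]
                      for i in range(n) for j in range(n))

def anneal(state, weights, temps, rng):
    """Metropolis-style annealing: flip random units, always accept downhill
    moves, accept uphill moves with probability exp(-delta/T).
    High T = vigorous shaking; cooling lets the state settle into a deep minimum."""
    state = list(state)
    for t in temps:
        for _ in range(10 * len(state)):
            i = rng.randrange(len(state))
            before = energy(state, weights)
            state[i] = -state[i]               # trial flip
            delta = energy(state, weights) - before
            if delta > 0 and rng.random() >= math.exp(-delta / t):
                state[i] = -state[i]           # reject uphill move, flip back
    return state

# Toy landscape: one "memory" stored via Hebbian outer product weights,
# so the deepest valleys are the memory and its mirror image.
rng = random.Random(0)
memory = [1, 1, -1, -1, 1, -1]
n = len(memory)
weights = [[0 if i == j else memory[i] * memory[j] for j in range(n)]
           for i in range(n)]

start = [rng.choice([-1, 1]) for _ in range(n)]
settled = anneal(start, weights, temps=[4.0, 2.0, 1.0, 0.5, 0.1], rng=rng)
print("settled state:", settled, "energy:", energy(settled, weights))
```

Dropping the marble without shaking corresponds to greedy descent (accepting only downhill flips), which can strand the state in a local dip; the temperature schedule is what implements the “shake the box, then shake it less” idea.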
