Analog Neural Networks: Computational Power

Super-Turing Computation

Analog Computation and Dynamical Systems

Genetic Networks: Analog Computation and Artificial Design

Machine Learning: Data Clustering, Active Information Retrieval, Neural Networks and Applications

Circadian System and Jetlag

Perception, Memory and Decision Making

Super-Turing Computation

Many researchers have called for the development of computational theories that are not based on Turing's concepts -- among them Shannon, Feynman, Sir Roger Penrose, Smale, Arbib, and the philosopher Jack Copeland. In a series of papers we introduced a neural network model that is computationally more powerful than the Turing machine. For example, this model can solve the halting problem of the Turing machine, which is undecidable in the Turing sense. Our results on such analog networks can be considered a possible analytical answer to these calls, and I coined the terms "Hyper Computation," "Supra-Turing," and "Super-Turing computational power" (see invited lecture at the Workshop on Hyper-computation, 2000). An active group of researchers has since emerged in the field of hyper-computation, comprising researchers from many areas of science and philosophy. (See also the invited lecture on analog asynchronous networks at Unconventional Computation, Santa Fe, 2007.)

The first super-Turing computation model was obtained by changing the weights in the analog recurrent neural networks from rational numbers to reals (see Analog Neural Networks: Computational Power). To explain how the choice of number set causes such a discrepancy in computational power, I measured, with Balcázar and Gavaldà, the complexity, or information content, of the weights by a variant of resource-bounded Kolmogorov complexity. We revealed a proper hierarchy of nonuniform complexity classes associated with networks whose weights have increasing Kolmogorov complexity, of which the computational class of analog networks with rational weights (P) and the class with real weights (P/poly) are but the two extreme cases.
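The analog recurrent network in question updates each neuron through a saturated-linear activation; only the number set the weights are drawn from changes between the rational (Turing-equivalent) and real (P/poly) cases. A minimal sketch of one synchronous update step, with illustrative names and dimensions not taken from the papers:

```python
def sigma(x):
    """Saturated-linear activation: clips the affine combination to [0, 1]."""
    return 0.0 if x < 0 else (1.0 if x > 1 else x)

def arnn_step(x, u, A, B, c):
    """One synchronous update of an analog recurrent network:
        x_i(t+1) = sigma( sum_j A[i][j]*x_j(t) + sum_j B[i][j]*u_j(t) + c_i ).
    The entries of A, B, c may be rational (Turing power) or real (P/poly);
    the update rule itself is identical in both cases."""
    n = len(x)
    return [sigma(sum(A[i][j] * x[j] for j in range(n))
                  + sum(B[i][j] * u[j] for j in range(len(u)))
                  + c[i])
            for i in range(n)]
```

For example, a two-neuron network with one input line is advanced by calling `arnn_step(x, u, A, B, c)` repeatedly, feeding each output state back in as the next `x`.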

In another work, I proposed a stochastic analog model based on a network with simple rational weights (which by itself would be Turing-equivalent), augmented by a binary coin that outputs 1 with probability p and 0 with probability 1-p. Although p can be a real number, it is never accessed directly by any neuron: the coin is binary, so the other neurons receive only digital values from it. The computational power of this network lies between P and P/poly (technically, Pref-P/log), and thus is still hyper-computational. That is, the real value need not be explicit to yield the extra power; any process affected by a real number may give rise to non-recursive computation. This observation is relevant to biological neural networks, in which different neurons may have slightly different properties that arise in a similar way and could support such higher computational power.
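The coin described above can be sketched as a single stochastic gate; the function name is illustrative. The real parameter p governs the bias, but everything downstream observes only the bits the gate emits:

```python
import random

def coin(p):
    """Binary stochastic gate: emits 1 with probability p, else 0.
    p may be an arbitrary real in [0, 1], yet no neuron ever reads p itself;
    the rest of the (rational-weight) network sees only this digital output."""
    return 1 if random.random() < p else 0

def sample_bits(p, n):
    """Draw n coin flips -- the only channel through which p influences the network."""
    return [coin(p) for _ in range(n)]
```

The point mirrored here is that information about the real p leaks out only statistically, through the long-run frequency of 1s, which is what lets the real number affect the computation without ever being represented explicitly.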


H.T. Siegelmann, "Computation Beyond the Turing Limit," Science, 268(5210), April 1995: 545-548

J.L. Balcázar, R. Gavaldà, and H.T. Siegelmann, "Computational Power of Neural Networks: A Characterization in Terms of Kolmogorov Complexity," IEEE Transactions on Information Theory, 43(4), July 1997: 1175-1183

H.T. Siegelmann, "Stochastic Analog Networks and Computational Complexity," Journal of Complexity, 15(4), 1999: 451-475

H.T. Siegelmann, Neural Networks and Analog Computation: Beyond the Turing Limit, Birkhäuser, Boston, December 1998