Hierarchical Network Structure and Dynamics Motivated by Brains

We consider hybrid graph structures that combine regular lattices with random graphs. These hybrids capture a key biological insight: the cortex not only has a regular structure based on the physical neighborhood of neurons and cortical columns, but also long-range connections via neural fibers and long axons. Our team has extensive theoretical and practical experience with such graph structures and their dynamics, which directly benefits the execution of this task.
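A minimal sketch of such a hybrid graph, assuming nothing beyond the description above: a 2D lattice with 4-neighborhoods (local cortical connectivity) plus a number of randomly placed long-range shortcut edges (long axons). The lattice size and shortcut count here are illustrative choices, not parameters from the study.

```python
import random

def hybrid_graph(side=20, n_long_range=40, seed=0):
    """Adjacency sets for a 2D lattice plus random long-range shortcuts.
    side and n_long_range are illustrative, not from the source."""
    rng = random.Random(seed)
    nodes = [(x, y) for x in range(side) for y in range(side)]
    adj = {v: set() for v in nodes}
    for x, y in nodes:                       # regular lattice: 4-neighborhood
        for x2, y2 in ((x + 1, y), (x, y + 1)):
            if x2 < side and y2 < side:
                adj[(x, y)].add((x2, y2))
                adj[(x2, y2)].add((x, y))
    n_edges = sum(len(s) for s in adj.values()) // 2
    target = n_edges + n_long_range
    while n_edges < target:                  # sprinkle long-range shortcuts
        u, v = rng.sample(nodes, 2)
        if v not in adj[u]:
            adj[u].add(v)
            adj[v].add(u)
            n_edges += 1
    return adj

adj = hybrid_graph()
# 2*20*19 = 760 lattice edges + 40 long-range shortcuts = 800 edges total
print(len(adj), sum(len(s) for s in adj.values()) // 2)
```

Even a small fraction of shortcuts sharply reduces path lengths, which is the small-world property the hybrid structure is meant to exploit.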


Our results provide key insights into the dynamical memory properties of the percolation model, in which input patterns are encoded into attractors: limit cycles of specific lengths, or chaotic attractors (of practically infinite cycle length). The breakthrough aspects of these results for AI rest in the exponential memory capacity of attractors with very long cycles (chaos), and in the instant recall of stored patterns, without the lengthy search and convergence process required by more traditional, convergence-based (fixed-point) memory devices.
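The notion of a limit cycle of a specific length can be made concrete with a generic sketch (this is not the percolation model itself): iterate a deterministic update map from an input pattern, record every visited state, and report the period once a state repeats. The rule-90-like XOR map below is purely illustrative.

```python
import random

def cycle_length(step, state, max_iters=10000):
    """Iterate a deterministic map until a state repeats; return the period
    of the limit cycle reached (None if no repeat within max_iters, which
    in practice flags very long / chaotic-like cycles)."""
    seen = {}
    for t in range(max_iters):
        key = tuple(state)
        if key in seen:
            return t - seen[key]        # period of the attractor
        seen[key] = t
        state = step(state)
    return None

def step(state):
    """Toy update on a ring: each cell becomes the XOR of its two neighbors
    (rule-90-like dynamics; illustrative only)."""
    n = len(state)
    return [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]

random.seed(1)
pattern = [random.randint(0, 1) for _ in range(16)]
# on a ring of 16 cells this map is nilpotent, so every pattern
# settles into the all-zero fixed point: cycle length 1
print(cycle_length(step, pattern))
```

Replacing `step` with a chaotic map makes `cycle_length` return None for any feasible horizon, which is the "practically infinite cycle length" regime described above.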

For an illustration of the 2-dimensional lattice model, we refer to the video above. Here the excitatory layer has 100x100 vertices connected by lattice edges, without additional long-range connections.
Computer simulations provide evidence of chaotic behavior, with fractal boundary regions, for higher values of the firing reset parameter (m). Extensive studies yield a diagram with bifurcation and trifurcation properties, evidencing highly intermingled basins of limit-cycle and chaotic attractors.
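The kind of parameter scan behind such a diagram can be sketched with a hypothetical, heavily simplified excitable lattice (not the project's exact model): a cell fires when enough neighbors fired on the previous step, then stays refractory for m steps, and we record the period of the resulting state sequence for each m. Threshold, lattice size, and horizon are illustrative assumptions.

```python
import random

def run_lattice(m, side=8, steps=400, seed=0):
    """Hypothetical excitable-lattice sketch: a cell fires if >= 2 of its
    4 neighbors (torus wrap) fired last step, then is refractory for m
    steps ('firing reset'). Returns the period of the eventual cycle,
    or None if no repeat within the horizon."""
    rng = random.Random(seed)
    fired = [[rng.random() < 0.5 for _ in range(side)] for _ in range(side)]
    refractory = [[0] * side for _ in range(side)]
    seen = {}
    for t in range(steps):
        key = (tuple(map(tuple, fired)), tuple(map(tuple, refractory)))
        if key in seen:
            return t - seen[key]                 # cycle length at this m
        seen[key] = t
        nxt = [[False] * side for _ in range(side)]
        for x in range(side):
            for y in range(side):
                if refractory[x][y] > 0:         # resting cell counts down
                    refractory[x][y] -= 1
                    continue
                n = sum(fired[(x + dx) % side][(y + dy) % side]
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if n >= 2:
                    nxt[x][y] = True
                    refractory[x][y] = m
        fired = nxt
    return None

for m in range(1, 6):
    print(m, run_lattice(m))    # period per m; None flags very long cycles
```

Sweeping m (and the initial pattern) and plotting the observed periods is what produces the bifurcation-style diagram mentioned above.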
