Cat Brain Back in Memristor vs. Synaptronic AI Showdown

A lot of the discussion around the latest memristor platform attempting a “cat brain” artificial intelligence (read: machine learning, pattern recognition, synaptic modeling) is full of wonder and hype [ex: bb], but the algorithmic synaptic simulator the memristor cat-brain work is based on actually goes back quite a ways. Last November there was a big flap when researchers at IBM used the cat brain model as part of a DARPA-funded synaptic brain research project, called “Systems of Neuromorphic Adaptive Plastic Scalable Electronics” (SyNAPSE), which stirred up some controversy:

Scientists at IBM Research – Almaden, in collaboration with colleagues from Lawrence Berkeley National Lab, have performed the first near real-time cortical simulation of the brain that exceeds the scale of a cat cortex and contains 1 billion spiking neurons and 10 trillion individual learning synapses. [DARPA cat brain, IBM, 2009]

Primarily, the controversy is over the difference between modeling and implementing. That is why the recent University of Michigan paper is interesting, even if its implementation is at a small scale:

So far, Lu has connected two electronic circuits with one memristor. He has demonstrated that this system is capable of a memory and learning process called “spike timing dependent plasticity.” This type of plasticity refers to the ability of connections between neurons to become stronger based on when they are stimulated in relation to each other. Spike timing dependent plasticity is thought to be the basis for memory and learning in mammalian brains. [DARPA cat brain, University of Michigan, 2010]

These are both part of the DARPA programs, but the interesting point is the competition between the different platforms: implementing versus modeling the underlying hardware architectures that the cat-brain algorithms run on.
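The Michigan quote above describes spike-timing-dependent plasticity in words. A minimal sketch of how such a rule is typically written down (the exponential pair-based form and all parameter values here are illustrative assumptions, not taken from Lu's paper):

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP: weight change from one pre/post spike pairing.

    dt = t_post - t_pre (in ms). If the presynaptic neuron fires just
    before the postsynaptic one (dt > 0), the connection strengthens;
    if it fires just after (dt < 0), the connection weakens. The
    exponential shape and parameters are common textbook choices.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)   # potentiation
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)  # depression
    return 0.0

# Causal pairing (pre at 10 ms, post at 15 ms) strengthens the synapse;
# the reversed order weakens it.
print(stdp_delta_w(10.0, 15.0))  # positive
print(stdp_delta_w(15.0, 10.0))  # negative
```

The appeal of the memristor here is that its conductance naturally changes with the timing and polarity of applied voltage pulses, so a rule like this can fall out of device physics rather than being computed in software.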

“The goal of the SyNAPSE program is to create new electronics hardware and architecture that can understand, adapt and respond to an informative environment in ways that extend traditional computation to include fundamentally different capabilities found in biological brains,” said DARPA program manager Todd Hylton, Ph.D. [Link]

It's hard to beat the fact that the earlier IBM simulation used a massive amount of simulator power. The research has some fascinating points to make on its own, running the “cortical simulator” on Lawrence Livermore National Lab's Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of memory:

“Learning from the brain is an attractive way to overcome power and density challenges faced in computing today,” said Josephine Cheng, IBM Fellow and lab director of IBM Research – Almaden. “As the digital and physical worlds continue to merge and computing becomes more embedded in the fabric of our daily lives, it’s imperative that we create a more intelligent computing system that can help us make sense of the vast amount of information that’s increasingly available to us, much the way our brains can quickly interpret and act on complex tasks.”

To perform the first near real-time cortical simulation of the brain that exceeds the scale of the cat cortex, the team built a cortical simulator that incorporates a number of innovations in computation, memory, and communication as well as sophisticated biological details from neurophysiology and neuroanatomy. This scientific tool, akin to a linear accelerator or an electron microscope, is a critical instrument used to test hypotheses of brain structure, dynamics and function. The simulation was performed using the cortical simulator on Lawrence Livermore National Lab’s Dawn Blue Gene/P supercomputer with 147,456 CPUs and 144 terabytes of main memory.

[…edit…] After the successful completion of Phase 0, IBM and its university partners were recently awarded $16.1M in additional funding from the Defense Advanced Research Projects Agency (DARPA) for Phase 1 of DARPA’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative. This phase of research will focus on the components, brain-like architecture and simulations to build a prototype chip. The long-term mission of IBM’s cognitive computing initiative is to discover and demonstrate the algorithms of the brain and deliver low-power, compact cognitive computers that approach mammalian-scale intelligence and use significantly less energy than today’s computing systems. [IBM: Nov 18, 2009]
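The scale figures in the release invite a quick back-of-envelope check. A sketch using the numbers quoted above (whether IBM means binary or decimal terabytes is my assumption; with decimal terabytes the per-synapse figure drops to about 14.4 bytes):

```python
# Figures from the IBM press release quoted above.
neurons = 1_000_000_000          # 1 billion spiking neurons
synapses = 10_000_000_000_000    # 10 trillion learning synapses
memory_bytes = 144 * 1024**4     # 144 TB of main memory (binary TB assumed)
cpus = 147_456

print(synapses / neurons)        # fan-out: synapses per neuron
print(memory_bytes / synapses)   # memory budget per synapse, in bytes
print(neurons / cpus)            # neurons simulated per CPU
```

Roughly 10,000 synapses per neuron, about 15 bytes of state per synapse, and on the order of 7,000 neurons per CPU, which is a hint of why a direct hardware implementation (whether memristor or SyNAPSE chip) is so attractive compared with simulation.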
