The following blog item was authored by Steve Hamm of IBM
A SyNAPSE chip enables a revolutionary new technology design inspired by the human brain. IBM built the new chip with a brain-inspired computer architecture powered by an unprecedented 1 million neurons and 256 million synapses. It is the largest chip IBM has ever built at 5.4 billion transistors, and has an on-chip network of 4,096 neurosynaptic cores. Yet, it only consumes 70 mW during real-time operation — orders of magnitude less energy than traditional chips. As part of a complete cognitive hardware and software ecosystem, this technology opens new computing frontiers for distributed sensor and supercomputing applications.
The idea of making machines modeled on the human brain has thrilled and confounded scientists since the earliest days of computing in the 1940s. The brain is a remarkable organ. Thanks to this spongy mass the size of a grapefruit, which uses just 20 watts of power, we humans understand complex concepts, navigate the physical world, and create marvelous things—from spacecraft to sonnets.
Not surprisingly, imitating the brain has proven to be incredibly difficult. Conventional computers don’t even try. They use linear logic and hard-wired circuitry to calculate, send messages, analyze data and organize knowledge, consuming enormous amounts of power while failing to match the brain’s protean capabilities.
The SyNAPSE team at IBM Research, funded by the U.S. Defense Advanced Research Projects Agency and aided by scientists from several universities, has demonstrated a powerful yet energy-efficient neuromorphic chip that has the potential to help fulfill the dreams of the computer industry’s pioneers. “I hope this will inspire completely different thinking about what computing can do,” says Dharmendra S. Modha, IBM Fellow and principal investigator of the SyNAPSE Project.
The team’s sensory microprocessor, called TrueNorth, addresses a wide range of sense-based uses, including vision, hearing and multi-sensory scenarios. The chip is designed to be especially effective in situations where computing is constrained by power and volume. Think robots, sensor networks and public safety monitoring applications. And think about your smartphone being used as a mobile sensing device.
In addition, it can be used in combination with other cognitive computing technologies such as IBM Watson to create systems that learn, reason and help humans make better decisions.
Each TrueNorth chip contains one million programmable neurons (16 times as many as the largest previously existing neuromorphic chip) and 256 million synapses. With 5.4 billion transistors, it’s one of the largest digital integrated circuits ever produced.
By tiling dozens or hundreds of TrueNorth chips on circuit boards, designers can rapidly scale to supercomputing-class capabilities. Imagine being able to analyze all the data being received from outer space via every radio telescope in the world—and in real time.
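The headline figures above decompose neatly: the chip's 4,096 cores each hold 256 neurons, and each core connects them through a 256-by-256 crossbar of synapses (the crossbar layout is how the published totals are usually explained; treat it as background detail, and the 100-chip board below is purely a hypothetical example). A quick sketch of the arithmetic:

```python
CORES = 4096
NEURONS_PER_CORE = 256
SYNAPSES_PER_CORE = 256 * 256          # one 256x256 crossbar per core

neurons_per_chip = CORES * NEURONS_PER_CORE    # 1,048,576: the "1 million" figure
synapses_per_chip = CORES * SYNAPSES_PER_CORE  # 268,435,456 = 256 * 2**20,
                                               # the "256 million" figure
POWER_W = 0.070                                # 70 milliwatts per chip

chips = 100                                    # hypothetical tiled board
print(f"{chips} chips: {chips * neurons_per_chip:,} neurons, "
      f"{chips * synapses_per_chip:,} synapses, "
      f"{chips * POWER_W:.1f} W total")
```

Note that even the imagined 100-chip board draws only about 7 watts, which is the point of the tiling story: capability scales linearly while power stays far below that of conventional processors.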
Just as important as the sheer data-processing capability of the chip is its energy efficiency. An individual TrueNorth processor consumes just 70 milliwatts of electricity. Computers based on the chip will be many orders of magnitude more energy-efficient than conventional systems for sensory-based uses.
Here’s how the chip works:
Signals detected by sensing devices, such as video cameras, are routed to the on-chip network of neurons and synapses. The signals are converted to spikes of electricity, which travel along wires corresponding to the axons in the brain and make contact with digital switches, which correspond to the brain’s neurons. Depending on the strength and configuration of its incoming connections, a neuron fires in turn, fanning out to other neurons via virtual axons and synapses—the connecting points. In this way, the neural network makes sense of patterns of signals and learns by identifying repeating patterns.
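The spike-and-fan-out loop described above can be illustrated with a generic leaky integrate-and-fire simulation. This is not the actual TrueNorth neuron model (a configurable digital circuit); the network size, weights, threshold and leak below are arbitrary values chosen only to show the mechanism: spikes travel along axons, accumulate at downstream neurons, and trigger new spikes when a threshold is crossed.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8                                    # toy network of 8 digital "neurons"
weights = rng.uniform(0.0, 0.5, (N, N))  # synapse strengths; row i is neuron i's axon fan-out
np.fill_diagonal(weights, 0.0)           # no self-connections
threshold = 1.0                          # potential at which a neuron fires
leak = 0.9                               # fraction of potential kept each tick

potential = np.zeros(N)                  # membrane potential per neuron
spikes = np.zeros(N)                     # which neurons fired last tick
total_spikes = 0

for t in range(20):                      # discrete time steps ("ticks")
    external = np.zeros(N)
    external[0] = 1.0                    # a sensor keeps driving neuron 0's axon
    # each spike travels its axon (a row of `weights`) to downstream neurons
    potential = leak * potential + (spikes + external) @ weights
    spikes = (potential >= threshold).astype(float)
    potential[spikes == 1.0] = 0.0       # firing resets the neuron
    total_spikes += int(spikes.sum())

print(f"spikes emitted over 20 ticks: {total_spikes}")
```

The recurring input from "neuron 0" stands in for a sensor stream; repeated stimulation drives downstream neurons over threshold, producing cascading spikes, which is the pattern-propagation behavior the paragraph describes.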
The chip is manufactured for IBM by Samsung Electronics using its state-of-the-art 28-nanometer CMOS technology—today’s mainstream method for fabricating chips.
The SyNAPSE team developed an ecosystem of software to make the TrueNorth technology useful. That includes a simulator, called Compass, and a programming environment, called Corelet. The programming environment enables application developers to create reusable software modules for capabilities such as real-time video analysis, signal processing and object detection, classification and recognition.
We can envision plenty of uses for TrueNorth. How about smart glasses for people who can’t see? The glasses would analyze live video to help the wearer navigate obstructions or even identify people—whispering critical information through an ear bud. Or think about a portable health diagnosis device.
How about using a TrueNorth system in the hospital operating room? During exploratory surgery, doctors could perform real-time analysis of human tissue samples—reducing the need for tissue removal or for additional surgeries.
Automakers could use the chip to help pilot the driverless cars of the future. Today’s experimental vehicles don’t really understand the road like humans do. They have to be programmed to respond to every situation. A system based on our technology would be able to learn on its own and to react expertly to surprises.
You could build a very powerful supercomputer using thousands of these chips.
And, I bet, you’ll find TrueNorth in smartphones. You can imagine a face-recognition system for quick and easy authentication. Or a system for sniffing out pollution, gas leaks or other toxic agents in the air.
When the SyNAPSE team started on this journey eight years ago, conventional wisdom said it couldn’t be done. Well, Dharmendra and the team did it. They designed the chip and the technology ecosystem that surrounds it.
This is the fulfillment of the science community’s long-held dream. It’s also just a first step into brain-like technology. But we truly are on the path to a new era of computing.
To learn more about this new era of computing, read:
Smart Machines: IBM’s Watson and the Era of Cognitive Computing
Columbia University Press
John E. Kelly III and Steve Hamm
Cloth, 160 pages, ISBN: 978-0-231-16856-4 $22.95 / £15.95
In Smart Machines, John E. Kelly III, director of IBM Research, and Steve Hamm, a writer at IBM and a former business and technology journalist, introduce the fascinating world of “cognitive systems” to general audiences and provide a window into the future of computing. Cognitive systems promise to penetrate complexity and assist people and organizations in better decision making. They can help doctors evaluate and treat patients, augment the ways we see, anticipate major weather events, and contribute to smarter urban planning. Kelly and Hamm’s comprehensive perspective describes this technology inside and out and explains how it will help us conquer the harnessing and understanding of “big data,” one of the major computing challenges facing businesses and governments in the coming decades. Absorbing and impassioned, their book will inspire governments, academics, and the global tech industry to work together to power this exciting wave in innovation.