Brain-Inspired Computing Research - Overview
The Brain-Inspired Computing Research Team of IBM Research - Australia builds Cognitive Sensing solutions by merging exploratory Deep-Learning research with applied research on our neuromorphic TrueNorth chip.
In recent years, machine learning methods have improved tremendously thanks to the development of Deep Learning, surpassing human performance in tasks such as image and speech recognition. Meanwhile, IBM has introduced a novel technology called SyNAPSE, based on a brain-inspired chip called TrueNorth. At the size of a postage stamp, it contains 5.4 billion transistors and performs a fundamentally new type of computing that replaces conventional processing steps with brain-inspired operations. This allows TrueNorth to run at ultra-low power (less than a dim lightbulb at full chip utilisation), orders of magnitude more energy-efficient than conventional processor architectures. In 2015 IBM Research - Australia brought the TrueNorth chip to Asia/Pacific and established a TrueNorth hardware and software research ecosystem.
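To give a flavour of what "brain-inspired operations" means in practice, the sketch below implements a minimal leaky integrate-and-fire neuron, the kind of spiking, event-driven unit that neuromorphic chips build on. This is an illustration only; the function name, parameters, and dynamics are our own simplification, not TrueNorth's actual programming model.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: an illustration of
# spike-based, event-driven computation. Names and parameters are
# illustrative, not TrueNorth's real interface.

def lif_run(input_current, leak=0.9, threshold=1.0, v0=0.0):
    """Integrate input each time step, leak the membrane potential,
    emit a spike on threshold crossing, then reset."""
    v = v0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration of input
        if v >= threshold:        # threshold crossing -> spike
            spikes.append(1)
            v = 0.0               # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.6, 0.6, 0.6, 0.0, 0.6, 0.6]))  # → [0, 1, 0, 0, 1, 0]
```

Note that computation happens only when spikes occur, which is the source of the power savings: silent neurons cost almost nothing.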
We are merging Deep Learning and TrueNorth into a new technology which we call Cognitive Sensing. Cognitive Sensing enables low-power, high-accuracy predictive analytics of unstructured data in real time at the point of sensing. It therefore forms the core of a novel cognitive data analytics platform with applications in remote sensors, autonomous systems and wearable devices.
From Wearables to Thinkables
Wearables measure biometric parameters through systems attached to the human body and either store the collected data on the device or send it to the cloud for offline analysis. As the wearable revolution unfolds, a rapidly increasing number of parameters can be monitored simultaneously, making data storage and transmission unfeasible. Moreover, as measurement data becomes more complex and diverse, cognitive methods such as deep-learning-based pattern and feature recognition will replace conventional analytical schemes.

Thus, technologies for continuously correlating, contextualising and filtering data in real time at the point of sensing are needed to empower artificial intelligence systems to interact with the wearer instantly and proactively, creating novel ways of support, guidance and intervention. The architecture of such autonomously operating, always-on cognitive sensors will be minimum-footprint biosensors feeding into low-power deep-learning pipelines with a closed-loop interface back to the wearer. The TrueNorth chip constitutes the means to transform a wearable into a thinkable.

Linking advances in biosensing, brain-inspired computing and deep learning, we expect the first thinkables to emerge in the field of applied neuroscience: monitoring and interpreting brain activity, diagnostics and predictive prevention in epilepsy and mental illness, deep-brain stimulation, brain-machine interfacing, and neurobionics.
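The always-on cognitive sensor described above (biosensor, on-device inference, closed-loop feedback to the wearer) can be sketched as a simple event loop. Everything here is hypothetical: the classifier is a stand-in for an on-device deep-learning model, and no real device API is implied.

```python
# Hypothetical sketch of the closed-loop cognitive-sensor architecture:
# sample -> classify on-device -> feed back to the wearer, instead of
# storing or transmitting raw data. All names are illustrative.

def window_classifier(window, threshold=0.5):
    """Stand-in for an on-device deep-learning model: flags a window
    whose mean amplitude exceeds a threshold."""
    return "alert" if sum(window) / len(window) > threshold else "normal"

def closed_loop(samples, window_size=4):
    """Process a biosensor sample stream in fixed-size windows and
    emit feedback events rather than raw measurements."""
    feedback = []
    for start in range(0, len(samples) - window_size + 1, window_size):
        window = samples[start:start + window_size]
        feedback.append(window_classifier(window))
    return feedback

print(closed_loop([0.1, 0.2, 0.1, 0.2, 0.9, 0.8, 0.7, 0.9]))
# → ['normal', 'alert']
```

The design point the sketch makes is that only compact feedback events leave the sensing loop, which is what makes continuous, always-on operation feasible under tight power and bandwidth budgets.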