Image analytics using AI and neuromorphic computing - overview
Cameras and sensors create a continuous stream of data, which in many cases needs to be analysed directly at the place of origin. IBM scientists are using brain-inspired processors and Artificial Intelligence to interpret image data. That data can come from video cameras, image sensors, or hyperspectral imaging cameras, which capture parts of the electromagnetic spectrum invisible to the human eye.
IBM’s brain-inspired TrueNorth processor is uniquely suited to running deep neural networks, yet consumes less than 70 mW at full chip utilisation. This makes it ideal for real-time data analysis at the edge of the Internet of Things. For example, in the future, a drone equipped with this kind of chip and software could survey crops and raise an alert in case of a bushfire or hailstorm.
Scientists at IBM Research – Australia have combined IBM’s neuromorphic TrueNorth chip with deep-learning algorithms and the NAO robot to classify live images into 10 categories, fully autonomously and in real time. The TrueNorth chip significantly speeds up image processing, and NAO can then communicate the classification through speech.
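To make the pipeline concrete, here is a minimal sketch of a 10-category image classifier in Python. This is not IBM's actual TrueNorth software: on TrueNorth a trained network is converted to a spiking form and run on neuromorphic hardware, whereas this toy uses a single dense layer with random weights standing in for a trained model. The class names and image size are assumptions for illustration only.

```python
# Toy sketch (assumed names and sizes; not the real TrueNorth pipeline):
# map one camera frame to one of 10 class labels, which a robot such as
# NAO could then announce through speech.
import numpy as np

# CIFAR-10-style label set, chosen here purely as an example.
CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
           "dog", "frog", "horse", "ship", "truck"]

rng = np.random.default_rng(0)
W = rng.standard_normal((10, 32 * 32))  # random weights stand in for a trained model
b = np.zeros(10)

def classify(image: np.ndarray) -> str:
    """Flatten a 32x32 greyscale frame and return the top-scoring class label."""
    logits = W @ image.reshape(-1) + b
    # Softmax for a probability distribution over the 10 categories.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return CLASSES[int(np.argmax(probs))]

frame = rng.random((32, 32))   # stand-in for one live camera frame
label = classify(frame)        # the robot would then speak this label
print(label)
```

In the real system the heavy lifting of the network's forward pass happens on the TrueNorth chip at milliwatt power levels; only the resulting label is handed to the robot for speech output.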