Autonomous image analytics using AI and neuromorphic computing - overview
In Industry 4.0, cameras and sensors create a continuous stream of data, which in many cases needs to be analysed directly at the place of origin. IBM scientists are using the neuromorphic TrueNorth chip and developing learning software to interpret image data. That data can come from video cameras, image sensors, or solid-state imaging cameras that capture parts of the electromagnetic spectrum invisible to the human eye.
The brain-inspired TrueNorth chip is made of silicon and features 1 million neurons and 256 million synapses, yet it consumes less than 70 mW, which makes it ideal for real-time data analysis at the edge of the Internet of Things. In the future, for example, a drone equipped with this kind of chip and software could monitor crops and issue an alarm in the event of a bushfire or hailstorm.
Scientists at the IBM Research – Australia lab have combined the TrueNorth chip with deep-learning algorithms and the NAO robot to classify live images into 10 categories, such as "horse" or "car". The TrueNorth chip significantly accelerates the image processing, after which NAO can communicate the classification result through speech.
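As a rough illustration of the classification step described above, the sketch below shows a minimal 10-category image classifier in plain Python/NumPy. This is not IBM's actual TrueNorth software: the category list (beyond "horse" and "car", which the article mentions), the single linear layer standing in for the deep network, and the random input frame are all illustrative assumptions.

```python
import numpy as np

# Hypothetical 10-category label set; "horse" and "car" come from the
# article, the remaining labels are illustrative placeholders.
CATEGORIES = ["airplane", "car", "bird", "cat", "deer",
              "dog", "frog", "horse", "ship", "truck"]

rng = np.random.default_rng(0)


def softmax(z):
    """Turn raw scores into class probabilities."""
    z = z - z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()


class ToyClassifier:
    """A single linear layer + softmax, standing in for the deep network."""

    def __init__(self, n_pixels, n_classes):
        # Random, untrained weights -- enough to demonstrate the pipeline.
        self.W = rng.normal(scale=0.01, size=(n_pixels, n_classes))
        self.b = np.zeros(n_classes)

    def predict(self, image):
        probs = softmax(image.flatten() @ self.W + self.b)
        return CATEGORIES[int(np.argmax(probs))]


# A stand-in "live frame": a 32x32 RGB image of random values.
model = ToyClassifier(32 * 32 * 3, len(CATEGORIES))
frame = rng.random((32, 32, 3))
label = model.predict(frame)

# In the demo described above, the NAO robot would speak this result.
print(f"I think this is a {label}.")
```

In the real system, the heavy convolutional layers run on TrueNorth's spiking-neuron fabric rather than as dense matrix multiplies, which is where the acceleration and the sub-70 mW power budget come from.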