This article was written for ACM News
The human brain processes information with remarkable energy efficiency: it consumes a mere 20 watts. Computers that mimic the brain's neural networks via deep learning have given rise to impressive applications in recent years, but they consume far more energy than the human brain.
Thanks to an algorithmic breakthrough in training spiking neural networks (SNNs), many applications of artificial intelligence, such as speech recognition, gesture recognition and the classification of electrocardiograms (ECGs), can become a hundred to a thousand times more energy-efficient. This means it will be possible to put much more artificial intelligence (AI) into chips, allowing applications to run on a smartwatch or a smartphone, for example, where until now they had to run in the cloud.
Moreover, by running AI on a local device, the applications become more robust and privacy friendly. More robust, because there is no longer a need for a network connection to the cloud. And more privacy-friendly, because data can remain local.
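The energy savings of SNNs come from their event-driven nature: a neuron stays silent most of the time and only communicates when its internal membrane potential crosses a threshold. As a rough illustration of this principle (a generic textbook leaky integrate-and-fire model, not the algorithm from the paper), such a neuron can be sketched in a few lines of Python:

```python
def lif_neuron(input_current, threshold=1.0, decay=0.9):
    """Simulate a leaky integrate-and-fire neuron over discrete time steps.

    The membrane potential leaks toward zero each step, integrates the
    incoming current, and emits a spike (1) when it crosses the threshold,
    after which it resets to zero. Between spikes the neuron is silent (0),
    which is what makes spiking networks so cheap to run in hardware.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = decay * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A constant input of 0.4 per step drives the neuron to spike periodically.
print(lif_neuron([0.4] * 10))  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Because the output is a sparse stream of binary spikes rather than a dense vector of floating-point activations, energy is spent only when something happens, much as in the biological brain.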
The breakthrough was achieved by a research team from Centrum Wiskunde & Informatica (CWI), the Dutch national research center for mathematics and computer science, together with the IMEC/Holst Research Centre in Eindhoven, also in the Netherlands. It was published this July in a peer-reviewed paper at the International Conference on Neuromorphic Systems (https://dl.acm.org/doi/10.1145/3407197.3407225). The algorithm is available as open source (https://github.com/byin-cwi/SRNN-ICONs2020).
The team is led by Sander Bohté, a CWI researcher and professor of cognitive neurobiology at the University of Amsterdam (UvA). ACM News talked to Bohté about the research and its applications.
Read the rest of the article on the website of ACM News.