Researchers fuse novel devices with biological inspiration for future AI systems


Image (iStock/@MF3d): An abstract depiction of a human brain in blue lines and yellow dots against a black and red background. Penn State computer scientists are exploring ways to develop more intelligent and efficient artificial intelligence systems by mimicking the human nervous system through an approach called spiking neural networks.

To develop more intelligent and efficient artificial intelligence (AI) systems, computer scientists turn to neuromorphic computing, a field that mimics the human nervous system to create efficient, intelligent computing systems. The field is still nascent, however, and the sought-after power efficiencies have yet to be achieved.

Now, thanks to a three-year, $1 million grant from the National Science Foundation, Penn State computer scientists are exploring ways to achieve these power efficiencies through a specific approach called spiking neural networks (SNN). The research is led by principal investigator Chita Das, department head and distinguished professor of computer science and engineering, and co-PIs Vijaykrishnan Narayanan, A. Robert Noll Chair of Computer Science and Engineering and Electrical Engineering, and Abhronil Sengupta, assistant professor of electrical engineering.

A subfield of neuromorphic computing, the SNN approach uses biologically inspired, event-driven, spike-based computation and communication – meaning the system operates only when needed. One of the paradigm's distinguishing features is the integration of time into its algorithms and models. Penn State scientists are exploring novel magnetic device structures that directly mimic these temporal, non-linear characteristics in hardware; scalable architectures and interconnection fabrics for those devices; and hybrid algorithm designs that leverage the benefits of both SNN models and traditional non-spiking deep learning models.
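To make the event-driven, time-integrated nature of spiking computation concrete, the sketch below simulates a single leaky integrate-and-fire neuron, one of the most common neuron abstractions in SNN research. It is an illustrative example only – the neuron model and parameter values are generic assumptions for demonstration, not details of the Penn State designs.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative sketch of
# event-driven, time-integrated spiking computation (not the project's model).

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate input current over time; emit a spike (1) whenever the
    membrane potential crosses threshold, then reset."""
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Leaky integration: the potential decays toward rest while
        # accumulating the incoming current.
        v += (-(v - v_rest) + i_t) * (dt / tau)
        if v >= v_thresh:
            spikes.append(1)   # event: the neuron fires
            v = v_reset        # reset after the spike
        else:
            spikes.append(0)   # no event, no output activity
    return np.array(spikes)

# A constant drive produces sparse, periodic spikes rather than dense activity
# at every step – the sparsity that event-driven hardware can exploit for power savings.
spike_train = simulate_lif(np.full(200, 1.5))
print("spikes emitted:", spike_train.sum(), "out of", spike_train.size, "time steps")
```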

“This research aims to build a brain-inspired computational paradigm to achieve not just the intelligence but also the power-efficiency of biological systems,” said Sonali Singh, a Penn State doctoral candidate in computer science and engineering. “Our aim is to propose innovations across the computing stack that involve novel hardware devices and circuits, parallel architectures and scalable interconnects as well as powerful algorithms that exploit the temporal and event-driven nature of information processing in biological neurons and synapses.”

If the project succeeds, the resulting power efficiency and brain-inspired intelligence could be applied broadly to improve AI and computing systems.

“Besides being capable of performing machine-learning related tasks such as automatic image, video and speech recognition, spiking neural networks also present a great opportunity for unsupervised, local on-chip learning along with being a natural fit for the next generation of low-power, low-latency event-driven sensors,” Singh said. “When deployed on neuromorphic platforms, they can serve as intelligent systems capable of operating in power-constrained scenarios such as wearables, smart cameras and other edge devices.”
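The “unsupervised, local on-chip learning” Singh describes is often realized in the SNN literature with spike-timing-dependent plasticity (STDP), in which each synapse adjusts its weight based only on the relative timing of the spikes it sees, with no global error signal. The sketch below shows a generic pair-based STDP update; it is an assumed, illustrative rule, not necessarily the one used in this project.

```python
import numpy as np

# Generic pair-based STDP update – a common local learning rule in SNN research,
# shown here for illustration only (parameters are arbitrary assumptions).

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Adjust one synaptic weight from the relative timing of a pre-synaptic
    and post-synaptic spike pair. The rule is purely local, which is why it
    maps naturally onto on-chip learning."""
    dt = t_post - t_pre
    if dt > 0:
        # Pre fires before post: strengthen the synapse (potentiation).
        dw = a_plus * np.exp(-dt / tau)
    else:
        # Post fires before (or with) pre: weaken the synapse (depression).
        dw = -a_minus * np.exp(dt / tau)
    return float(np.clip(w + dw, w_min, w_max))

# A causal pairing (pre at 10 ms, post at 15 ms) nudges the weight up;
# the reverse ordering nudges it down.
print(stdp_update(0.5, t_pre=10.0, t_post=15.0))  # slightly above 0.5
print(stdp_update(0.5, t_pre=15.0, t_post=10.0))  # slightly below 0.5
```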

Sengupta agreed, saying that the holistic research agenda and the potential applications of the research are some of the most exciting aspects of the project for him.

“The interdisciplinary end-to-end co-design aspect of the project, from fundamental material and device explorations to architecture and algorithm design, has the potential for transformative improvements in the efficiency of AI platforms,” he said.

Recent research results related to this work can be viewed here.
