Researchers design a butterfly-inspired multisensory neuromorphic platform for integration of visual and chemical cues

Animals integrate cues collected from multiple sensory organs to enrich their overall perceptual experience and thereby make better decisions in most aspects of life. However, despite the importance of multisensory integration in animals, the fields of artificial intelligence (AI) and neuromorphic computing have primarily focused on processing unisensory information. This lack of emphasis can be attributed to the absence of a miniaturized hardware platform capable of co-locating multiple sensing modalities and enabling in-sensor and near-sensor processing.

a) A simplified abstraction of visual and chemical stimuli from male butterflies and the visuo-chemical integration pathway in female butterflies. b) Butterfly-inspired neuromorphic hardware comprising a monolayer MoS2 memtransistor-based visual afferent neuron, a graphene-based chemoreceptor neuron, and MoS2 memtransistor-based neuro-mimetic mating circuits. Image credit: Advanced Materials

In their recent study, researchers at Penn State University addressed this limitation by utilizing the chemo-sensing properties of graphene and the photo-sensing capability of monolayer molybdenum disulfide (MoS2) to create a multisensory platform for visuochemical integration. 
Additionally, the in-memory-compute capability of MoS2 memtransistors is leveraged to develop neural circuits that facilitate multisensory decision-making. 

The visuochemical integration platform is inspired by the intricate courtship of Heliconius butterflies, in which females rely on the integration of visual cues (such as wing color) and chemical cues (such as pheromones) from the males for mate selection. The butterfly manages this with a tiny brain that uses minimal energy, in stark contrast to modern computing, which consumes a significant amount.

To mimic this behavior electronically, the researchers developed a hardware platform made of molybdenum disulfide (MoS2) and graphene. The MoS2 portion of the platform is a memtransistor, an electronic device that can both store memory and process information. The researchers chose MoS2 for its light-sensing capabilities, which mimic the visual capabilities of the butterfly. The graphene portion of the device is a chemitransistor that detects chemical molecules, mimicking the butterfly's pheromone detection.

The researchers tested their device by exposing the dual-material sensor to different colored lights, mimicking the visual cues, and applying solutions with varying chemical compositions, resembling the pheromones released by butterflies. The goal was to see how well the sensor could integrate information from both the photodetector and the chemisensor, similar to how a butterfly's mating success relies on matching wing color and pheromone strength.

By measuring the output response, the researchers determined that their devices could seamlessly integrate visual and chemical cues. This highlights the potential for their sensor to process and interpret diverse types of information simultaneously, they said.

"We also introduced adaptability in our sensor's circuits, such that one cue could play a more significant role than the other," said Yikai Zheng, a fourth-year doctoral student in engineering science and mechanics and co-author of the study. "This adaptability is akin to how a female butterfly adjusts her mating behavior in response to varying scenarios in the wild."
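The adaptability Zheng describes can be abstracted as an adjustable weighting of the two cue strengths before a threshold decision. The sketch below is purely illustrative (it is not the authors' circuit model); the function name, weights, and threshold are assumptions chosen for the example.

```python
def integrate(visual, chemical, w_visual=0.5, w_chemical=0.5, threshold=0.6):
    """Combine normalized visual and chemical cue strengths (0..1) into a
    single decision score; weights set how much each cue matters.
    Illustrative abstraction only, not the published hardware model."""
    score = w_visual * visual + w_chemical * chemical
    return score, score >= threshold

# Balanced weighting: both cues contribute equally.
score, accept = integrate(visual=0.9, chemical=0.8)

# Chemical-dominant weighting: pheromone strength outweighs wing color,
# mimicking one cue playing a more significant role than the other.
score2, accept2 = integrate(visual=0.3, chemical=0.9,
                            w_visual=0.2, w_chemical=0.8)
```

Shifting the weights changes which cue dominates the decision without changing the sensing front end, loosely analogous to the circuit-level adaptability described in the study.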

Dual sensing in a single device is also more energy efficient, the researchers said, than the way current AI systems operate: collecting data from different sensor modules and then shuttling it to a processing module, which causes delays and excessive energy consumption.

Next, the researchers said they plan to expand from integrating two senses into their device to three senses, mimicking how a crayfish uses visual, tactile, and chemical cues to sense prey and predators. The goal is to develop hardware AI devices capable of handling complex decision-making scenarios in diverse environments.

The butterfly-inspired visuochemical integration platform has significant implications in both robotics and the advancement of neuromorphic computing, going beyond unisensory intelligence and information processing.  

Posted: Apr 03, 2024 by Roni Peleg