
Research on In-sensor Computing was Published in Nature Nanotechnology

4 May 2023


Prof. Yang CHAI, a member of RI-IWEAR, collaborated with researchers at Yonsei University in Korea and published an article entitled “Optoelectronic graded neurons for bioinspired in-sensor motion perception” in Nature Nanotechnology.


Motion perception with conventional machine vision typically consumes abundant computational resources, which greatly restricts its deployment at edge terminals. Perceiving dynamic motion at sensory terminals demands hardware that can carry out visual processing more efficiently. Flying insects can agilely detect motion with a tiny visual system (~800 photoreceptors and ~10^5 neurons in the brain), which inspired the team to emulate these characteristics with hardware devices for in-sensor motion perception.


One reason for the agility of the flying insect's visual system is its graded neural structure, which exhibits a much higher information transmission rate (>1,000 bit/s) than spiking neurons (~300 bit/s) and allows spatiotemporal information to be fused at the sensory terminals. Prof. Chai's team adopted MoS2 phototransistors to emulate the non-spiking graded neurons of insect vision systems. The charge dynamics of the shallow trapping centres in the MoS2 phototransistors mimic the characteristics of graded neurons, exhibiting a multilevel response and volatile behaviour.
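For a concrete picture of what a graded, volatile response means, the sketch below is a toy Python model, not the published device physics: illumination charges a leaky state variable whose continuous level and gradual decay loosely mimic the shallow-trap charge dynamics described above, in contrast to the all-or-nothing output of a spiking neuron.

```python
import numpy as np

# Toy model (an assumption for illustration, not the paper's device model):
# a graded, volatile response built as a leaky integrator of light intensity.
# The output is a continuous multilevel signal that decays after the stimulus
# ends, loosely mimicking the charge dynamics of shallow trapping centres.

def graded_response(light, tau=5.0, dt=1.0):
    """Continuous, decaying response to a light-intensity sequence."""
    out = np.zeros(len(light))
    state = 0.0
    for i, x in enumerate(light):
        # charge builds up under illumination and leaks away with time constant tau
        state += dt * (x - state / tau)
        out[i] = state
    return out

# A brief light pulse: the response rises to an intensity-dependent (multilevel)
# value and then fades (volatile), instead of emitting binary spikes.
stimulus = np.concatenate([np.full(10, 0.8), np.zeros(30)])
print(np.round(graded_response(stimulus), 2))
```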


The optoelectronic graded neuron array can directly perceive different types of motion. The bioinspired sensor array can detect trajectories in the visual field with very economical hardware, allowing efficient perception of the direction of moving objects. By modulating the charge dynamics of the shallow trapping centres in the MoS2 phototransistors, the bioinspired sensor array can recognize motion with a temporal resolution ranging from 10^1 to 10^6 ms.
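A hedged illustration of the idea, again as a toy Python model rather than the paper's method: if each pixel retains a decaying memory of past illumination, a moving spot leaves a trace whose brightness gradient points along the direction of motion, so direction can be read from a single in-sensor snapshot; adjusting the decay constant corresponds, loosely, to tuning the temporal resolution.

```python
import numpy as np

# Toy model (an assumption, not the published algorithm): every pixel keeps a
# volatile, decaying memory of past illumination. A bright spot sweeping
# left-to-right leaves a trace that is brightest at its latest position and
# faded at its earliest one, so one snapshot encodes the motion direction.

n_pixels, decay = 8, 0.7           # smaller decay ~ shorter retention,
trace = np.zeros(n_pixels)         # i.e. a finer effective temporal resolution

for t in range(n_pixels):
    frame = np.zeros(n_pixels)
    frame[t] = 1.0                 # spot moves one pixel per frame
    trace = decay * trace + frame  # volatile accumulation at each pixel

direction = "left-to-right" if trace[-1] > trace[0] else "right-to-left"
print(np.round(trace, 3), "->", direction)
```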


The bioinspired sensors have potential applications in robotics and artificial intelligence. For example, they could be used to develop more efficient robots with better motion perception abilities by detecting and responding to moving objects in the surrounding environment.

