Researcher’s new sensor system for Internet of Things devices integrates sensing and computing to save power and protect data
It’s been more than a decade since Gartner Research identified the Internet of Things – physical objects with sensors, processing capability, and software that connect and exchange data over the Internet and other communications networks – as an emerging technology.
Nowadays, connected devices are indispensable to commercial industries, healthcare and consumer products. Data analytics firm Statista predicts a near tripling of the number of connected devices worldwide, from 9.7 billion in 2020 to more than 29 billion in 2030.
The sensors built into these devices are largely passive, transmitting signals to networked computers, which process them and send meaningful data back to the device. Kyusang Lee, an assistant professor of materials science and engineering and of electrical and computer engineering at the University of Virginia’s School of Engineering and Applied Science, is working on a way to make sensors smart.
His smart sensor will sit at the edge of a device, which itself sits at the edge of a wireless network. The smart sensor system also stores and processes data — an emerging area of research he calls the artificial intelligence of things, a research thrust in the Charles L. Brown Department of Electrical and Computer Engineering.
“Given the exponential growth of the Internet of Things, we anticipate bottlenecks and delays in data processing and feedback signaling. Sensor output will be less reliable,” Lee said.
The constant pulse of data through wireless and computer networks also consumes power and increases the risk of exposing sensitive data to accidental or unauthorized disclosure and misuse.
Lee’s sensor system addresses both of these challenges. An added benefit is that the sensor can detect and process a wide variety of signals, mimicking human biology: image sensing for vision; ultrasonic detection for hearing; pressure and strain sensing for motion and touch; and chemical sensing to detect viruses.
Lee and members of his Thin Film Devices Lab co-authored a paper in Nature Communications that points toward this holistic sensing system. “In-sensor Image Memorization and Encoding via Optical Neurons for Bio-stimulus Domain Reduction Toward Visual Cognitive Processing” is featured on the publisher’s main page, earning recognition as one of the top 50 recently published papers in its field.
Lee’s sensor system is the culmination of five years of research in electrical and optical materials development and device fabrication, a research strength of the Department of Materials Science and Engineering. His fundamental research in epitaxy – the growth of crystalline material on a substrate coated with a 2D material – offers a new way to grow thin films.
Lee began this research as a postdoctoral associate in the Department of Mechanical Engineering at the Massachusetts Institute of Technology. Working with mechanical and electrical engineers and materials scientists at MIT, Lee developed a process for growing crystalline compound semiconductors that overcame the limitations imposed by lattice matching between two material systems.
Lee commercialized his process, serving as chief executive and chief technology officer of a Charlottesville, Va., startup, FSB. The company offers high-quality, large-scale, low-cost gallium nitride substrates for semiconductors commonly used in light-emitting diodes, and enables customers to develop high-quality single-crystal semiconductors on graphene.
Lee’s innovation in material synthesis, called remote epitaxy, enables the fabrication of a high-quality, self-contained semiconductor film, meaning that any given layer of material can be engineered with unique properties, independent of the layers of material between which it is sandwiched. This flexibility in layer stacking was a prerequisite to building a multifunctional sensor capable of simultaneously collecting and processing different types of signal inputs.
The optoelectronic component of the system integrates image sensing and computing. Lee won a prestigious National Science Foundation CAREER award for developing this intelligent image sensor system, which mimics the human eye. An artificial cornea and artificial iris achieve basic optics, aided by artificial muscles that allow movement and focus. An artificial retina records the image signal, pre-processing the image data.
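The idea of an artificial retina that pre-processes image data before it ever leaves the sensor can be illustrated with a toy sketch. This is not Lee’s actual pipeline — the Laplacian edge filter and threshold below are invented for illustration — but it shows the principle: the sensor transmits a sparse event map rather than the full raw frame, cutting the data that must cross the network.

```python
import numpy as np

def retina_preprocess(frame: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Toy stand-in for in-sensor pre-processing: a 3x3 Laplacian
    extracts edges, so only a sparse boolean event map (1 bit per
    pixel) leaves the sensor instead of the raw intensity frame."""
    kernel = np.array([[0,  1, 0],
                       [1, -4, 1],
                       [0,  1, 0]], dtype=float)
    h, w = frame.shape
    edges = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            edges[i, j] = np.sum(frame[i:i + 3, j:j + 3] * kernel)
    return np.abs(edges) > threshold

# A uniform frame produces no events; an abrupt intensity step does.
flat = np.ones((8, 8))
step = np.ones((8, 8)); step[:, 4:] = 0.0
print(retina_preprocess(flat).sum())  # 0 events for a featureless scene
print(retina_preprocess(step).sum())  # nonzero events along the edge
```

The point of the sketch is the data reduction: downstream stages receive only the salient features, which is what makes feedback fast enough for edge devices.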
A co-designed software and hardware artificial neural network completes the sensor system. An artificial synapse, called a memristor, moves preprocessed sensory inputs to the system’s brain, a neuromorphic chip that can perform high-level signal processing.
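The memristor’s role as an artificial synapse can be sketched as an analog crossbar: each device’s conductance stores a weight, and Ohm’s law per device plus Kirchhoff’s current law per column computes a matrix–vector product in place. The conductance and voltage values below are invented for illustration; this is a conceptual sketch, not the paper’s implementation.

```python
import numpy as np

# Crossbar of memristive synapses: G[i, j] is the programmed conductance
# (the stored weight) linking input row j to output neuron i.
G = np.array([[1.0, 0.2, 0.0],
              [0.0, 0.8, 0.5]])   # invented values

def crossbar_forward(voltages: np.ndarray) -> np.ndarray:
    """I = G @ V: Ohm's law at each device and current summation along
    each column perform a neural-network layer's multiply-accumulate in
    one analog step, without shuttling data to a separate processor."""
    return G @ voltages  # output currents, one per neuron

v = np.array([0.1, 0.3, 0.2])    # pre-processed sensory inputs as voltages
print(crossbar_forward(v))       # currents fed to the neuromorphic chip
```

Because the weights live in the same physical devices that do the arithmetic, this kind of in-memory computing is what lets the sensor process signals locally instead of streaming raw data over the network.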
“It’s very satisfying to publish this systems integration work,” Lee said. “We are now able to tell a complete story, from materials to integration to application, and present a vision of biomimetic sensor systems. I think our sensor will be particularly useful in robotics, which relies on combined sensory inputs and real-time embedded processing.”