News

Tiny switches give lidar record resolution

Approach for a new generation of powerful, low-cost 3D sensors

28.03.2022 - New detection system consists of a matrix of tiny optical antennas and MEMS switches that rapidly turn them on and off.

When Google unveiled its first autonomous cars in 2010, the spinning cylinder mounted on their roofs really stood out. It was the vehicle’s light detection and ranging (lidar) system, which worked like light-based radar. Together with cameras and radar, lidar mapped the environment to help these cars avoid obstacles and drive safely. Since then, inexpensive, chip-based cameras and radar systems have moved into the mainstream for collision avoidance and autonomous highway driving. Yet, lidar navigation systems remain unwieldy mechanical devices that cost thousands of dollars. That may be about to change, thanks to a new type of high-resolution lidar chip developed by Ming Wu from the Berkeley Sensor and Actuator Center at the University of California, Berkeley.

Wu’s lidar is based on a focal plane switch array (FPSA), a semiconductor-based matrix of antennas that gathers light like the sensors found in digital cameras. Its resolution of 16,384 pixels may not sound impressive when compared to the millions of pixels found on smartphone cameras, but it dwarfs the 512 pixels or less found on FPSAs until now, Wu said. Equally significant, the design is scalable to megapixel sizes using the same complementary metal-oxide-semiconductor (CMOS) technology used to produce computer processors, Wu said. This could lead to a new generation of powerful, low-cost 3D sensors for autonomous cars, as well as for drones, robots and even smartphones.

Lidar works by capturing reflections of light emitted by its laser. By measuring the time taken for light to return, or changes in beam frequency, lidar can map the environment and clock the speed of objects moving around it. Mechanical lidar systems have powerful lasers that visualize objects hundreds of yards away, even in the dark. They also generate 3D maps with high enough resolution for a vehicle’s artificial intelligence to distinguish between vehicles, bicycles, pedestrians and other hazards. Yet, putting these capabilities on a chip has stymied researchers for more than a decade. The most imposing barrier involves the laser.
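The time-of-flight principle described above reduces to a one-line calculation: the light travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the 667-nanosecond example value is illustrative, not from the article):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_tof(round_trip_seconds: float) -> float:
    """Distance to a target from the round-trip time of a laser pulse.

    The pulse covers the range twice (out and back), hence the division by 2.
    """
    return C * round_trip_seconds / 2.0


# A pulse echoing back after ~667 ns corresponds to a target ~100 m away.
d = distance_from_tof(667e-9)
```

Frequency-modulated (FMCW) lidar, also mentioned above, instead infers range and velocity from the frequency shift of the returned beam, but the time-of-flight version is the simplest to state.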

“We want to illuminate a very large area,” Wu said. “But if we try to do that, the light becomes too weak to reach a sufficient distance. So, as a design trade-off to maintain light intensity, we reduce the areas that we illuminate with our laser light.” That is where the FPSA comes in. It consists of a matrix of tiny optical transmitters, or antennas, and switches that rapidly turn them on and off. This way, it can channel all available laser power through a single antenna at a time.

Switching, however, poses problems. Almost all silicon-based lidar systems use thermo-optic switches, which rely on large changes in temperature to produce small changes in the refractive index and bend and redirect laser light from one waveguide to another. Thermo-optic switches, however, are both large and power-hungry. Jam too many onto a chip, and they will generate too much heat to operate properly. This is why existing FPSAs have been limited to 512 pixels or less.

Wu’s solution replaces them with microelectromechanical system (MEMS) switches that physically move the waveguides from one position to another. “The construction is very similar to a freeway exchange,” he said. “So, imagine you are a beam of light going from east to west. We can mechanically lower a ramp that will suddenly turn you 90 degrees, so that you are going from north to south.” MEMS switches are a known technology used to route light in communications networks. This is the first time they have been applied to lidar. Compared with thermo-optic switches, they are much smaller, use far less power, switch faster and have very low light losses.

They are the reason Wu can cram 16,384 pixels on a 1-centimeter-square chip. When the switch turns on a pixel, it emits a laser beam and captures the reflected light. Each pixel is equivalent to 0.6 degrees of the array’s 70-degree field of view. By cycling rapidly through the array, Wu’s FPSA builds up a 3D picture of the world around it. Mounting several of them in a circular configuration would produce a 360-degree view around a vehicle. Wu needs to increase FPSA resolution and range before his system is ready for commercialization. “While the optical antennas are hard to make smaller, the switches are still the largest components, and we think we can make them a lot smaller,” he said.
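The figures above can be checked with a little arithmetic. A sketch, assuming the 16,384 pixels form a square 128 × 128 grid (the article does not state the layout, but 128² = 16,384):

```python
import math

PIXELS = 16_384   # pixel count reported for the chip
FOV_DEG = 70.0    # field of view of one array, degrees

# Assumed square layout: 128 pixels along each axis.
side = int(math.isqrt(PIXELS))

# Angular width of one pixel: 70° / 128 ≈ 0.55°, consistent with
# the "0.6 degrees per pixel" quoted in the article.
deg_per_pixel = FOV_DEG / side

# Arrays needed in a circular configuration for full 360° coverage.
arrays_for_360 = math.ceil(360.0 / FOV_DEG)  # 6 arrays
```

The rounding up in the last line reflects that five 70-degree arrays would cover only 350 degrees, leaving a blind wedge.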

Wu also needs to increase the system’s range, which is only 10 meters. “We are certain we can get to 100 meters and believe we could get to 300 meters with continual improvement,” he said. If he can, conventional CMOS production technology promises to make inexpensive chip-sized lidar part of our future. “Just look at how we use cameras,” Wu said. “They’re embedded in vehicles, robots, vacuum cleaners, surveillance equipment, biometrics and doors. There will be so many more potential applications once we shrink lidar to the size of a smartphone camera.” (Source: UCB)

Reference: X. Zhang et al.: A large-scale microelectromechanical-systems-based silicon photonics LiDAR, Nature 603, 253 (2022); DOI: 10.1038/s41586-022-04415-8

Link: Sensor and Actuator Center, Dept. of Electrical Engineering and Computer Sciences, University of California Berkeley, Berkeley, USA

