Apr 3, 2022

Engineering team develops approach to enable simple cameras to see in 3D

Posted by in categories: mobile phones, robotics/AI, transportation

Standard image sensors, like the billion or so already installed in practically every smartphone in use today, capture light intensity and color. Relying on common, off-the-shelf sensor technology—known as CMOS—these cameras have grown smaller and more powerful by the year and now offer tens-of-megapixels resolution. But they’ve still seen in only two dimensions, capturing images that are flat, like a drawing—until now.

Researchers at Stanford University have created a new approach that allows standard image sensors to see in three dimensions. That is, these common cameras could soon be used to measure the distance to objects.

The engineering possibilities are dramatic. Measuring distance between objects with light is currently possible only with specialized and expensive lidar (short for "light detection and ranging") systems. If you've seen a self-driving car tooling around, you can spot it right off by the hunchback of technology mounted to the roof. Most of that gear is the car's lidar crash-avoidance system, which uses lasers to determine distances between objects.
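The principle lidar relies on is time of flight: a laser pulse travels to an object and back, and the distance is half the round-trip time multiplied by the speed of light. A minimal sketch of that arithmetic (the 200-nanosecond figure is illustrative, not from the article):

```python
# Time-of-flight ranging, the principle behind lidar: a laser pulse
# reflects off an object and returns; distance is half the round-trip
# time multiplied by the speed of light.

C = 299_792_458.0  # speed of light in meters per second


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target, in meters, from a pulse's round-trip time."""
    return C * round_trip_s / 2.0


# A pulse returning after 200 nanoseconds puts the target about 30 m away.
print(round(tof_distance_m(200e-9), 1))
```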