Published 01:30 IST, November 6th 2019
Jumping spiders lead scientists to develop advanced depth sensor
Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences have developed an advanced depth sensor inspired by jumping spiders; the study was published in PNAS.
Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a compact and efficient depth sensor that could be used onboard microrobots, in small wearable devices, or in lightweight virtual and augmented reality headsets. The sensor was created by studying how jumping spiders gauge depth, and it combines a multifunctional, flat metalens with an ultra-efficient algorithm to measure depth in a single shot. The research, published in the Proceedings of the National Academy of Sciences (PNAS), describes how jumping spiders measure depth differently from humans, through a mechanism technically described as depth from defocus, and how that mechanism guided the development of the new sensor.
How did jumping spiders aid the invention?
Todd Zickler, the William and Ami Kuan Danoff Professor of Electrical Engineering and Computer Science at SEAS and co-senior author of the study, explains that matching calculation, where you take two images and search for the parts that correspond, is computationally burdensome. Humans have a nice, big brain for those computations, but spiders are tiny, with tiny brains. Evolution has provided them with a more efficient system for measuring depth: each principal eye has a few semi-transparent retinae arranged in layers, and these retinae capture multiple images with different amounts of blur. So, if a spider looks at a fruit fly with one of its principal eyes, the fly will appear sharper in one retina's image and blurrier in another. It is the change in blur that tells the spider the distance to the fly.
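To make the change-in-blur cue concrete, here is a minimal, hypothetical sketch in Python (not code from the study). It scores which of two differently focused views of the same scene is locally sharper; with calibrated optics, that kind of comparison can be turned into distance.

# Toy depth-from-defocus comparison: >0 where the first view is sharper.
import numpy as np
from scipy.ndimage import laplace, uniform_filter, gaussian_filter

def relative_blur_map(view_near, view_far, window=9):
    """Per-pixel log ratio of local contrast energy in the two views."""
    energy_near = uniform_filter(laplace(view_near.astype(float)) ** 2, window)
    energy_far = uniform_filter(laplace(view_far.astype(float)) ** 2, window)
    eps = 1e-8  # avoid division by zero in flat regions
    return np.log((energy_near + eps) / (energy_far + eps))

# Synthetic example: one texture blurred by different amounts stands in
# for the two retinal layers seeing the same fly with different defocus.
scene = np.random.rand(128, 128)
mildly_blurred = gaussian_filter(scene, sigma=1.0)
strongly_blurred = gaussian_filter(scene, sigma=3.0)
score = relative_blur_map(mildly_blurred, strongly_blurred)
print(score.mean() > 0)  # True: the less-blurred view keeps more detail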
Scientists develop a metalens modelled on the spiders' eyes
Until now, large cameras with motorized internal components were required to replicate this mechanism. To solve the issue, the scientists designed a metalens that can simultaneously produce two images with different blur, said Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS and co-senior author of the paper. Capasso added that instead of using layered retinae to capture multiple simultaneous images, as jumping spiders do, the metalens splits the light and forms two differently defocused images side by side on a photosensor. Qi Guo, a PhD candidate in Zickler's lab and co-first author of the paper, said that an ultra-efficient algorithm, developed by Zickler's group, then interprets the two images and builds a depth map to represent object distance. With their ability to implement existing and new optical functions much more efficiently, metalenses are a game-changer.
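As a rough illustration of that single-shot pipeline, the hypothetical Python sketch below splits a side-by-side capture into its two defocused views and converts their per-pixel relative blur into a coarse depth map. The function names and the calibration curve are invented for the example and are not taken from the SEAS software.

# Hypothetical single-shot pipeline: split the frame, compare blur, map to depth.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace, uniform_filter

def split_sensor_frame(frame):
    """Split one side-by-side capture into its two defocused views."""
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:2 * half]

def depth_map_from_blur(view_a, view_b, blur_to_depth, window=9):
    """Turn per-pixel relative blur between the two views into depth.
    `blur_to_depth` stands in for the sensor's optical calibration."""
    energy_a = uniform_filter(laplace(view_a.astype(float)) ** 2, window)
    energy_b = uniform_filter(laplace(view_b.astype(float)) ** 2, window)
    score = np.log((energy_a + 1e-8) / (energy_b + 1e-8))  # >0 where view_a is sharper
    return blur_to_depth(score)

# Usage: fake a side-by-side capture by blurring one scene two ways,
# then map the blur score to [0, 1] with a dummy calibration curve.
scene = np.random.rand(128, 128)
frame = np.hstack([gaussian_filter(scene, sigma=1.0), gaussian_filter(scene, sigma=3.0)])
view_a, view_b = split_sensor_frame(frame)
depth = depth_map_from_blur(view_a, view_b, lambda s: 1.0 / (1.0 + np.exp(-s)))
print(depth.shape)  # (128, 128): one depth estimate per pixel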