Press Release 12/29/2017

The Camera Culture group at MIT's Media Lab, whose members include Ramesh Raskar and Achuta Kadambi, has been developing innovative imaging systems based on "time of flight" – an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor. Their goal is to develop new depth sensors sensitive enough to make self-driving vehicles practical.

In a new paper in the journal IEEE Access, the team presents a new approach to time-of-flight imaging that increases its depth resolution 1,000-fold. The new approach could also enable accurate distance measurements through fog, which has proven to be a major obstacle to the development of self-driving cars.

"As you increase the range, your resolution goes down exponentially," said Kadambi, a joint Ph.D. student in electrical engineering and computer science and media arts and sciences. At a range of two meters, existing time-of-flight systems have a depth resolution of about a centimeter. That's good enough for the assisted-parking and collision-detection systems on today's cars. Kadambi conducted tests in which he sent a light signal through 500 meters of optical fiber with regularly spaced filters along its length. The tests suggest that at a range of 500 meters, the MIT system should still achieve a depth resolution of only a centimeter.

"We're modulating the light at a few gigahertz, so it's like turning a flashlight on and off billions of times per second. But we're changing that electronically, not optically. The combination of the two is really where you get the power for this system," explained Raskar, head of the Camera Culture group.

Last year, Raskar was named the winner of the Lemelson-MIT Prize, which recognizes individuals who translate their ideas into inventions that improve the world. The Indian American researcher was awarded the $500,000 prize.
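To make the basic principle concrete, here is a minimal sketch of the round-trip arithmetic behind time-of-flight depth sensing. The numbers are illustrative and are not drawn from the MIT system; the only relation assumed is that depth is half the distance light travels out and back.

```python
# Illustrative round-trip arithmetic for pulsed time-of-flight depth sensing.
# Depth is half the distance light travels out to the scene and back.

C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(t_seconds):
    """Depth implied by a measured round-trip time of a light pulse."""
    return C * t_seconds / 2.0

def depth_resolution(timing_resolution_s):
    """Smallest resolvable depth step for a given timing resolution."""
    return C * timing_resolution_s / 2.0

# A target at ~2 m returns light after ~13.3 ns:
print(depth_from_round_trip(13.3e-9))  # ~1.99 m

# Resolving ~67 ps of round-trip time corresponds to ~1 cm of depth,
# the centimeter-scale figure the article cites for existing systems.
print(depth_resolution(67e-12))  # ~0.01 m
```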
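Raskar's remark about gigahertz modulation can be unpacked with standard continuous-wave time-of-flight math, in which depth is recovered from the phase shift of the returning modulated light. This is a textbook sketch, not the MIT team's specific method, which the article describes only at a high level.

```python
# Standard continuous-wave time-of-flight relation: depth from the phase
# shift of amplitude-modulated light. Textbook math, not the MIT method.
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_phase(phase_rad, mod_freq_hz):
    """Depth implied by the phase shift of light modulated at mod_freq_hz."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def depth_per_degree(mod_freq_hz):
    """Depth change corresponding to one degree of phase shift."""
    return depth_from_phase(math.radians(1.0), mod_freq_hz)

# The same one-degree phase sensitivity resolves ~100x finer depth at 1 GHz
# than at 10 MHz, which is the intuition behind gigahertz modulation.
print(depth_per_degree(10e6))  # ~0.042 m per degree at 10 MHz
print(depth_per_degree(1e9))   # ~0.00042 m per degree at 1 GHz
```

The trade-off in standard continuous-wave systems is that higher modulation frequencies shrink the unambiguous range (c / 2f, about 15 cm at 1 GHz), which is part of why long-range gigahertz time of flight is difficult.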