Apr. 4, 2013 -- A standard camera takes flat, 2-D pictures. To get 3-D information, such as the distance to a far-away object, scientists can bounce a laser beam off the object and measure how long it takes the light to travel back to a detector. The technique, called time-of-flight (ToF), is already used in machine vision, navigation systems for autonomous vehicles, and other applications, but many current ToF systems have a relatively short range and struggle to image objects that do not reflect laser light well.
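The underlying arithmetic is straightforward: light covers the round trip at a fixed speed, so the range is the speed of light multiplied by half the measured flight time. A minimal sketch in Python (the function name and example numbers are illustrative, not taken from the paper):

C = 299_792_458.0  # speed of light in vacuum, meters per second

def tof_range_m(round_trip_time_s):
    # The photon travels out to the target and back, so halve the round trip.
    return C * round_trip_time_s / 2.0

# Example: a round trip of about 6.7 microseconds corresponds to roughly 1 kilometer.
print(tof_range_m(6.7e-6))  # ~1004 m

By the same formula, resolving depth at the millimeter scale requires timing the returning photons to within a few picoseconds, which is why detectors with very fine timing resolution matter here.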
A team of Scotland-based physicists has tackled these limitations and reported its findings today in The Optical Society's (OSA) open-access journal Optics Express.
The research team, led by Gerald Buller, a professor at Heriot-Watt University in Edinburgh, Scotland, describes a ToF imaging system that can gather high-resolution, 3-D information about objects that are typically very difficult to image, from up to a kilometer away.
The new system works by sweeping a low-power infrared laser beam rapidly over an object. It then records, pixel-by-pixel, the round-trip flight time of the photons in the beam as they bounce off the object and arrive back at the source. The system can resolve depth on the millimeter scale over long distances using a detector that can "count" individual photons.
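In single-photon ToF imaging of this kind, a pixel's depth is usually estimated by accumulating a histogram of many photon arrival times and locating the peak of the return. The sketch below illustrates that general idea only; it is not the authors' processing pipeline, and the 2-picosecond bin width is an assumption chosen to show how timing resolution maps to depth resolution:

import numpy as np

C = 299_792_458.0    # speed of light, m/s
BIN_WIDTH_S = 2e-12  # assumed timing-bin width: 2 ps, roughly 0.3 mm of depth

def pixel_depth_m(arrival_times_s):
    times = np.asarray(arrival_times_s, dtype=float)
    # Bin the photon arrival times recorded for this pixel.
    bin_edges = np.arange(times.min(), times.max() + 2 * BIN_WIDTH_S, BIN_WIDTH_S)
    counts, bin_edges = np.histogram(times, bins=bin_edges)
    # Take the fullest bin as the return peak and use its center as the round-trip time.
    peak_time = bin_edges[np.argmax(counts)] + BIN_WIDTH_S / 2.0
    # Convert the round-trip time to a one-way distance.
    return C * peak_time / 2.0

# Example: simulated photon timestamps jittered around a 6.7-microsecond return.
rng = np.random.default_rng(0)
fake_times = 6.7e-6 + rng.normal(0.0, 5e-12, size=200)
print(pixel_depth_m(fake_times))  # ~1004 m

Repeating this estimate pixel by pixel as the beam sweeps across the scene is what builds up the full depth image.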
Although other approaches can have exceptional depth resolution, the ability of the new system to image objects like items of clothing that do not easily reflect laser pulses makes it useful in a wider variety of field situations, says Heriot-Watt University Research Fellow Aongus McCarthy, the first author of the Optics Express paper.
"Our approach gives a low-power route to the depth imaging of ordinary, small targets at very long range," McCarthy says. "Whilst it is possible that other depth-ranging techniques will match or out-perform some characteristics of these measurements, this single-photon counting approach gives a unique trade-off between depth resolution, range, data-acquisition time, and laser-power levels."
The primary use of the system is likely to be scanning static, human-made targets, such as vehicles. With some modifications to the image-processing software, it could also determine their speed and direction.
One of the key characteristics of the system is the long wavelength of laser light the researchers chose. At 1,560 nanometers, its wavelength is longer, or "redder," than that of visible light, which spans only about 380-750 nanometers. This long-wavelength light travels more easily through the atmosphere, is not drowned out by sunlight, and is safe for eyes at low power. Many previous ToF systems could not detect the extra-long wavelengths that the Scottish team's device is specially designed to sense.
The scanner is particularly good at identifying objects hidden behind clutter, such as foliage. However, it cannot render human faces, instead drawing them as dark, featureless areas. This is because at the long wavelength used by the system, human skin does not reflect back a large enough number of photons to obtain a depth measurement. However, the reflectivity of skin can change under different circumstances. "Some reports indicate that humans under duress -- for example, with perspiring skin -- will have significantly greater return signals," and thus should produce better images, McCarthy says.
Beyond target identification, photon-counting depth imaging could serve a number of scientific purposes, including remote monitoring of the health and volume of vegetation and of the movement of rock faces to assess potential hazards. Ultimately, McCarthy says, it could scan and image objects located as far as 10 kilometers away. "It is clear that the system would have to be miniaturized and ruggedized, but we believe that a lightweight, fully portable scanning depth imager is possible and could be a product in less than five years."
Next steps for the team include making the scanner work faster. Although the data for the high-resolution depth images can be acquired in a matter of seconds, it currently takes about five to six minutes from the start of a scan until the system produces a depth image. Most of that lag, McCarthy says, is due to the relatively slow processing time of the team's available computer resources. "We are working on reducing this time by using a solid-state drive and a higher specification computer, which could reduce the total time to well under a minute. In the longer term, the use of more dedicated processors will further reduce this time."
The research was funded by the United Kingdom's Engineering and Physical Sciences Research Council.
Story Source:
The above story is reprinted from materials provided by The Optical Society.
Note: Materials may be edited for content and length. For further information, please contact the source cited above.
Journal Reference:
- Aongus McCarthy, Nils J. Krichel, Nathan R. Gemmell, Ximing Ren, Michael G. Tanner, Sander N. Dorenbos, Val Zwiller, Robert H. Hadfield, Gerald S. Buller. Kilometer-range, high resolution depth imaging via 1560 nm wavelength single-photon detection. Optics Express, 2013; 21 (7): 8904 DOI: 10.1364/OE.21.008904