In the “what will they think of next” category, the U.S. military’s DARPA is developing new systems that could theoretically reconstruct a 3-D image of a scene from a single vantage point, including objects outside the direct line of sight, by exploiting the information carried by photons. Some are describing the new REVEAL system as a “super 3-D camera” that can see through walls or identify a threat from a long distance away.
DARPA has issued a request for research proposals on methods to extract more information from light particles than traditional cameras can. Researchers note that these particles, called photons, actually carry many layers of information about the surroundings they pass through, but conventional imaging systems such as cameras gather only the small fraction of this information that lies in the visible spectrum. Accessing this “photon history” would enable remarkable new imaging systems.
More on the REVEAL research program
The DARPA statement noted that the “Revolutionary Enhancement of Visibility by Exploiting Active Light-fields” (REVEAL) program will be funded in two 24-month phases. The initial phase will attempt to establish the fundamental limits of single-viewpoint scene reconstruction, using lab experiments to develop and validate critical concepts and approaches. Phase 2 is designed to evaluate full 3-D scene reconstruction under normal illumination conditions as well as create a general theoretical framework for exploiting light’s multiple degrees of freedom. REVEAL is a basic research project and is not intended to develop practical hardware, software or imaging systems.
In practical terms, the technology would enable an infantry squad to “see” the position of a sniper hidden behind a tree or rock, or inside a building, from a distance by using photon history, and thus be in a position to neutralize the threat.
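The “photon history” idea can be made concrete with a toy time-of-flight calculation. The sketch below is a hypothetical illustration, not DARPA’s actual method: a laser pulse bounces off a visible relay surface (such as a wall), scatters off a hidden object, and returns. If the direct path lengths to and from the relay surface are known, the photon’s total travel time reveals how far away the hidden object is.

```python
# Toy illustration of the time-of-flight principle behind
# non-line-of-sight imaging (a simplified sketch, not REVEAL itself).
# A pulse travels: laser -> relay wall -> hidden object -> relay wall -> sensor.

C = 299_792_458.0  # speed of light in vacuum, m/s

def hidden_range(total_time_s, laser_to_wall_m, wall_to_sensor_m):
    """Estimate the distance from the relay wall to the hidden object.

    Assumes the photon scatters off the same wall patch on the way out
    and back, so the hidden leg is traversed twice.
    """
    total_path = C * total_time_s
    round_trip_hidden = total_path - laser_to_wall_m - wall_to_sensor_m
    return round_trip_hidden / 2.0

# Example: 10 m from laser to wall, 10 m from wall to sensor,
# and the photon arrives 100 nanoseconds after emission.
r = hidden_range(100e-9, 10.0, 10.0)
print(f"hidden object is roughly {r:.2f} m from the wall")
```

Real non-line-of-sight systems record millions of such photon arrival times across many laser and sensor positions, then solve an inverse problem to reconstruct the full hidden scene; this single-path calculation only shows why arrival time is informative at all.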
Statement from DARPA
“There are some current limited efforts attempting to exploit some of light’s multiple degrees of freedom, but REVEAL aims to make a revolutionary leap forward by simultaneously addressing all aspects of light,” commented Predrag Milojkovic, program manager in DARPA’s Defense Sciences Office. “In effect, we want to use mathematical methods to coax from photons a little more of a story about where they’ve been and what they’ve seen.”
“Ultimately, collecting all pertinent information about a scene could enable computational generation of arbitrarily located virtual viewpoints and effectively allow ‘flying through the scene’ without changing one’s physical location,” Milojkovic added.