Single Lens, Single Image Passive Ranging
Ranging is often accomplished using active technologies such as radar, lasers, and sonar. Passive systems have utilized stereo vision and other multiple-aperture approaches. In this system, ranging is accomplished by passively measuring the wavefront from unknown objects in the field of view, using a single customized aperture. These wavefronts are modified by an optical mask to provide orthogonal range subspaces, which allows detection of overlapping, unknown objects. These procedures are not ad hoc; rather, they are firmly rooted in Information Theory, and as such provide a powerful basis for system design and evaluation. Technical information and theoretical limitations pertaining specifically to range detection and estimation may be found in the discussion of Orthogonal Passive Ranging.

Example roadway scene to be ranged to (~1/10-scale)
Images are wavefront-coded using a special ranging mask. For ranging, the mask has a cosinusoidal, rectangularly separable amplitude profile of the form cos(x)cos(y). The amplitude profile of such a mask is shown below. Implementation in an optical system is also shown, where the mask is placed close to the aperture stop of the lens system.

Mask amplitude profile, and an example optical system.
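As an illustrative sketch (not the actual design parameters, which are not given here), a separable cos(x)cos(y) amplitude profile of the kind described above can be generated over a normalized pupil as follows; the grid size n and spatial frequency k are assumed placeholder values.

import numpy as np

def ranging_mask_amplitude(n=256, k=3.0):
    """Return an n x n cos(k*x) * cos(k*y) amplitude profile over the unit pupil."""
    x = np.linspace(-1.0, 1.0, n)               # normalized pupil coordinates
    X, Y = np.meshgrid(x, x)
    amplitude = np.cos(k * X) * np.cos(k * Y)   # rectangularly separable cosine profile
    amplitude[X**2 + Y**2 > 1.0] = 0.0          # restrict to the circular aperture
    return amplitude

mask = ranging_mask_amplitude()
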
Range estimation from images formed using the ranging mask is accomplished using spectral estimation methods. The Ambiguity Function (AF) for the mask provides a display of the Optical Transfer Function (OTF) of this system, shown below. Broad-spectrum objects that are out of focus will form images with a sinusoidal spectrum, where the frequency of the sinusoid is range-dependent. The two plots below the AF indicate distinctly different peak locations for objects at different misfocus values (i.e., different ranges).

Ambiguity Function and resulting OTFs for ranging mask.
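A minimal sketch of this peak-based spectral estimation is given below: the dominant non-DC spectral peak of an image block is located, then mapped to range. The calibration freq_to_range is a hypothetical linear map standing in for whatever calibration the real system uses; a higher-resolution spectral estimator could replace the simple FFT-and-argmax step.

import numpy as np

def spectral_peak_frequency(patch):
    """Locate the dominant non-DC peak in the averaged row spectrum of an image patch."""
    spectrum = np.abs(np.fft.rfft(patch, axis=1)).mean(axis=0)
    spectrum[0] = 0.0                           # suppress the DC term
    peak_bin = int(np.argmax(spectrum))
    return peak_bin / patch.shape[1]            # peak frequency in cycles per pixel

def freq_to_range(freq, slope=100.0, offset=5.0):
    """Hypothetical calibration from peak frequency to range (arbitrary units)."""
    return offset + slope * freq

patch = np.random.rand(32, 32)                  # stand-in for one block of a coded image
estimated_range = freq_to_range(spectral_peak_frequency(patch))
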
The example roadway scene, as imaged through the ranging mask, is shown below. This is the "wavefront-coded image" that will be analyzed for spectral peaks.

Wavefront-coded roadway scene, in daytime.
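One way to simulate such a wavefront-coded image (a sketch under assumed parameters, reusing ranging_mask_amplitude from the earlier snippet, not the simulator used here) is to form the coded point spread function from the masked pupil with a quadratic defocus phase and convolve it with the scene; defocus_waves and the random stand-in scene are illustrative only.

import numpy as np
from scipy.signal import fftconvolve

def coded_psf(mask_amplitude, defocus_waves):
    """Incoherent PSF of the masked pupil at a given defocus (in waves)."""
    n = mask_amplitude.shape[0]
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    defocus_phase = 2.0 * np.pi * defocus_waves * (X**2 + Y**2)  # quadratic defocus term
    pupil = mask_amplitude * np.exp(1j * defocus_phase)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    return psf / psf.sum()

scene = np.random.rand(310, 500)                # random stand-in for the roadway scene
coded_image = fftconvolve(scene, coded_psf(ranging_mask_amplitude(), 4.0), mode="same")
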
The resulting range map for the daytime roadway scene is shown below. This range map has been computed at a coarse spatial resolution of 50x31; a pixel-by-pixel transformation is also possible, generating range maps nearly the same size as the coded image. Note that a highly aberrated optical system and crude processing have been used in these simulations: the optics, detection thresholds, and spectral estimators are far from optimized. Even with this first-pass simulator setup, however, understandable range images have resulted. A refined system will quickly follow.

Range map for daytime scene.
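A coarse-grid range map like the one above could be computed by tiling the coded image into blocks and applying the spectral-peak estimator from the earlier sketch (spectral_peak_frequency and freq_to_range) to each block. The 50x31 grid matches the resolution quoted in the text; the block handling itself is only an assumed approach.

import numpy as np

def coarse_range_map(coded_image, grid=(50, 31)):
    """Estimate one range value per block on a (columns x rows) grid."""
    cols, rows = grid
    h, w = coded_image.shape
    bh, bw = h // rows, w // cols               # block size in pixels
    range_map = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = coded_image[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            range_map[i, j] = freq_to_range(spectral_peak_frequency(block))
    return range_map
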
While the system is designed to range in a passive manner (i.e., the system itself provides no sources), it can also range to active sources controlled by the objects being ranged. An example of such a scenario is shown below, with the same roadway scene simulated at evening illumination levels. Tail-light LEDs are lit on the fire engine and the pickup truck, and head lamps from the "ranging-equipped" vehicle are also simulated (thereby illuminating the lane markers and reflector surfaces on the preceding vehicles).

Wavefront-coded roadway scene, in evening.
The resulting range map for the evening roadway scene is shown below. Again, it has been computed at the coarse spatial resolution of 50x31.

Range map for evening scene.