The ability to estimate a specific set of parameters, without regard to an unknown set of other parameters that also influence the measured data (nuisance parameters), is described by the Fisher Information matrix and its inverse, the Cramer-Rao bound. Until recently, analytic solutions for the inverse of the Fisher Information matrix have been intractable for all but the simplest of problems. Scharf and McWhorter [1] have recently shown how to compute this inverse analytically for general problems. Through this general inverse they have shown that the ability to estimate the desired parameters of the data is related to the component of the system sensitivity to these parameters that is orthogonal to the system sensitivity to the nuisance parameters. A summary of this result follows. Later sections apply this theory to particular spatially incoherent optical systems.

Assume that a deterministic model of a particular spatially incoherent optical, or spatially incoherent remote-sensing, system has been found. This model should include all parameters affecting the deterministic part of the measured signal. Fisher Information is then a measure of the information content of the measured signal relative to a particular parameter. The Cramer-Rao bound is a lower bound on the error variance of the best estimator of this parameter with the given system.
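For concreteness, such a deterministic model can be sketched numerically. The Gaussian-spot model, the detector grid, and the noise level below are illustrative assumptions, not details from the source:

```python
import numpy as np

# Hypothetical deterministic model of a spatially incoherent system:
# N detector samples of a Gaussian spot x(theta), with parameters
# theta = [center c, amplitude a].  (Illustrative assumption only.)
def x_model(theta, t):
    c, a = theta
    return a * np.exp(-(t - c) ** 2 / (2 * 0.5 ** 2))  # fixed width 0.5

rng = np.random.default_rng(0)
t = np.linspace(-2.0, 2.0, 64)       # detector sample positions
theta_true = np.array([0.3, 1.0])    # true [center, amplitude]
sigma = 0.05                         # white gaussian noise std

# Noisy measurement: y = x(theta) + n
y = x_model(theta_true, t) + sigma * rng.standard_normal(t.size)
```

Any estimator of the center or amplitude must work from `y` alone, which is why the bound below depends on the noise variance and on how sensitively the model responds to each parameter.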

Let the unknown system parameters of a given system be denoted by the length-$p$ vector

$$\boldsymbol{\theta} = [\theta_1\ \theta_2\ \cdots\ \theta_p]^T, \tag{1}$$

where the noiseless measurement is some vector function of these parameters, say $\mathbf{x}(\boldsymbol{\theta})$. The superscript $T$ denotes transpose. The actual measurement in any real system will always be corrupted by noise. In the limit, this noise will be signal-dependent shot noise or detector quantization noise. Let the noisy measurement be given by $\mathbf{y} = \mathbf{x}(\boldsymbol{\theta}) + \mathbf{n}$. With no loss of generality, assume a zero mean white gaussian noise $\mathbf{n}$ with variance $\sigma^2$. Our ability, on the average, to estimate $\boldsymbol{\theta}$ is bounded by the Cramer-Rao bound [2][3][4]. This bound can describe both biased and unbiased estimators. This work will consider only unbiased estimators. The variance of any unbiased estimator of one component of $\boldsymbol{\theta}$, say $\theta_i$, is bounded below as
$$\mathrm{var}(\hat{\theta}_i) \geq \left[\mathbf{J}^{-1}(\boldsymbol{\theta})\right]_{ii}, \tag{2}$$

where $\mathbf{J}(\boldsymbol{\theta})$ is the Fisher Information matrix of the parameter vector $\boldsymbol{\theta}$, and $[\mathbf{J}^{-1}(\boldsymbol{\theta})]_{ii}$ is the $i$th diagonal element of $\mathbf{J}^{-1}(\boldsymbol{\theta})$. Let $p(\mathbf{y};\boldsymbol{\theta})$ be the probability density function for the observed noisy data $\mathbf{y}$. The Fisher Information matrix is then given by
$$\mathbf{J}(\boldsymbol{\theta}) = E\left[\left(\frac{\partial}{\partial\boldsymbol{\theta}}\ln p(\mathbf{y};\boldsymbol{\theta})\right)\left(\frac{\partial}{\partial\boldsymbol{\theta}}\ln p(\mathbf{y};\boldsymbol{\theta})\right)^T\right], \tag{3}$$

where $E[\cdot]$ denotes expected value. Under the zero mean white gaussian noise assumption, (3) reduces [1] to
$$\mathbf{J}(\boldsymbol{\theta}) = \frac{1}{\sigma^2}\,\mathbf{G}^T(\boldsymbol{\theta})\,\mathbf{G}(\boldsymbol{\theta}), \qquad \mathbf{G}(\boldsymbol{\theta}) = \frac{\partial\mathbf{x}(\boldsymbol{\theta})}{\partial\boldsymbol{\theta}^T}. \tag{4}$$

The matrix $\mathbf{G}(\boldsymbol{\theta})$ is called a sensitivity matrix. Assume that the parameter vector is partitioned into two sets so that $\boldsymbol{\theta} = [\boldsymbol{\theta}_1^T\ \boldsymbol{\theta}_2^T]^T$. One set of parameters, $\boldsymbol{\theta}_1$, comprises those quantities desired from the estimation system. The other set, $\boldsymbol{\theta}_2$, denotes parameters that influence the measured data but whose quantities are not desired. In general, this second set of parameters negatively influences the estimation of the parameters of interest. The undesired parameters are therefore called ``nuisance parameters''. By partitioning the matrix $\mathbf{G}$ of (4) as
$$\mathbf{G} = [\mathbf{G}_1\ \mathbf{G}_2], \tag{5}$$

it can be shown that the block of the inverse Fisher Information matrix of (3) pertaining to the desired parameters $\boldsymbol{\theta}_1$ is given by [1]

$$\left[\mathbf{J}^{-1}(\boldsymbol{\theta})\right]_{11} = \sigma^2\left(\mathbf{G}_1^T\,\mathbf{P}_{\mathbf{G}_2}^{\perp}\,\mathbf{G}_1\right)^{-1}, \tag{6}$$
where

$$\mathbf{P}_{\mathbf{G}_2}^{\perp} = \mathbf{I} - \mathbf{G}_2\left(\mathbf{G}_2^T\mathbf{G}_2\right)^{-1}\mathbf{G}_2^T \tag{7}$$

is a projection matrix projecting onto the space orthogonal to the space spanned by the columns of the matrix $\mathbf{G}_2$, or $\langle\mathbf{G}_2\rangle^{\perp}$, and $\mathbf{I}$ is the identity matrix.
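These formulas can be checked numerically with a small sketch. The two-parameter Gaussian-spot model below (center as the desired parameter, amplitude as the nuisance parameter) and the finite-difference sensitivity matrix are hypothetical illustrations; only the projector and bound computations follow the formulas above.

```python
import numpy as np

# Hypothetical model: N samples of a Gaussian spot with desired
# parameter theta1 = center and nuisance parameter theta2 = amplitude.
def x_model(theta, t):
    c, a = theta
    return a * np.exp(-(t - c) ** 2 / (2 * 0.5 ** 2))

t = np.linspace(-2.0, 2.0, 64)
theta = np.array([0.3, 1.0])
sigma = 0.05

# Sensitivity matrix G = dx/dtheta^T via central differences (eq. 4).
eps = 1e-6
G = np.column_stack([
    (x_model(theta + eps * e, t) - x_model(theta - eps * e, t)) / (2 * eps)
    for e in np.eye(theta.size)
])

# Fisher Information matrix J = G^T G / sigma^2 (eq. 4).
J = G.T @ G / sigma ** 2

# Partition G = [G1 G2] (eq. 5): first column desired, second nuisance.
G1, G2 = G[:, :1], G[:, 1:]

# Projection onto the complement of the space spanned by G2.
P_perp = np.eye(t.size) - G2 @ np.linalg.inv(G2.T @ G2) @ G2.T

# P_perp is a projector: symmetric, idempotent, and it annihilates G2.
assert np.allclose(P_perp, P_perp.T)
assert np.allclose(P_perp @ P_perp, P_perp)
assert np.allclose(P_perp @ G2, 0.0)

# The bound of (6) agrees with the (1,1) block of the full inverse
# Fisher matrix, i.e. with the direct route through (2) and (4).
crb_11 = sigma ** 2 * np.linalg.inv(G1.T @ P_perp @ G1)
assert np.allclose(crb_11, np.linalg.inv(J)[:1, :1])
```

The final assertion is the content of (6): inverting the full Fisher matrix and projecting out the nuisance sensitivity first give the same bound on the desired parameter.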

Notice that (6) is a general geometric formulation of the Cramer-Rao bound for a given general information processing system. The influence of the nuisance parameters on the estimation of the desired parameters is clearly stated. Consider the desired parameters as $\boldsymbol{\theta}_1$. Then the Cramer-Rao bound is inversely proportional to the norm of the Fisher Information pertaining to $\boldsymbol{\theta}_1$ that is orthogonal to the Fisher Information of the nuisance parameters $\boldsymbol{\theta}_2$, or $\mathbf{G}_1^T\mathbf{P}_{\mathbf{G}_2}^{\perp}\mathbf{G}_1$. In other words, the ability to estimate the parameters of interest is related to the component of the system sensitivity to these parameters that is orthogonal to the system sensitivity to the nuisance parameters. For many applications this geometric formulation of the Cramer-Rao bound can also be given a spatial frequency interpretation.
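This geometric statement can be illustrated numerically: projecting the desired-parameter sensitivity onto the complement of the nuisance sensitivity can only shrink its norm, so the bound with an unknown nuisance parameter is never smaller than the bound with that parameter known. The Gaussian-spot model below (center desired, amplitude nuisance) is a hypothetical example assumed purely for this sketch.

```python
import numpy as np

# Hypothetical two-parameter model: theta = [center, amplitude],
# with the center desired and the amplitude a nuisance parameter.
def x_model(theta, t):
    c, a = theta
    return a * np.exp(-(t - c) ** 2 / (2 * 0.5 ** 2))

t = np.linspace(-2.0, 2.0, 64)
theta = np.array([0.3, 1.0])
sigma, eps = 0.05, 1e-6

# Sensitivity columns via central differences.
G = np.column_stack([
    (x_model(theta + eps * e, t) - x_model(theta - eps * e, t)) / (2 * eps)
    for e in np.eye(2)
])
G1, G2 = G[:, :1], G[:, 1:]
P_perp = np.eye(t.size) - G2 @ np.linalg.inv(G2.T @ G2) @ G2.T

# Bound on var(theta1-hat) when the nuisance parameter is known...
crb_known = sigma ** 2 / (G1.T @ G1)[0, 0]
# ...and when it must be handled jointly, as in (6).
crb_nuisance = sigma ** 2 / (G1.T @ P_perp @ G1)[0, 0]

# Since G1^T P_perp G1 <= G1^T G1, the bound can only increase.
assert crb_nuisance >= crb_known
```

Only the component of the center sensitivity orthogonal to the amplitude sensitivity carries information about the center; the closer the two sensitivity columns are to collinear, the larger the gap between the two bounds.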
