This application creates a new NDF from an existing NDF by dividing the
DATA component of the input NDF by the square root of its VARIANCE
component. The DATA array in the output NDF thus measures the
signal-to-noise ratio in the input NDF.
Anomalously small variance values in the input can cause very large spurious values in
the output signal-to-noise array. To avoid this, pixels that have a variance value
below a given threshold are set bad in the output NDF.
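The calculation described above can be sketched in Python with NumPy. The function name, the array interface, and the use of NaN as the bad value are illustrative assumptions; real NDF files use their own magic bad-pixel values:

```python
import numpy as np

def make_snr(data, variance, minvar):
    """Sketch of the signal-to-noise calculation described above.

    `data` and `variance` stand in for the DATA and VARIANCE arrays
    of the input NDF; `minvar` plays the role of the MINVAR
    parameter.  Bad pixels are represented here by NaN.
    """
    data = np.asarray(data, dtype=float)
    variance = np.asarray(variance, dtype=float)

    snr = np.full_like(data, np.nan)
    # Only pixels whose variance reaches the threshold are divided;
    # anomalously small variances would otherwise yield huge spurious
    # signal-to-noise values, so those pixels are left bad.
    good = np.isfinite(data) & np.isfinite(variance) & (variance >= minvar)
    snr[good] = data[good] / np.sqrt(variance[good])
    return snr
```

A pixel is therefore bad in the output if it was bad in the input or if its variance falls below the threshold.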
Usage: makesnr in out
IN = NDF (Read)
The input NDF. An error is reported if this
NDF does not have a VARIANCE component.
MINVAR = _REAL (Read)
The minimum variance
value to be used. Input pixels that have variance values smaller than this value will
be set bad in the output. The suggested default is determined by first forming a
histogram of the logarithm of the input variance values. The highest peak is then found
in this histogram. The algorithm then moves down from this peak towards lower variance
values until the histogram has dropped to a value equal to the square root of
the peak value, or a significant minimum is encountered in the histogram. The
corresponding variance value is used as the suggested default.
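The default-suggestion procedure above can be sketched in Python with NumPy. The bin count and the test for a "significant minimum" are illustrative assumptions, not taken from the MAKESNR source:

```python
import numpy as np

def suggest_minvar(variances, nbins=100):
    """Sketch of the suggested-default algorithm described above.

    Histogram the logarithm of the (positive) input variances, find
    the highest peak, then walk down towards lower variances until
    the count drops to the square root of the peak value or a
    significant minimum is met.
    """
    v = np.asarray(variances, dtype=float)
    logv = np.log(v[v > 0.0])
    counts, edges = np.histogram(logv, bins=nbins)

    ipeak = int(np.argmax(counts))   # highest peak of the histogram
    floor = np.sqrt(counts[ipeak])   # stop when counts fall this low

    i = ipeak
    while i > 0:
        if counts[i - 1] <= floor:
            break                    # dropped to sqrt(peak value)
        # Assumed significance criterion: the next bin rises by more
        # than the Poisson noise of the current one.
        if counts[i - 1] > counts[i] + np.sqrt(counts[i]):
            break                    # significant minimum encountered
        i -= 1

    # Centre of the stopping bin, converted back from log space.
    return float(np.exp(0.5 * (edges[i] + edges[i + 1])))
```

Working in log space keeps the peak of the typical variance values well defined even when the variances span many orders of magnitude.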
OUT = NDF (Write)
The output signal-to-noise NDF. The VARIANCE component of this NDF will
be filled with the value 1.0 (except that bad DATA values will also have bad
VARIANCE values).
makesnr m51 m51_snr
This example divides the
DATA component of the NDF called m51 by the square root of its own VARIANCE
component, rejecting pixels with variance below the default MINVAR value, and writes the resulting
signal-to-noise values to an NDF called m51_snr.
This routine correctly processes the AXIS, DATA, QUALITY, LABEL, TITLE, HISTORY, WCS,
and VARIANCE components of an NDF data structure and propagates all extensions.
The DATA values in the output NDF represent dimensionless ratios, and therefore the
UNITS component is not propagated.