Chapter 2
Good to Know: Starlink

 2.1 File format
 2.2 Parameters
 2.3 How to find the current parameter values
  2.3.1 Extracting a value for scripting
 2.4 How can I view the metadata?
 2.5 What has already been done to the data?
 2.6 How to examine, process or extract a subset of your data
 2.7 How to get help

2.1 File format

Starlink routines run on HDS (Hierarchical Data System) files, which normally have a .sdf (Starlink Data File) extension. HDS files can hold data in a variety of structures. The most common HDS structure you will encounter is the NDF (N-Dimensional Data Format) [7]. This is the standard file format for storing data that represent N-dimensional arrays of numbers, such as spectra and images. The parameter files discussed in Section 2.3 are also in HDS format.

Raw data retrieved from the JCMT Science Archive come in FITS format. For information on converting between FITS and NDF see Appendix B.
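
For example, a FITS file can be converted to an NDF with the CONVERT package's fits2ndf command (file names here are illustrative; see Appendix B for details):

  % convert
  % fits2ndf raw.fits raw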

Tip:
The .sdf extension of HDS filenames is not required when running most Starlink commands (the exception is Picard).

2.2 Parameters

Parameters control the behaviour of Starlink applications and specify the data they process. A parameter takes a value of one of the following types: string, boolean, integer, or single- or double-precision floating point. You can either specify parameter values on the command line or in response to prompts. Most parameters have sensible defaults, leaving you to concentrate on the main parameters. Parameter usage is described in a tutorial.
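
For example (the file name map is illustrative), both the NDF and the ORDER parameter can be given directly on the command line:

  % stats map order=true

If you omit a required parameter, such as the NDF itself, the application will prompt you for it instead.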

2.3 How to find the current parameter values

A directory called adam is created in your home space by default when you run Starlink applications. In this directory you will find an HDS file for each application that you have run. These files contain the parameters used, and the results returned (if appropriate), from the last time you ran a particular application.

You can specify a different location for the adam directory by setting the environment variable ADAM_USER to the path of your alternative location. This is useful when running more than one reduction at a time, as it avoids interference or access clashes.
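
For instance, in the C shell (the path is illustrative):

  % setenv ADAM_USER /scratch/reduction1/adam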

To see the ADAM parameters, run Hdstrace on any of the parameter files. For example, to see which parameters were used, and the results, from the last time you ran stats, you can type the following from anywhere on your system:

  % hdstrace ~/adam/stats

This will report the following:

  STATS  <STRUC>
     ADAM_DYNDEF    <DEFAULTS>      {structure}
     COMP           <_CHAR*132>     'DATA'
     ORDER          <_LOGICAL>      TRUE
     MAXIMUM        <_DOUBLE>       36.666870117188
     MAXWCS         <_CHAR*132>     '10.625136, -0.381122, -6.057133'
     MINWCS         <_CHAR*132>     '10.720136, -0.017795, -31.05713'
     MEAN           <_DOUBLE>       0.25890738014562
     MINIMUM        <_DOUBLE>       -2.7510244846344
     SIGMA          <_DOUBLE>       0.72429928153974
      ..               ..                 ..
      ..               ..                 ..

You can see that stats was last run on the data array with ordered statistics enabled, together with the values it returned. Any of the parameters reported by running Hdstrace on an ADAM file can be extracted using the command parget. In the example below, the mean value from the last run of stats is printed to the screen.

  % parget mean stats

Tip:
Try not to exit from Starlink tasks by breaking in (e.g. with Ctrl-C), since this may corrupt the parameter file (reported as an integrity-check error). If this happens, delete the parameter file. Where possible, enter !! at a prompt to exit cleanly.
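
For example, if an application is prompting you for a parameter, replying !! abandons the task cleanly (the prompt shown here is the makemap example from Section 2.7):

  % makemap prompt
  REF - Ref. NDF /!/ > !!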

2.3.1 Extracting a value for scripting

parget is designed to make life easier when passing values between shell scripts. In the C-shell example below, the median value from histat is assigned to the variable med. Note the use of the backquotes.

  set med = `parget median histat`

For more information on scripting your work see Appendix C.

If the parameter comprises a vector of values, these can be stored in a C-shell array. For other scripting languages, such as Python, the alternative format produced by setting the parameter VECTOR to TRUE may be more appropriate. Single elements of a parameter array may also be accessed using the array index in parentheses, as shown in the sketch below.
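
A minimal C-shell sketch, assuming ndftrace was last run on your data and that you want its FLBND results parameter (the lower WCS bound of each axis). The vector is stored in a shell array, one element is echoed, and a single element is then fetched directly using its index in parentheses (quoted to protect it from the shell):

  set lbound = `parget flbnd ndftrace`
  echo $lbound[1]
  parget "flbnd(1)" ndftrace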

In addition to running hdstrace on the ADAM file, you can find a list of all the parameter names that parget can return in the Kappa manual, under 'Results Parameters' for the command in question. Note that these names may differ from the names displayed in the terminal when running the application.

2.4 How can I view the metadata?

Two Kappa tasks are extremely useful for examining your metadata: fitslist, which views the FITS headers, and ndftrace, which views properties of the data such as its dimensions. A third option is the stand-alone application Hdstrace.

fitslist This lists the FITS header information for any NDF (raw or reduced). This extensive list includes dates and times, source name, observation type, bandwidth, number of channels, receptor information, exposure time, start and end elevation, and opacity. In the example below, just the object name is extracted.
  % fitslist file | grep OBJECT

Likewise, if you know the name of the keyword you want to view, you can use the fitsval command instead, for instance

  % fitsval file OBJECT
ndftrace ndftrace displays the attributes of the NDF data structure. This will tell you, for example, the units of the data, the pixel bounds, the dimensions, the world co-ordinate systems (WCS), and the axis assignments.
  % ndftrace map fullframe fullwcs

An NDF can contain more than one set of world co-ordinates. The fullwcs parameter requests that all sets be listed rather than the currently chosen one.

hdstrace hdstrace lists the name, data type, and values of an HDS (Hierarchical Data System) object. Running it on a time-series cube, for example, reveals the structure of the file, including the pixel origin of the data. To show just the first x lines of values for each component, include the option nlines=x on the command line:
  % hdstrace file nlines=3

Otherwise, to see all the lines and information available in each extension, use nlines=all. The example below displays all the system-temperature values and pipes the result through less to make viewing easier.

  % hdstrace file.MORE.ACSIS.TSYS nlines=all | less

You can see that it descends two levels (into MORE and then ACSIS) to retrieve the information. Other information available at this level includes the receptor names and receiver temperatures.
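
For instance, the receptor names can be listed in the same way, assuming your ACSIS extension contains a RECEPTORS component (as it does for ACSIS time-series data):

  % hdstrace file.MORE.ACSIS.RECEPTORS nlines=all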

Full details of ndftrace and fitslist can be found in the Kappa manual. Details on Hdstrace can be found in the hdstrace manual.

2.5 What has already been done to the data?

If you are presented with a data file you may wish to see what commands have been run on it and, in the case of a co-added cube, which data went into it. Two Kappa commands can help you with this:

hislist The Kappa command hislist will return the history records of the NDF.
  % hislist myfile brief

Including the brief option returns a simple list of the commands. Omitting it includes the text associated with each command, which can be useful for finding out what parameters were used in each case. This works for all NDFs, including those reduced by the pipeline; for these, all the commands executed as part of the pipeline reduction script are listed. The version of the software used for each step is also reported.
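
For pipeline products the full history can be long, so it may help to pipe the output through a pager:

  % hislist myfile | less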

provshow The Kappa command provshow displays the details of the NDFs that were used in the creation of the given file. It includes both the immediate parents and more distant ancestor NDFs.
  % provshow file

2.6 How to examine, process or extract a subset of your data

For all Starlink commands you can specify a sub-section of your data on which to run an application. You do this by appending the bounds of the section to be processed to the NDF name. This may be done on the command line or in response to a prompt.

The example below runs stats on a sub-cube within your original cube. This sub-cube is defined by bounds given for each axis. The upper and lower bounds for each axis are separated by a colon, while the axes themselves are separated by a comma. Note that the use of quotes is necessary on a UNIX shell command line, but not in response to a prompt or in many other scripting languages.

  % stats 'cube(10.5:10.0,0.0:0.25,-25.:25.)'

The bounds are given in the co-ordinates of the data (Galactic for Axes 1 and 2, and velocity for Axis 3). You can find the number and names of the axes, along with the pixel bounds of your data file, by running ndftrace. In the example above the ranges are 10.5 to 10.0 in longitude (note that the range runs from higher to lower value, i.e. from left to right on the sky), 0.0 to 0.25 in latitude, and -25 to +25 km/s in velocity. To leave an axis untouched, simply include the comma but do not specify any bounds, as shown below.
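
For example, to restrict the two spatial axes but keep the full velocity range, leave the third field empty:

  % stats 'cube(10.5:10.0,0.0:0.25,)'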

Tip:
Be careful to give world co-ordinate values in floating point. Integers are interpreted as pixel indices.
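
For instance, the following selects pixel indices 100 to 150 on the two spatial axes rather than a world co-ordinate range:

  % stats 'cube(100:150,100:150,)'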

To define your bounds in FK5 co-ordinates, use the following format:

  % stats 'cube(22h18m:22h32m,05d:05d30m,)'

To write a section of your data into a new file (called newcube.sdf in the example below), use ndfcopy with bounds.

  % ndfcopy 'cube(10.5:10.0,0.0:0.25,-25:25)' newcube title='"l=10.5 sub-section"'

Here the title option defines a title for the new cube, replacing the title derived from the original.

You can also define a region as a number of pixels about a given origin. The example below extracts a cube 25×25 pixels in extent around the position l = 10, b = 0.

  % ndfcopy 'cube(10.0~25,0.0~25,)' newcube

The extent of an NDF section can also be specified as an arc-distance, using WCS co-ordinates in the format "centre extent".

For instance, 'image(10:12:30 40am,-23:23:43 40am)' will create a section with an extent of 40 arcminutes on both axes ("as" and "ad" can be used in place of "am", indicating arcseconds and degrees respectively).

This only works if the NDF’s current Frame is a SkyFrame (an error is reported otherwise).

Tip:
To help identify your sub-region in pixel co-ordinates, open your file in Gaia and change the built-in co-ordinates to PIXEL via Image-Analysis > Change coordinate > Built-in coordinates.

2.7 How to get help


showme If you know the name of the Starlink document you want to view, use showme. When run, it launches a new web page or tab displaying the hypertext version of the document.

  % showme sun95

findme findme searches Starlink documents for a keyword. When run, it launches a new web page or tab listing the results.

  % findme kappa
  % findme heterodyne

docfind docfind searches the internal list files for keywords, then searches the document titles. The result is displayed using the UNIX more command.

  % docfind kappa

Running routines with prompts You can run any routine with the option prompt after the command. This will prompt for every parameter available. If you then want a further description of a parameter, type ? at the relevant prompt.

  % makemap prompt
  REF - Ref. NDF /!/ > ?