Global Oscillation Network Group****
National Solar Observatory
National Optical Astronomy Observatories
This document was written in 1994. The text has not been updated.
This document presents the requirements for, and the detailed design of, the downstream pipeline processing of site-day calibrated velocity images to site-day l-nu diagrams.
2.1. Software Functional Requirements
2.2. Software Performance Requirements
2.3. Input Data Specifications
2.3.1. Input Images
2.3.2. Input PLMTRIM Data
2.4. Output Data Specifications
2.4.1. Time Series
2.4.2. Power Spectra
2.4.3. l-nu Diagrams
2.5. Disk Space
3. Detailed Design
3.1. Schematic Program Flow
3.2. Algorithm and Design Considerations
3.2.2. Remapping to Heliographic Coordinates
3.2.2.1. The Angles B0 and P
3.2.3. Temporal Filtering
3.2.4. Spherical Harmonic Transform
3.2.5. SHT Output
4. Algorithm Verification
4.2. Heliographic Remapping
4.2.1. B0-angle Correction Tests
4.2.2. P-angle Correction Tests
4.2.3. Offset Angle Correction Test
4.2.4. Elliptical Image Tests
4.3. Temporal Filtering
4.3.1. dnspipe.filter = 1
4.3.2. dnspipe.filter = 2
4.3.3. dnspipe.filter = block_filter_length
4.3.4. dnspipe.filter = filter_file_name
4.4. Spherical Harmonic Transform
5. Timing Results
Appendix I Software Installation
Appendix II DMAC Operator Instructions
The most useful quality assessment diagnostics of the GONG pipeline are the l-nu diagrams which are produced during the last stage of the pipeline processing. Therefore, each site-day of data is to be processed through to l-nu diagrams in order to evaluate the quality of the data for that particular day of observation. Assuming that the l-nu diagrams indicate no problems, then the time series (which are the required science data products) will be archived in the DSDS.
The requirements for the downstream pipeline processing are as follows:
The single performance requirement for this software is one of speed.
This requirement is to be achieved by a combination of efficient software and multiple processors.
The downstream processing code accepts as input 2-dimensional images (normally calibrated velocities), which must carry the following header keywords.
The PLMTRIM data file is an ASCII file generated by the GRASP task suntrans.plm_trim and used by many of the other SUNTRANS tasks. This file contains the heliographic remapping parameters, and a series of precomputed limits to be used by the SHT subroutine.
A DMAC version of this file is to be stored in GRASP (grasplib) and will be referenced via a hidden task parameter. Users of GRASP will be able to change this parameter to refer to their own files should they desire different remapping parameters or L-max (the maximum value of the spherical harmonic degree l).
The output time series images (which are archived) are to be assembled
according to the scheme
presented in the document GRASP Database Specifications (Anderson, 1988),
(3-dimensional images, one image per l-value, with the x-axis being the time
axis, the y-axis being m-value, band 1 holding the real part of the SHT and
band 2 holding the imaginary part of the SHT) with the following header keywords.
The output power spectra images (which are not to be archived) are to be assembled
according to the scheme
presented in the document GRASP Database Specifications (Anderson, 1988),
(3-dimensional images, one image per l-value, with the x-axis being the frequency
axis, the y-axis being m-value, band 1 holding the power spectrum and
band 2 holding the phase spectrum) with the following header keywords.
The output l-nu images (which are to be put in the online archive) are to
be assembled according to the scheme
presented in the document GRASP Database Specifications (Anderson, 1988),
(2-dimensional images, with the x-axis being l-value,
the y-axis being frequency) with the following header keywords.
A single site-day will contain a maximum of 12 hours of data. This means that
there could be as many as 720 images to be processed. The minimum amount
of disk space necessary to process these images is 643 MB (L-max=250).
In addition, 186 MB of disk space will be required to hold the FITS format time series images (assuming 32-bit FITS) for each site-day processed until the DSDS archive tape is written. Since the logical processing step would be to process a site-week, 1.3 GB of disk is required to hold the FITS files waiting to be written to tape.
Thus, the minimum amount of disk required to efficiently process a site-week of data is 1.94 GB.
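The totals above can be checked with a little arithmetic. Note the 720-image figure implies a one-image-per-minute cadence, which is an assumption consistent with, but not stated in, the text:

```python
# Sanity check of the site-day/site-week disk-space figures quoted above.
# The one-minute cadence is an assumption inferred from 720 images / 12 hours.
images_per_site_day = 12 * 60              # 12 hours at one image per minute
processing_space_mb = 643                  # working space for L-max = 250
fits_per_day_mb = 186                      # 32-bit FITS time series per site-day

fits_per_week_gb = 7 * fits_per_day_mb / 1000.0        # FITS files awaiting tape
total_gb = processing_space_mb / 1000.0 + fits_per_week_gb

# fits_per_week_gb is about 1.3 GB and total_gb about 1.94 GB,
# matching the figures quoted in the text.
```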
This section presents the detailed design of DNSPIPE. Many of the routines that DNSPIPE uses are modelled after tasks in the SUNTRANS package of GRASP. Therefore, algorithm descriptions of the spherical harmonic transform, heliographic interpolation and remapping will not be given in this document. What is discussed in this section is the overall schematic flow of the program and important specific design considerations to optimize efficiency.
The parameters for the task dnspipe are the following:
pi> lpar dnspipe
          input =                 Input files
      (trim_file = "pipeline$lib/plm250.sinlat") PLMTRIM's output ASCII trimming file
        (offset = 90.)            Camera offset angle (degrees)
    (pangle_cor = no)             Solar p-angle correction required?
     (coude_cor = no)             Coude correction required (i.e., breadboard data)?
    (turret_cor = no)             Turret correction required (i.e., rotator off)?
        (filter = "pipeline$lib/w21.fltr") Filter file name, or block filter length
   (interp_type = "spline3")      Interpolant (nearest,linear,poly3,poly5,spline3)
      (geometry = "fndlmb")       Use geometry fndlmb|hgeom
     (apod_type = "cos")          Apodization type (none|cos|kb|dc|cb)
      (apod_par = 2.5)            Apodization parameter for kb or dc
      (apod_cbr = 0.5)            Apodization radius for cosine-bell (cb)
    (ignore_mag = yes)            Ignore Magnetograms?
       (gapfill = yes)            Gap fill time series?
     (powerspec = yes)            Output Power Spectra?
       (lnuspec = yes)            Output L-Nu diagrams?
    (interp_out = no)             Output interpolated images?
   (detrend_out = no)             Output interpolated, detrended images?
      (apod_out = no)             Output apodized, interpolated, detrended images?
      (sht_size = 20)             Size of SHT hold array
   (orientation = "counter")      Input image orientation (counter|clock)
          (mode = "ql")
The three optional output switches (interp_out, detrend_out, apod_out) are to be used for the purpose of debugging the code, and more importantly, searching for bad images should the first pass through a data set produce dubious l-nu diagrams.
Apodization is used to roll off the sharp edges of a data string prior to passing that data through a Fourier Transform in order to prevent large sidelobe structures in the output transformation.
The method of apodization used by DNSPIPE follows that of Duvall's (1994) reduction of South Pole data. A unit power apodization image is created in heliographic coordinates. The remapped velocity images are multiplied by the apodization image prior to entry into the spherical harmonic transform subroutine.
The price paid for apodizing is a broader central peak in the power
spectrum. It is not yet clear which type of apodization produces the best
results, or whether velocity data need to be apodized at all because there is
a natural decrease in the observed velocities towards the limb of the image.
Thus, DNSPIPE has five apodization options:
None: No apodization done.
Cosine: Equal area apodization. Response has a narrow core and a high first sidelobe.
Dolph-Chebychev: Wider core, and low, equal-height sidelobes.
Kaiser-Bessel: Very wide core, very low sidelobes.
Cosine Bell: Very narrow core, but very high, ugly sidelobes. Also known as the Tukey window.
The apodization image is calculated from a 512-square grid upon which a circular image (of value 1) is placed. All of the apodizations listed above are radially symmetric about the center of the disk image, and thus depend only on the angular distance from disk center. Consider a unit sphere projected onto the image plane to produce the observed disk image of radius Rd as shown in the figure below.
The angular distance, rho, to each pixel at radial distance r from the center of the disk image is given by sin(rho) = r/Rd.
The apodization windows are then given by:
where the parameter Rcb is the fraction of the disk radius outside of which the window roll-off is to occur. The following figures are taken from Harris, et al. (1978), and show representative filter responses for the apodization window options discussed above.
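As an illustration, the following is a minimal Python sketch (not the SPP of DNSPIPE) of building one such radially symmetric window, the cosine-bell (Tukey) option, on the 512-square grid. For simplicity the roll-off is applied in pixel radius rather than the angular distance rho, and the default radius and Rcb values are taken from the parameter listing above:

```python
import math

def cosine_bell(r, r_disk=200.0, r_cb=0.5):
    """Cosine-bell (Tukey) apodization value at radial distance r from disk
    center: 1 inside r_cb * r_disk, a half-cosine roll-off out to the disk
    radius, and 0 beyond the disk.  (Sketch: uses pixel radius, not rho.)"""
    edge = r_cb * r_disk
    if r <= edge:
        return 1.0
    if r > r_disk:
        return 0.0
    frac = (r - edge) / (r_disk - edge)
    return 0.5 * (1.0 + math.cos(math.pi * frac))

# build the 512-square apodization image, disk centered at (256, 256)
image = [[cosine_bell(math.hypot(x - 256, y - 256)) for x in range(512)]
         for y in range(512)]
```

Multiplying a remapped velocity image by such an image pixel-for-pixel, as described above, rolls the data smoothly to zero before the transform.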
The input disk images are remapped to a heliographic latitude-longitude grid that is symmetric in latitude about the equator and symmetric in longitude about the central meridian (i.e., there are an even number of points in each grid dimension).
The heliographic longitude grid is specified symmetrically about the central meridian. The heliographic latitude grid is actually specified in colatitude, theta (theta = 90 - latitude), or sine-colatitude. The number of longitude points, longitude dispersion, number of colatitude/sine-colatitude points, colatitude/sine-colatitude dispersion, and latitude type (i.e., colatitude or sine-colatitude) are obtained from the PLM_TRIM file.
The remapping to the heliographic grid is done using the image interpolator software of IRAF. For each point in the heliographic grid, (theta, L), the corresponding point on the input image, (x, y), must be calculated, and the value of the input image at that point is assigned to the output.
Consider an orthogonal coordinate system with the origin at the center of the solar disk (the sub-earth point, E) of radius r0, the y-axis parallel to the solar rotation axis and increasing to the north, and the x-axis increasing to the west as shown in the diagram below.
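The forward mapping from a heliographic point to this image coordinate system can be sketched as an orthographic projection followed by a rotation. This is an illustrative Python sketch, not DNSPIPE's implementation; the sign conventions for B0 and the rotation angle are assumptions:

```python
import math

def heliographic_to_image(lat, lon, b0, rho, r0=1.0):
    """Project heliographic (lat, lon), in degrees with lon measured from the
    central meridian, onto the image plane described above.

    b0  : heliographic latitude of the sub-earth point (degrees)
    rho : rotation angle from the camera y-axis to the rotation axis (degrees)
    r0  : disk radius

    Sketch only: sign conventions are assumptions, and points behind the
    limb are not rejected.
    """
    B, L, b = math.radians(lat), math.radians(lon), math.radians(b0)
    # orthographic projection with the solar rotation axis along +y
    x = r0 * math.cos(B) * math.sin(L)
    y = r0 * (math.sin(B) * math.cos(b) - math.cos(B) * math.cos(L) * math.sin(b))
    # rotate by rho to account for the P-angle plus camera offset
    p = math.radians(rho)
    return (x * math.cos(p) - y * math.sin(p),
            x * math.sin(p) + y * math.cos(p))
```

As a check, the sub-earth point (lat = B0, lon = 0) lands at the origin, and with B0 = 0 and no rotation the north pole lands on the +y axis.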
The angle B0 is determined by ephemeris calculations and depends on the time of observation. The ephemeris used by DNSPIPE was originally written by Lytle (1988) for reduction of Kitt Peak Vacuum Telescope data in the VTEL package of IRAF. This ephemeris ignores lunar and planetary perturbations.
The angle in the equations given in the previous section is the angle (measured eastward on the sky) from the y-axis of the camera coordinate system to the heliographic north rotation axis. This angle is the sum of the P-angle (angle eastward between celestial north and heliographic north) and the camera offset angle, which is determined by the configuration of the instrument. The configuration of the instrument depends on the type of light feed and/or camera mounting.
where the camera_offset angle is the remaining angle between celestial north
and heliographic north at noon once the P and Coude angles have been accounted for.
This angle must be determined from the data using the method described
in the SUNTRANS User's Guide.
Should the camera rotator not be functioning for whatever reason, the turret_angle (the angle eastward between celestial north and the y-axis of the camera) must also be added to the sum. This angle depends on the latitude of the observer (lat), the declination of the sun (dec), and the hour angle of the sun (HA).
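The document does not reproduce the turret-angle formula, so the sketch below assumes it is the standard parallactic angle of the sun for the given observer latitude, declination, and hour angle; both the formula and its sign convention are assumptions:

```python
import math

def turret_angle(lat_deg, dec_deg, ha_deg):
    """Parallactic angle, in degrees, for observer latitude, solar
    declination, and hour angle -- the assumed form of the turret angle.
    Zero at local noon (HA = 0) when the observer is north of the sun."""
    lat, dec, ha = map(math.radians, (lat_deg, dec_deg, ha_deg))
    q = math.atan2(math.sin(ha),
                   math.tan(lat) * math.cos(dec) - math.sin(dec) * math.cos(ha))
    return math.degrees(q)
```

With this convention the angle vanishes at noon and changes sign with the hour angle, as expected for a camera y-axis that tracks the vertical rather than celestial north.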
The current thought on removing the low-frequency signal from the velocity images is divided into a number of possible options. The main three are to use:

a 21-minute block running mean, renormalized across data gaps;
a 2-point backward difference filter;
deterministic (i.e., modelled) detrending.
dnspipe has been written in such a fashion that it will accept for the parameter dnspipe.filter a number (a positive integer) or a filename:
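Two of the filter options above can be sketched briefly: the 2-point backward difference, and a block running mean renormalized across gaps. Treating a missing image as excluded from the window average is an assumed reading of "renormalized across data gaps":

```python
def backward_difference(values):
    """2-point backward difference: current image minus the previous one.
    The first image, having no predecessor, is dropped (a sketch-level choice)."""
    return [b - a for a, b in zip(values, values[1:])]

def block_mean_detrend(values, length=21):
    """Subtract a block running mean of the given length, renormalized
    across gaps: None marks a missing image and is excluded from the
    window average (windows are truncated at the series ends)."""
    out = []
    half = length // 2
    for i, v in enumerate(values):
        if v is None:
            out.append(None)
            continue
        window = [u for u in values[max(0, i - half):i + half + 1] if u is not None]
        out.append(v - sum(window) / len(window))
    return out
```

For a linearly increasing series the backward difference is constant, while the block mean detrend is zero wherever a full, symmetric window is available.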
The SHT algorithm is described by Williams (1993) and will not be discussed in this document. The algorithm used by DNSPIPE is the same, although its implementation differs slightly.
The SHT subroutine processes a number of images (maximum of 20) simultaneously, holding all the data for these in memory. For each image, the output from the SHT subroutine is a complex coefficient array ordered by m:

    m=0:    l=0, l=1, l=2, ..., l=LMAX
    m=1:    l=1, l=2, ..., l=LMAX
    ...
    m=LMAX: l=LMAX

Before being written out, the coefficients are reordered by l:

    l=0:    m=0
    l=1:    m=0, m=1
    l=2:    m=0, m=1, m=2
    ...
    l=LMAX: m=0, m=1, ..., m=LMAX
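The reordering can be sketched as a pure index computation; the coefficient packing shown above implies the following mapping (the in-memory layout of the actual SPP code is not reproduced here):

```python
def m_major_order(lmax):
    """(l, m) pairs in SHT output order: all l for m=0, then all l for m=1, ..."""
    return [(l, m) for m in range(lmax + 1) for l in range(m, lmax + 1)]

def l_major_order(lmax):
    """(l, m) pairs in time-series order: all m for l=0, then all m for l=1, ..."""
    return [(l, m) for l in range(lmax + 1) for m in range(l + 1)]

def reorder(coeffs, lmax):
    """Rearrange a coefficient vector from m-major to l-major packing."""
    index = {lm: i for i, lm in enumerate(m_major_order(lmax))}
    return [coeffs[index[lm]] for lm in l_major_order(lmax)]
```

Both packings contain the same (LMAX+1)(LMAX+2)/2 coefficients; only the traversal order differs.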
For each value of l, there are (2l+1) m-elements that must be written to a single column in the time series for that value of l. Writing columns one (or several) at a time is highly inefficient. Also, most systems will only allow on the order of 60 disk files to be open at the same time, thus a lot of time would be spent opening and closing time series files each time a column is to be written.
The solution devised for DNSPIPE is to transpose the complex coefficient array in memory and then write that array out to an intermediate disk image. These intermediate images will be 2-dimensional, complex, with the x-axis being time, and the y-axis being the l/m reordered SHT coefficient vector. The length of the x-axis is determined by the number of images that can be simultaneously worked on. Once the SHT calculation is completed for all the input images, all of the intermediate files are opened (at most 36 files open simultaneously if 20 images processed at a time) and a subroutine looping over l-values reads the appropriate data from each intermediate file and inserts that data into the time series for that l-value.
Algorithm verification was done for several portions of the code. In most cases, verification was accomplished by executing DNSPIPE on artificial data sets that were specifically created to produce a certain output from a particular execution of the program.
To check the apodization, a set of artificial images was created, each of size 512x512, with pixel value 1 and limb parameters:

FNDLMBXC = 256.
FNDLMBYC = 256.
FNDLMBMI = 200.
FNDLMBMA = 200.
FNDLMBAN = 0.
The magnitude distributions are shown below. Note that the response of the windows is not quite the same as in §3.2.1 (especially for the Dolph-Chebychev window) due to a combination of the window being applied in two dimensions (i.e., with radial symmetry) and the remapping not extending all the way to the limb.
Basic tests were performed to verify the B0-angle correction, the P-angle correction, the camera offset angle correction, and the input image orientation. Artificial images with specified B0 and P angles were created for these tests. DNSPIPE was used to reduce these images and output remapped images that could be examined for correctness. An artificial image is shown below for B0=20 degrees, P=0, and offset=90.
When these images are remapped to heliographic coordinates, the grid lines should become orthogonal and symmetric about the equator and the central meridian out to the solar limb. The fiducial marks should appear at the top (i.e., heliographic north) and right (i.e., heliographic west) of the output image. The results of the tests are given in the following sections.
For the B0-angle tests, the artificial images were given large B0 values to make it easy to analyse the output. The black corners in the following images are extrapolated data from beyond the visible limb. For real data, |B0|<=6 degrees.
For the P-angle tests, the artificial images were made with B0=0 and P=±15 degrees. For real data, |P|<=20 degrees.
This test was conducted on an artificial data set with B0=20, P=15, an offset angle of 90 degrees, and counterclockwise orientation. Since the tests described above verify the corrections for all the possible B0 and P angles as well as orientation, only a single test is needed here to demonstrate that the offset angle is correctly added to the sum of angles that defines the angular distance eastward from the y-axis of the camera to the rotation axis (heliographic north) of the sun.
Note that verification of the offset correction for GONG breadboard data, and prototype/field data taken with the camera rotator off, has been done on real data sets and will not be discussed here.
Reduction of prototype data on a regular basis has shown that we are correctly dealing with the elliptical images produced by the GONG instrument.
Constant value artificial images were used to test the four main types of temporal filters. A group of 30 artificial images (f_001 to f_030) were created each having a constant pixel value corresponding to the number of the image. These images were reduced by DNSPIPE saving the detrended, interpolated output. The mean values of the output images were then inspected and compared with prediction.
The mean values of the output detrended images should be the same as those of the input images, as no filtering is supposed to be done.
The mean of each output detrended image should be the difference of the mean values of the current input image and the one before it.
The group of 30 artificial images was processed with a 21-point block running mean. The mean of the output should be zero for images 11 through 20, negative for images 1 through 10, and positive for images 21 through 30.
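That prediction can be worked out directly. This sketch assumes the running-mean window is simply truncated at the ends of the 30-image series:

```python
# Predicted detrended means for 30 constant-valued images (values 1..30)
# under a 21-point block running mean, with windows truncated at the
# series ends (an assumed edge behaviour).
values = list(range(1, 31))
residual = []
for i, v in enumerate(values):
    window = values[max(0, i - 10):i + 11]   # up to 10 images on either side
    residual.append(v - sum(window) / len(window))

# '-' below the midpoint of a truncated window, '0' where the window is
# full and symmetric, '+' above the midpoint of a truncated window
signs = ['0' if abs(r) < 1e-12 else ('-' if r < 0 else '+') for r in residual]
```

Images 11 through 20 see a full, symmetric window whose mean equals their own value, so their residuals vanish; the truncated windows at either end pull the mean above (start) or below (end) the image value.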
Real verification of the SHT calculation cannot be done at DMAC since all of the SHT code is related to a common source, the South Pole pipeline. Independent verification is planned for the fall of 1994, by having the SOI pipeline process a day of GONG data (and vice versa).
The fact that we currently process prototype data and do not see anything unexpected in the power spectra leads us to believe that the SHT code does not contain any major bugs.
As this software will be part of the external IRAF package GRASP, no special installation is required other than that described in the GRASP Installation Guide for IRAF (Anderson, 1993). The mkpkg file for DNSPIPE is shown below.
# Make DNSPIPE

$call relink
$exit

update:
        $call relink
        $call install
        ;

relink:
        $set LIBS = "$(LIBS) -liminterp -lgrutil"
        $update libpkg.a
        $omake x_dnspipe.x
        $link x_dnspipe.o libpkg.a $(OBJS) $(LIBS) -o xx_dnspipe.e
        ;

libpkg.a:
        t_dnspipe.x     <imhdr.h> <imset.h> <math.h> <math/iminterp.h> \
                        "dnspipe.h"
        dns_apodize.x   <math.h> <mach.h> "dnspipe.h"
        dns_dt.x
        dns_ephem.x     <math.h>
        dns_filter.x    "dnspipe.h"
        dns_initsht.x   "dnspipe.h"
        dns_interp.x    <math.h> <math/iminterp.h> "dnspipe.h"
        dns_latlong.x   <math.h> "dnspipe.h"
        dns_reorder.x   "dnspipe.h"
        dns_sht.x       <math.h> "dnspipe.h"
        dns_shtall.x    <math.h> "dnspipe.h"
        dns_outsht.x    <math.h> "dnspipe.h"
        dns_spigot.x    <imhdr.h> <imset.h> "dnspipe.h"
        dns_transpose.x
        dns_tseries.x   <imhdr.h> <imset.h> <mwset.h> "dnspipe.h"
        dns_turret.x    <imhdr.h> <math.h>
        dns_wwlegtran.x <math.h> "dnspipe.h"
        dns_wwlegall.x  <math.h> "dnspipe.h"
        Bessel_i0.x
        Dolph_Cheby.x   <math.h>
        fftrc.f
        fourt.f
        ;

install:
        $move xx_dnspipe.e graspbin$x_dnspipe.e
        ;

debug:
        $set XFLAGS = "$(XFLAGS) -c -z -F -g -q"
        $set LFLAGS = "$(LFLAGS) -z -g -q"
        $call relink
        ;

memcheck:
        $set XFLAGS = "$(XFLAGS) -z"
        $set LFLAGS = "$(LFLAGS) -z"
        $set OBJS = "/lac2/iraf/extern/memcheck/memcheck.o /lac2/iraf/extern/memcheck/zzdbg.o /usr/lib/debug/malloc.o /usr/lib/debug/mallocmap.o"
        $call relink
        ;

listing:
        !lpr -Pversatec t_dnspipe.x dns_*.x dnspipe.h
        ;
DMAC Operator Instructions
The following is the general procedure to be followed by DMAC operators when reducing site day data.
% cl
cl> grasp
gr> pipeline
pi>
pi> !tar -vxbf 126 /dev/nrst9        # Get .toc and .lbl file
pi> edit v000100.toc                 # Find tape file numbers for velocity data
                                     # Normally, this data starts at file 2 and
                                     # is sequential on tape
pi> !mt -f /dev/rst9 rew
pi> mkdir BByyddmm                   # Create a directory using UT date of data
pi> cd BByyddmm                      # Go to that directory
pi> !mt -f /dev/rst9 fsf 1           # Skip to proper file on tape
pi> !tar -vxbf 126 /dev/nrst9        # Extract the FITS files
pi> rfits *.fits 1 data datat=r old+ # Convert to IRAF images
pi> delete *.fits                    # Delete the FITS files
pi> cd ..
pi> mkdir BByyddmm                   # Create directory for next day
pi> !tar -vxbf 126 /dev/nrst9        # Get past EOF of last tar file
pi> !tar -vxbf 126 /dev/nrst9        # Extract next data set
pi> rfits *.fits 1 data datat=r old+ # Convert to IRAF images
pi> delete *.fits                    # Delete the FITS files
...
pi> cd data_directory         # Go to data directory
pi> files *vci*.imh > inlist  # Prepare input list for DNSPIPE
pi> unlearn dnspipe           # Set DNSPIPE parameters to pipeline default
pi> dnspipe @inlist           # Execute
pi> imdelete *vci*.imh,*vdp*.imh
pi> mkdir tseries             # make directory for time series FITS files
pi> cd tseries
pi> files ../*vdt*.imh > in
pi> copy in out
pi> !replace '../' '' out
pi> !replace 'imh' 'fits' out
pi> wfits @in @out
pi> delete in,out
pi> cd ..
pi> mkdir lnudiagrams         # make directory for L-Nu diagram FITS files
pi> cd lnudiagrams
pi> files ../*vdl*.imh > in
pi> copy in out
pi> !replace '../' '' out
pi> !replace 'imh' 'fits' out
pi> wfits @in @out
pi> delete in,out
pi> cd ..
pi> imdelete *.imh
****The Global Oscillation Network Group (GONG) is a community-based
project funded principally by the National Science Foundation and administered
by the National Solar Observatory. NSO is a division of the National Optical
Astronomy Observatories, operated by the Association of Universities for Research
in Astronomy, Inc. under a cooperative agreement with the National Science Foundation.
This page was last updated by David Landy, on Jan 6, 1999.