XMM-Newton
Science Analysis System: User Guide
6.7 Analysing OM data
As has already been pointed out, OM data taken in image mode and in fast mode with the normal filters are fully processed by the SAS pipeline. All data, including the grisms, can be run through the corresponding chain: for each
exposure of a given observation, all necessary corrections are applied to
the data files. Then a source detection algorithm is used to identify the
sources present in the image. Standard aperture photometry is applied to
obtain the count rates for all detected sources. These rates are corrected
for coincidence losses, dead time and time dependent sensitivity degradation
of the detector, and finally OM instrumental
magnitudes and standard colour corrections are computed as well as absolute
fluxes. A final source
list is obtained from all exposures and filters. The detector geometric
distortion correction and astrometric corrections
are applied to each source's position, and the whole image is converted to sky coordinates and rotated so that North is at the top.
Default image windows are also combined to obtain a mosaic (per filter)
of the FOV. In the case of grism data, the spectra are searched for, extracted
and calibrated in wavelength and absolute flux. Astrometry is also performed
on grism data to compute the astronomical coordinates of the sources whose
spectra have been extracted.
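As an illustration of the final photometric step described above, the conversion from a corrected count rate to an instrumental magnitude follows the usual logarithmic form. The sketch below uses a placeholder zero point; the actual per-filter zero points are supplied by the OM photometric calibration.

```python
import math

def instrumental_magnitude(corrected_rate, zero_point=17.9):
    """Convert a coincidence-loss-corrected count rate (counts/s) into an
    OM-style instrumental magnitude.  The zero point here is a placeholder,
    not a calibrated OM value; the SAS calibration supplies per-filter ones."""
    return zero_point - 2.5 * math.log10(corrected_rate)

# A source ten times brighter is 2.5 magnitudes brighter (a smaller number).
m_faint = instrumental_magnitude(1.0)
m_bright = instrumental_magnitude(10.0)
```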
However, one has to be aware of some peculiarities of OM data before starting
the analysis.
- Artifacts
OM images show several artifacts. These artifacts are generally only visible when viewing the OM images with high contrast and a logarithmic display (see the dedicated OM section of the UHB for some example images). Artifacts can nevertheless affect the accuracy of a
measurement, e.g. by increasing the background level. The following
artifacts may be present in the raw OM images.
- Out of time events: very bright sources with count rates of
several tens of counts per second show a strip of events along the
readout direction. These events correspond to out of time photons
arriving while the detector is read out.
- "Smoke rings": bright sources generate smoke ring like structures,
which are located radially away from the centre of the field of
view. These rings are caused by photons which are reflected off the
detector entrance window back onto the detector photocathode.
- Fixed pattern noise: raw OM images show a modulo 8 regular pattern
originating from imperfections of the event centroiding algorithm in
the instrument electronics. This modulo 8 pattern is removed during image
analysis by the task ommodmap.
- Straylight features: straylight is caused by a chamfer in
the OM detector housing which reflects light from outside the OM FOV onto
the active detector area. This reflected celestial background light
adds up in a circular area of enhanced background rate at the centre of the OM field of view. The light coming from bright
sources and reflected by the chamfer can also produce loop like
structures in an OM image. These loops can degenerate into long
streaks depending on the source positions.
- Grism Data
The complexity of grism data already mentioned is increased by the possible presence of the artifacts described above (see § 6.4.4). Great care has to be taken when analysing
the output products of grism data processing with SAS.
The following points should also be kept in mind:
- The OM operates in photon counting mode but images are accumulated
on board. Good time intervals (GTI) therefore have no meaning for OM data: either the full exposure is selected or it is discarded.
- Contrary to the X-ray instruments, an OM exposure does not provide
direct energy information except when grisms are selected.
- As it has been explained, a flatfield response of unity is assumed.
- Coincidence losses can occur which depend on the source brightness
and on the CCD frame rate. The frame rate itself depends on the selected
configuration of the detector science and tracking windows. If two or
more events are located close to each other within the same CCD readout
frame, they are detected and counted as a single event. In addition, when photon splashes overlap, the centroiding algorithm assigns an incorrect position. Another consequence of coincidence losses is event depletion, which occurs around the position of the central source at high count rates.
- Corrections for coincidence losses and for detector deadtime are
applied when aperture photometry is performed on the detected sources. These
corrections are not applied to any of the produced images.
- The OM filters do not form a proper photometric system. However the
photometric calibration of the instrument, based on observations with OM and
from the ground, allows the SAS to obtain standard U, B, V magnitudes and
colours in the Johnson system. (This applies to stars. Extended objects should
be treated with care.) For the UV filters (UVW1, UVM2 and UVW2), AB magnitudes have been defined in addition to the instrumental system.
An absolute flux conversion is provided for all filters.
- The OM point spread function (PSF) has wide wings. This is taken into
account in the application of the calibration with SAS through the proper
setting of photometric apertures. A similar approach has to be taken if one
wants to make an independent processing and analysis of OM data using any
other data reduction package.
Straylight features, already mentioned, complicate the background subtraction in some cases, especially when the target star is located in or close to a straylight feature.
- Imaging and fast mode data are corrected for time sensitivity
degradation, so that data of a given source (count rates, photometry) obtained
at different epochs can be compared. Grism data processing does not correct for
time sensitivity degradation.
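The coincidence-loss behaviour described above can be illustrated with the leading-order pile-up expression for a frame-based photon counter. This is only a sketch: the actual SAS correction also includes dead-time and empirical calibration terms, and the frame time value used here is illustrative.

```python
import math

def coincidence_loss_correct(raw_rate, frame_time):
    """Leading-order pile-up correction: if photons arriving within the same
    CCD frame are counted as one event, the measured rate r relates to the
    true rate R by r = (1 - exp(-R * ft)) / ft, which inverts as below.
    The real SAS correction adds dead-time and empirical terms."""
    x = raw_rate * frame_time          # mean detected events per frame
    return -math.log(1.0 - x) / frame_time

ft = 0.011                             # illustrative frame time in seconds
faint = coincidence_loss_correct(0.1, ft)    # barely changed
bright = coincidence_loss_correct(30.0, ft)  # noticeably boosted
```

The correction is negligible for faint sources but grows rapidly as the mean number of events per frame approaches one.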
In principle, after SAS has been run there is no need for further data
reduction. The observation
source list should contain the calibrated data with their errors. Therefore,
the user can proceed with the analysis and interpretation of the processed
data. However,
some checking is recommended to verify the consistency of the data output.
The whole data processing can easily be repeated by a Guest Observer or XSA user, should any calibration file be updated or, more importantly, if any results look doubtful. The processing chains, or the pipeline, apply default options in all SAS tasks; these can be changed by the GO in order to improve the quality of the results. In particular, the source detection is very sensitive to the artifacts that are common in OM images.
What follows outlines the checks that a user should perform on processed OM data (produced by the standard Pipeline or by running SAS), and the use of one task, omdetect, in which the user can modify parameters affecting the source detection and therefore the overall results of the data analysis.
- Checking omichain output products:
The first thing to do is to overlay the image source list onto the sky image. This can be achieved with implot, which will overplot the detected sources on the corresponding image and allow one to check that all sources are real and that the source(s) of interest have been properly identified. A newer task, srcdisplay, does the same job and is friendlier to use.
If the REGION files
produced by omdetect have been preserved (in a manual run of the chain or the task), then ds9 can also be used to plot the detections
the task), then ds9 can also be used to plot the detections
on top of the image.
If the background is strongly affected by straylight features, this
check is very important.
Alternatively, the task omsource can be used for this check. In that case, the results obtained can be modified by redoing the photometry interactively.
- Inspection of the combined source list (e.g. using fv) allows one to check that the sources of interest have been picked up in all filters in which they are visible, and that the combined list contains colours and standard magnitudes for them.
- Special care should be taken in examining the quality flags assigned to
all detected sources. Their meaning is described in detail in the SAS on-line
documentation (task omdetect).
- Check the tracking corrections: although the pointing stability of
XMM-Newton is very good, one can verify it by examining the corresponding
tracking history PDF file.
- When there are several consecutive exposures in a given filter, different
values of the frametime parameter (e.g. in the default image mode, where each of the 5 exposures has a different frametime) can introduce a
variation of a few percent in the count rate from one exposure to the next.
This is due to the coincidence loss correction, which depends strongly on the
frametime. The resulting jumps in average count rate may be more severe
for bright sources.
When there are several exposures with the same filter, omichain combines the corresponding images and then attempts a deeper detection in the co-added image. New sources may appear as a consequence of the increased S/N. However, these sources do not have a proper coincidence loss correction. Since the newly detected sources are very faint, the error is only a few percent. These sources are flagged in the final merged list (OBSMER file).
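The frame-time dependence behind the count-rate jumps mentioned above can be sketched with the pile-up relation for a frame-based photon counter. The frame-time values below are illustrative, not actual OM configuration values.

```python
import math

def observed_rate(true_rate, frame_time):
    """Raw (piled-up) rate measured for a source of the given true rate,
    when photons in the same frame are counted as one event."""
    return (1.0 - math.exp(-true_rate * frame_time)) / frame_time

# Two consecutive exposures of the same source taken with different frame
# times (illustrative values) measure different raw rates; an imperfect
# coincidence-loss correction then leaves a small jump between exposures.
true_rate = 50.0
r1 = observed_rate(true_rate, 0.011)
r2 = observed_rate(true_rate, 0.022)
jump_percent = 100.0 * abs(r1 - r2) / r1
```

As the sketch shows, the discrepancy between exposures grows with source brightness, which is why the jumps are more severe for bright sources.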
- Checking omfchain output products:
- Inspection of the light curves:
One can look at the PDF files containing the light curves to check that
there is some signal detected and measured (both in the source and in the
background).
- Checking the presence and centering of the source(s) in the fast window pseudo-image. This can be done easily by displaying this image with SAOImage (ds9) or fv. Another possibility is to overlay the detected sources onto the pseudo-image using the task implot or ds9.
- Using bkgfromimage=yes in omichain (or omlcbuild if running step by step) will give better results if the source is bright or if there is more than one source in the fast mode window. This is the default since SAS 12.
- Jumps due to the coincidence loss correction. A fast mode observation consists of a series of exposures with one or several filters. As discussed for the default image mode, the different frametime values in consecutive exposures can produce jumps in the average count rate from one exposure to the next.
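The role of the background estimate in a fast-mode light curve can be sketched as a simple aperture subtraction per time bin. The function and its argument names are illustrative; the SAS task omlcbuild applies further corrections (coincidence loss, dead time, sensitivity degradation).

```python
def net_count_rate(src_counts, bkg_counts, src_area, bkg_area, exposure):
    """Background-subtracted source rate for one time bin of a fast-mode
    light curve: scale the background counts to the source aperture area
    before subtracting.  A simplified sketch, not the SAS implementation."""
    scaled_bkg = bkg_counts * (src_area / bkg_area)
    return (src_counts - scaled_bkg) / exposure

rate = net_count_rate(src_counts=120, bkg_counts=200,
                      src_area=50.0, bkg_area=500.0, exposure=10.0)
```

A background region contaminated by a second source in the fast window, or by straylight, biases every bin of the light curve, which is why the checks above matter.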
- Checking omgchain output products:
- Checking the detection of zero and first orders
The correct correlation of the zero and first orders of the spectra is essential. It can be verified by displaying the rotated image and overlaying on it the detections from the SPCREG file with ds9. From SAS 8.0 onwards, a picture of the grism image with the extraction regions superimposed is produced. The SPECLI source list also indicates these correlations. If the user wants to analyse all detections, then the REGION and SWSRLI files should be examined.
- Position of zero order
The position of the zero order (centroid) is the zero point of the wavelength
scale. It should be verified, e.g. using ds9 as pointed out before.
- Identifying the spectra.
The final spectra, present in the SPECTR.FIT file, can be identified as noted above by overlaying the SPCREG file on the rotated image or by looking at the produced .PS file. In addition,
the astronomical coordinates of the sources showing spectra are computed
when running omgchain.
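The importance of the zero-order position can be seen from a minimal sketch of a wavelength scale, assuming a linear dispersion relation anchored on the zero-order centroid. The offset and dispersion coefficients below are placeholders, not real OM grism calibration values.

```python
def pixel_to_wavelength(x, x_zero_order, offset, dispersion):
    """Map a detector column x to a wavelength, assuming a linear dispersion
    relation anchored on the zero-order centroid.  'offset' and 'dispersion'
    are placeholder calibration coefficients."""
    return offset + dispersion * (x - x_zero_order)

# An error in the measured zero-order centroid shifts every wavelength by
# the same amount, which is why its position should be checked.
w_good = pixel_to_wavelength(700.0, x_zero_order=500.0, offset=1800.0, dispersion=2.5)
w_bad  = pixel_to_wavelength(700.0, x_zero_order=504.0, offset=1800.0, dispersion=2.5)
```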
- Improving the source detection:
- For image data, if the source of interest is close to straylight features or to other sources, it may not be detected with the default settings of the omdetect task. Two parameters may have to be changed: nsigma and minsignificance (see the SAS documentation, or run omdetect -h for details):
omdetect set=your_image.fit outset=your_sourcelist.fit \
regionfile=your_region.dat nsigma=p minsignificance=q
where your_image.fit should have been produced by ommodmap. The default values for nsigma and minsignificance are 2 and 1, respectively.
Invoking omdetect with regionfile=your_region.dat
will allow a fast checking by overlaying the newly detected source
positions on the image with ds9 using the created region file.
It may be easier to use the task omsource interactively to improve
the photometry of any detected source, to identify new undetected sources
and add them to the list, or even to repeat completely the detection and
photometry in the whole image.
- In the case of fast mode, if there is more than one source in the fast window, the detection will be affected, and therefore the whole light curve as well. Some parameters
have to be changed in omdetect as in the case of image mode
(see the online SAS documentation, or run omdetect -h
for details).
Invoking omdetect with regionfile=your_region.dat
will allow a quick check by overlaying the newly detected sources
positions on the image with ds9 using the created region file.
- On grism data, as with normal image data, omdetect can be used
to improve the detection. However, it is recommended to use omgsource
to select the spectra interactively.
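When a region file is not available, one can be generated from a list of detected positions in the standard ds9 region format, so that the detections can be overlaid on the image as described above. The circle radius below is an arbitrary choice.

```python
def ds9_region_lines(sources, radius_arcsec=5.0):
    """Build the lines of a ds9 region file (fk5 circles) from a list of
    (ra, dec) pairs in degrees, for overlaying detections on an image."""
    lines = ["# Region file format: DS9", "fk5"]
    for ra, dec in sources:
        lines.append('circle(%.6f,%.6f,%.1f")' % (ra, dec, radius_arcsec))
    return "\n".join(lines)

regions = ds9_region_lines([(10.684708, 41.268750), (10.700000, 41.300000)])
```

Writing the returned string to a file and loading it in ds9 alongside the image reproduces the overlay check described for the REGION files.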
European Space Agency - XMM-Newton Science Operations Centre