XMM-Newton Science Analysis System: User Guide




6.7 Analysing OM data

As already pointed out, OM data taken in image mode and in fast mode with the normal filters are fully processed by the SAS pipeline. All data, including the grisms, can be run through the corresponding chain: for each exposure of a given observation, all necessary corrections are applied to the data files. A source detection algorithm then identifies the sources present in the image, and standard aperture photometry is applied to obtain the count rates of all detected sources. These rates are corrected for coincidence losses, dead time and the time-dependent sensitivity degradation of the detector, and finally OM instrumental magnitudes, standard colour corrections and absolute fluxes are computed. A final source list is compiled from all exposures and filters. The detector geometric distortion and astrometric corrections are applied to each source position, and the whole image is converted to sky coordinates and rotated so that North points up. Default image windows are also combined into a mosaic (per filter) of the field of view. In the case of grism data, the spectra are located, extracted and calibrated in wavelength and absolute flux. Astrometry is also performed on grism data to compute the astronomical coordinates of the sources whose spectra have been extracted.
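
As a minimal sketch of how this processing can be reproduced interactively, assuming the SAS environment has been initialised and SAS_ODF points to the ODF directory (the exact set-up steps depend on the local installation; see the SAS startup documentation):

  cifbuild                  # create the calibration index file (ccf.cif)
  export SAS_CCF=ccf.cif    # point the SAS to the newly created CIF
  odfingest                 # generate the ODF summary file
  omichain                  # process all image mode exposures
  omfchain                  # process all fast mode exposures
  omgchain                  # process all grism exposures

Each chain applies the default options of the underlying SAS tasks; as discussed below, these defaults may need to be changed in problematic cases.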

However, one has to be aware of some peculiarities of OM data before starting the analysis. In particular, the following points should be kept in mind:

  1. The OM operates in photon counting mode, but images are accumulated on board. Good time intervals (GTIs) therefore have no meaning for OM data: either the full exposure is selected, or it is discarded.

  2. Unlike the X-ray instruments, an OM exposure provides no direct energy information, except when the grisms are used.

  3. As explained earlier, a flat-field response of unity is assumed.

  4. Coincidence losses can occur; they depend on the source brightness and on the CCD frame rate, which in turn depends on the selected configuration of the detector science and tracking windows. If two or more events fall close to each other within the same CCD readout frame, they are detected and counted as a single event. When photon splashes overlap, the centroiding algorithm also assigns a wrong position. A further consequence of coincidence losses is event depletion around the position of a bright source at high count rates.

  5. Corrections for coincidence losses and for detector dead time are applied when aperture photometry is performed on the detected sources (see the sketch after this list). These corrections are not applied to any of the produced images.

  6. The OM filters do not form a proper photometric system. However, the photometric calibration of the instrument, based on observations made with OM and from the ground, allows the SAS to derive standard U, B, V magnitudes and colours in the Johnson system. (This applies to stars; extended objects should be treated with care.) For the UV filters (UVW1, UVM2 and UVW2), AB magnitudes have been defined in addition to the instrumental system (see the definition after this list). An absolute flux conversion is provided for all filters.

  7. The OM point spread function (PSF) has wide wings. The SAS takes this into account when applying the calibration, through the proper setting of the photometric apertures. A similar approach has to be taken if one wants to process and analyse OM data independently with any other data reduction package.

  8. Straylight features, already mentioned, complicate the background subtraction in some cases, especially when the target star is located in or close to a straylight feature.

  9. Imaging and fast mode data are corrected for the time-dependent sensitivity degradation, so that data of a given source (count rates, photometry) obtained at different epochs can be compared. Grism data processing does not apply this correction.
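
To illustrate point 4 above, the purely statistical part of the coincidence-loss correction follows from Poisson statistics: if photons arrive at a true rate $C_{\mathrm{in}}$ and at most one event can be registered at a given position per CCD frame of duration $t_{f}$, the detected rate is $C_{\mathrm{det}} = (1 - e^{-C_{\mathrm{in}} t_{f}})/t_{f}$, which inverts to

\[
  C_{\mathrm{in}} = \frac{-\ln\left(1 - C_{\mathrm{det}}\, t_{f}\right)}{t_{f}} .
\]

This is only a sketch of the statistical core: the correction actually applied by the SAS also includes the dead-time factor and empirically calibrated terms.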
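
For reference on point 6, the AB magnitude system is defined directly from the monochromatic flux density $f_{\nu}$, expressed in erg s$^{-1}$ cm$^{-2}$ Hz$^{-1}$:

\[
  m_{\mathrm{AB}} = -2.5 \log_{10} f_{\nu} - 48.60 ,
\]

so that a flat-spectrum source with $f_{\nu} = 3631$ Jy has $m_{\mathrm{AB}} = 0$ in every filter.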

In principle, after the SAS has been run there is no need for further data reduction. The observation source list should contain the calibrated data with their errors, so the user can proceed directly with the analysis and interpretation of the processed data. Nevertheless, some checks are recommended to verify the consistency of the output.

The whole data processing can easily be repeated by a Guest Observer (GO) or XSA user, should any calibration file be updated or, more importantly, if any results appear doubtful. The processing chains, like the pipeline, apply default options in all SAS tasks; these defaults can be changed by the GO in order to improve the quality of the results. In particular, the source detection is very sensitive to the artifacts that are common in OM images.

In what follows we outline the checks that a user should perform on OM data processed by the standard pipeline or by running the SAS, and describe the use of one of the tasks, omdetect, whose parameters the user can modify to tune the source detection and therefore the overall results of the data analysis.

  1. Checking omichain output products:

  2. Checking omfchain output products:

  3. Checking omgchain output products:

  4. Improving the source detection (see the example after this list):

  5. For grism data, as with normal image data, omdetect can be used to improve the detection. However, it is recommended to use omgsource to select the spectra interactively.
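
As an example for point 4, the detection can be re-run on a processed sky image with a modified significance threshold. The following is a minimal sketch: the input and output file names are hypothetical, and the threshold parameter shown (nsigma, the detection limit in units of the background standard deviation) should be checked against the current omdetect documentation:

  omdetect set=P0123456789OMS004SIMAGE1000.FIT \
           outset=my_sourcelist.fit \
           nsigma=3

Lowering the threshold recovers fainter sources at the cost of more spurious detections, in particular near the straylight features and other artifacts mentioned above; raising it has the opposite effect.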


