DataStore

class gammapy.data.DataStore(hdu_table=None, obs_table=None)[source]

Bases: object

IACT data store.

Data selection and access happen using an observation index file and an HDU index file, as described at Data storage.

For a usage example, see the CTA tutorial (cta.html).

Parameters
hdu_table : HDUIndexTable

HDU index table

obs_table : ObservationTable

Observation index table

Examples

Here’s an example of how to create a DataStore to access H.E.S.S. data:

>>> from gammapy.data import DataStore
>>> data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1')
>>> data_store.info()
Data store:
HDU index table:
BASE_DIR: /home/runner/work/gammapy/gammapy/gammapy-datasets/hess-dl3-dr1
Rows: 630
OBS_ID: 20136 -- 47829
HDU_TYPE: ['aeff', 'bkg', 'edisp', 'events', 'gti', 'psf']
HDU_CLASS: ['aeff_2d', 'bkg_3d', 'edisp_2d', 'events', 'gti', 'psf_table']


Observation table:
Observatory name: 'N/A'
Number of observations: 105

Attributes Summary

DEFAULT_HDU_TABLE

Default HDU table filename.

DEFAULT_OBS_TABLE

Default observation table filename.

Methods Summary

check([checks])

Check index tables and data files.

copy_obs(obs_id, outdir[, hdu_class, …])

Create a new DataStore containing a subset of observations.

from_dir(base_dir[, hdu_table_filename, …])

Create from a directory.

from_events_files(paths)

Create from a list of event filenames.

from_file(filename[, hdu_hdu, hdu_obs])

Create from a FITS file.

get_observations([obs_id, skip_missing, …])

Generate an Observations container.

info([show])

Print some info.

obs(obs_id)

Access a given Observation.

Attributes Documentation

DEFAULT_HDU_TABLE = 'hdu-index.fits.gz'

Default HDU table filename.

DEFAULT_OBS_TABLE = 'obs-index.fits.gz'

Default observation table filename.

Methods Documentation

check(checks='all')[source]

Check index tables and data files.

This is a generator that yields a list of dicts.
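
A minimal sketch of running the checks and printing each yielded record; the exact keys of each dict depend on the checks run and are not specified here:

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Iterate over the check results and print each record dict
for record in data_store.check():
    print(record)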

copy_obs(obs_id, outdir, hdu_class=None, verbose=False, overwrite=False)[source]

Create a new DataStore containing a subset of observations.

Parameters
obs_id : array-like, ObservationTable

List of observations to copy.

outdir : str, Path

Directory for the new store.

hdu_class : list of str

See gammapy.data.HDUIndexTable.VALID_HDU_CLASS.

verbose : bool

Print copied files.

overwrite : bool

Overwrite existing files.
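
As an illustrative sketch (the observation IDs and output directory below are assumptions, not values prescribed by the API), this copies two runs into a new directory and opens the resulting store:

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Copy two runs (IDs chosen for illustration) into a new data store directory
data_store.copy_obs([23523, 23526], outdir="./hess-crab-subset", overwrite=True)
new_store = DataStore.from_dir("./hess-crab-subset")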

classmethod from_dir(base_dir, hdu_table_filename=None, obs_table_filename=None)[source]

Create from a directory.

Parameters
base_dir : str, Path

Base directory of the data files.

hdu_table_filename : str, Path

Filename of the HDU index file. May be specified either relative to base_dir or as an absolute path. If None, the default filename will be looked for.

obs_table_filename : str, Path

Filename of the observation index file. May be specified either relative to base_dir or as an absolute path. If None, the default filename will be looked for.
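
As a sketch, the index filenames can also be given explicitly; here they are spelled out relative to base_dir and happen to match the defaults:

from gammapy.data import DataStore

data_store = DataStore.from_dir(
    "$GAMMAPY_DATA/hess-dl3-dr1",
    hdu_table_filename="hdu-index.fits.gz",
    obs_table_filename="obs-index.fits.gz",
)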

classmethod from_events_files(paths)[source]

Create from a list of event filenames.

HDU and observation index tables will be created from the EVENTS header.

IRFs are found only if you have a CALDB environment variable set, and if the EVENTS files contain the following keys:

  • TELESCOP (example: TELESCOP = CTA)

  • CALDB (example: CALDB = 1dc)

  • IRF (example: IRF = South_z20_50h)

This method is particularly useful if you want to load data simulated with ctobssim.

Examples

This is how you can access a single event list:

from gammapy.data import DataStore
path = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits"
data_store = DataStore.from_events_files([path])
observations = data_store.get_observations()

You can now analyse this data as usual (see any Gammapy tutorial).

If you have multiple event files, you have to build the list of paths yourself. Here’s an example using Path.rglob to get a list of all events files in a given folder:

import os
from pathlib import Path
path = Path(os.environ["GAMMAPY_DATA"]) / "cta-1dc/data"
paths = list(path.rglob("*.fits"))
data_store = DataStore.from_events_files(paths)
observations = data_store.get_observations()

Note that you have a lot of flexibility in selecting observations: a few lines of custom code can prepare the list of paths, or you can select a subset later via methods on the data_store or observations objects.

If you want to generate HDU and observation index files, write the tables to disk:

data_store.hdu_table.write("hdu-index.fits.gz")
data_store.obs_table.write("obs-index.fits.gz")
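
Because these are the default index filenames (see DEFAULT_HDU_TABLE and DEFAULT_OBS_TABLE above), a directory containing them can later be opened again with DataStore.from_dir.
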
classmethod from_file(filename, hdu_hdu='HDU_INDEX', hdu_obs='OBS_INDEX')[source]

Create from a FITS file.

The FITS file must contain both index tables.

Parameters
filename : str, Path

FITS filename

hdu_hdu : str or int

FITS HDU name or number for the HDU index table

hdu_obs : str or int

FITS HDU name or number for the observation index table

Returns
data_store : DataStore

Data store
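
A minimal sketch, assuming a FITS file that bundles both index tables as HDUs named 'HDU_INDEX' and 'OBS_INDEX'; the filename below is a placeholder, not a file shipped with Gammapy:

from gammapy.data import DataStore

# "index.fits.gz" is a hypothetical file containing both index tables
data_store = DataStore.from_file("index.fits.gz")
data_store.info()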

get_observations(obs_id=None, skip_missing=False, required_irf='all')[source]

Generate an Observations container.

Parameters
obs_id : list

Observation IDs (default of None means “all”)

skip_missing : bool, optional

Skip missing observations, default: False

required_irf : list of str

Runs will be added to the list of observations only if the required IRFs are present; otherwise, the given run will be skipped. Available options are:

  • aeff : Effective area

  • bkg : Background

  • edisp : Energy dispersion

  • psf : Point spread function

By default, all the IRFs are required.

Returns
observations : Observations

Container holding a list of Observation
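
A hedged sketch of selecting a subset of runs while relaxing the IRF requirement; the observation IDs are illustrative assumptions, not required values:

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Only require effective area and energy dispersion; runs missing other IRFs are still loaded
observations = data_store.get_observations(
    [23592, 23559], required_irf=["aeff", "edisp"], skip_missing=True
)
print(len(observations))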

info(show=True)[source]

Print some info.

obs(obs_id)[source]

Access a given Observation.

Parameters
obs_id : int

Observation ID.

Returns
observation : Observation

Observation container
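
For example, accessing a single run by ID; the ID below is an assumption based on the H.E.S.S. DR1 data used in the examples above:

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Access one run by its observation ID (illustrative value)
observation = data_store.obs(23523)
print(observation)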