DataStore

class gammapy.data.DataStore(hdu_table=None, obs_table=None)[source]

Bases: object

IACT data store.

Data selection and access happen via an observation index file and an HDU index file, as described at IACT data storage.

See the cta_1dc_introduction tutorial for usage examples.

Parameters:
hdu_table : HDUIndexTable

HDU index table

obs_table : ObservationTable

Observation index table

Examples

Here’s an example of how to create a DataStore to access H.E.S.S. data:

>>> from gammapy.data import DataStore
>>> data_store = DataStore.from_dir('$GAMMAPY_DATA/hess-dl3-dr1')
>>> data_store.info()

Attributes Summary

DEFAULT_HDU_TABLE Default HDU table filename.
DEFAULT_OBS_TABLE Default observation table filename.

Methods Summary

check(self[, checks]) Check index tables and data files.
copy_obs(self, obs_id, outdir[, hdu_class, …]) Create a new DataStore containing a subset of observations.
from_config(config) Create from a config dict.
from_dir(base_dir[, hdu_table_filename, …]) Create from a directory.
from_events_files(paths) Create from a list of event filenames.
from_file(filename[, hdu_hdu, hdu_obs]) Create from a FITS file.
get_observations(self[, obs_id, skip_missing]) Generate an Observations container.
info(self[, show]) Print summary information about the data store.
obs(self, obs_id) Access a given DataStoreObservation.

Attributes Documentation

DEFAULT_HDU_TABLE = 'hdu-index.fits.gz'

Default HDU table filename.

DEFAULT_OBS_TABLE = 'obs-index.fits.gz'

Default observation table filename.

Methods Documentation

check(self, checks='all')[source]

Check index tables and data files.

This is a generator that yields check results as dicts.
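For example, you can loop over the yielded dicts directly or collect them in a list. A minimal sketch using the H.E.S.S. DR1 store from above (the exact dict keys depend on which checks are run):

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# check() yields one dict per performed check; inspect them one by one
for result in data_store.check():
    print(result)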

copy_obs(self, obs_id, outdir, hdu_class=None, verbose=False, overwrite=False)[source]

Create a new DataStore containing a subset of observations.

Parameters:
obs_id : array-like, ObservationTable

List of observations to copy

outdir : str, Path

Directory for the new store

hdu_class : list of str

See gammapy.data.HDUIndexTable.VALID_HDU_CLASS for the list of valid values.

verbose : bool

Print copied files

overwrite : bool

Overwrite existing files in the output directory.
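For example, to copy a subset of observations into a new directory (a minimal sketch; the observation IDs and output directory are illustrative):

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Copy two runs (and the matching index entries) into a new data store directory
data_store.copy_obs(obs_id=[23523, 23526], outdir="hess-crab-subset", overwrite=True)
new_store = DataStore.from_dir("hess-crab-subset")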

classmethod from_config(config)[source]

Create from a config dict.
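A minimal sketch, assuming the config dict takes a "base_dir" key pointing to the data directory (check the source for the exact supported keys):

from gammapy.data import DataStore

# "base_dir" is assumed here; the index table filenames fall back to the defaults
config = {"base_dir": "$GAMMAPY_DATA/hess-dl3-dr1"}
data_store = DataStore.from_config(config)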

classmethod from_dir(base_dir, hdu_table_filename=None, obs_table_filename=None)[source]

Create from a directory.

Parameters:
base_dir : str, Path

Base directory of the data files.

hdu_table_filename : str, Path

Filename of the HDU index file. May be specified either relative to base_dir or as an absolute path. If None, the default filename will be looked for.

obs_table_filename : str, Path

Filename of the observation index file. May be specified either relative to base_dir or as an absolute path. If None, the default filename will be looked for.
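For example, passing the index table filenames explicitly (equivalent to the defaults when the standard names are used):

from gammapy.data import DataStore

data_store = DataStore.from_dir(
    base_dir="$GAMMAPY_DATA/hess-dl3-dr1",
    hdu_table_filename="hdu-index.fits.gz",
    obs_table_filename="obs-index.fits.gz",
)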

classmethod from_events_files(paths)[source]

Create from a list of event filenames.

HDU and observation index tables will be created from the EVENTS header.

IRFs are found only if you have a CALDB environment variable set, and if the EVENTS files contain the following keys:

  • TELESCOP (example: TELESCOP = CTA)
  • CALDB (example: CALDB = 1dc)
  • IRF (example: IRF = South_z20_50h)

This method is especially useful if you want to load data simulated with ctobssim.

Examples

This is how you can access a single event list:

from gammapy.data import DataStore
path = "$GAMMAPY_DATA/cta-1dc/data/baseline/gps/gps_baseline_110380.fits"
data_store = DataStore.from_events_files([path])
observations = data_store.get_observations()

You can now analyse this data as usual (see any Gammapy tutorial).

If you have multiple event files, you first have to build the list of paths. Here’s an example using Path.rglob to collect all event files in a given folder:

import os
from pathlib import Path
path = Path(os.environ["GAMMAPY_DATA"]) / "cta-1dc/data"
paths = list(path.rglob("*.fits"))
data_store = DataStore.from_events_files(paths)
observations = data_store.get_observations()

Note that you have a lot of flexibility in selecting the observations you want: write a few lines of custom code to prepare the paths, or select a subset via methods on the data_store or observations objects.

If you want to generate HDU and observation index files, write the tables to disk:

data_store.hdu_table.write("hdu-index.fits.gz")
data_store.obs_table.write("obs-index.fits.gz")

classmethod from_file(filename, hdu_hdu='HDU_INDEX', hdu_obs='OBS_INDEX')[source]

Create from a FITS file.

The FITS file must contain both index files.

Parameters:
filename : str, Path

FITS filename

hdu_hdu : str or int

FITS HDU name or number for the HDU index table

hdu_obs : str or int

FITS HDU name or number for the observation index table
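For example, given a single FITS file that bundles both index tables (the filename below is a placeholder; the HDU names shown are the defaults):

from gammapy.data import DataStore

# "my-index.fits.gz" is a hypothetical file containing both index tables
data_store = DataStore.from_file("my-index.fits.gz", hdu_hdu="HDU_INDEX", hdu_obs="OBS_INDEX")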

get_observations(self, obs_id=None, skip_missing=False)[source]

Generate an Observations container.

Parameters:
obs_id : list

Observation IDs (default of None means “all”)

skip_missing : bool, optional

Skip missing observations, default: False

Returns:
observations : Observations

Container holding a list of DataStoreObservation
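For example, selecting two runs from the H.E.S.S. DR1 store shown above (a minimal sketch; the observation IDs are illustrative, and skip_missing=True silently drops IDs that are not in the index):

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
# Omit obs_id to get all observations in the data store
observations = data_store.get_observations(obs_id=[23523, 23526], skip_missing=True)
print(len(observations))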

info(self, show=True)[source]

Print summary information about the data store.

obs(self, obs_id)[source]

Access a given DataStoreObservation.

Parameters:
obs_id : int

Observation ID.

Returns:
observation : DataStoreObservation

Observation container
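For example, to access a single observation (a minimal sketch; the observation ID is illustrative, and the events attribute is assumed to give lazy access to the event list):

from gammapy.data import DataStore

data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
observation = data_store.obs(obs_id=23523)
print(observation.events)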