MapDatasetOnOff

class gammapy.cube.MapDatasetOnOff(models=None, counts=None, counts_off=None, acceptance=None, acceptance_off=None, exposure=None, mask_fit=None, psf=None, edisp=None, background_model=None, name='', evaluation_mode='local', mask_safe=None, gti=None)[source]

Bases: gammapy.cube.MapDataset

Map dataset for on-off likelihood fitting.

Parameters
models : SkyModels

Source sky models.

counts : WcsNDMap

Counts cube.

counts_off : WcsNDMap

Ring-convolved counts cube.

acceptance : WcsNDMap

Acceptance from the IRFs.

acceptance_off : WcsNDMap

Acceptance off.

exposure : WcsNDMap

Exposure cube.

mask_fit : ndarray

Mask to apply to the likelihood for fitting.

psf : PSFKernel

PSF kernel.

edisp : EnergyDispersion

Energy dispersion.

background_model : BackgroundModel

Background model to use for the fit.

evaluation_mode : {“local”, “global”}

Model evaluation mode. The “local” mode evaluates the model components on smaller grids to save computation time. This mode is recommended for local optimization algorithms. The “global” mode evaluates the model components on the full map. This mode is recommended for global optimization algorithms.

mask_safe : ndarray

Mask defining the safe data range.

gti : GTI

GTI of the observation, or the union of GTIs if it is a stacked observation.
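
Examples

A minimal construction sketch. All map values below are illustrative toy numbers, not defaults; in a real analysis the maps come from a data-reduction chain (e.g. a ring background estimation):

    from gammapy.maps import WcsGeom, MapAxis, Map
    from gammapy.cube import MapDatasetOnOff

    # Toy geometry: 2 deg x 2 deg around the Crab nebula, 3 energy bins.
    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])

    # Toy maps standing in for the output of a real data reduction.
    counts = Map.from_geom(geom)
    counts.data += 2.0
    counts_off = Map.from_geom(geom)
    counts_off.data += 10.0
    acceptance = Map.from_geom(geom)
    acceptance.data += 1.0
    acceptance_off = Map.from_geom(geom)
    acceptance_off.data += 5.0

    dataset = MapDatasetOnOff(
        counts=counts,
        counts_off=counts_off,
        acceptance=acceptance,
        acceptance_off=acceptance_off,
        name="toy-onoff",
    )
    print(dataset.alpha.data.mean())  # acceptance / acceptance_off = 0.2
    print(dataset.excess.data.sum())  # counts - alpha * counts_off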

Attributes Summary

alpha

Exposure ratio between signal and background regions

background

Predicted background in the on region.

data_shape

Shape of the counts or background data (tuple)

excess

Excess (counts - alpha * counts_off)

likelihood_type

mask

Combined fit and safe mask

models

Models (SkyModels).

parameters

List of parameters (Parameters)

tag

Methods Summary

copy(self)

A deep copy.

create(geom[, energy_axis_true, migra_axis, …])

Create a MapDataset object with zero filled maps.

cutout(self, position, width[, mode])

Cutout map dataset.

fake(self, background_model[, random_state])

Simulate fake counts (on and off) for the current model and reduced IRFs.

from_dict(data, components, models)

Create from dicts and models list generated from YAML serialization.

from_geoms(geom, geom_exposure, geom_psf, …)

Create a MapDatasetOnOff object with zero filled maps according to the specified geometries

from_hdulist(hdulist[, name])

Create map dataset from list of HDUs.

npred(self)

Predicted source and background counts (Map).

plot_residuals(self[, method, …])

Plot spatial and spectral residuals.

read(filename[, name])

Read map dataset from file.

residuals(self[, method])

Compute residuals map.

stack(self, other)

Stack another dataset in place.

stat_array(self)

Likelihood per bin given the current model parameters

stat_sum(self)

Total likelihood given the current model parameters.

to_dict(self[, filename])

Convert to dict for YAML serialization.

to_hdulist(self)

Convert map dataset to list of HDUs.

to_image(self[, spectrum])

Create images by summing over the energy axis.

to_spectrum_dataset(self, on_region[, …])

Return a gammapy.spectrum.SpectrumDataset from on_region.

write(self, filename[, overwrite])

Write map dataset to file.

Attributes Documentation

alpha

Exposure ratio between signal and background regions

background

Predicted background in the on region.

Note that this definition is only valid under the assumption of the Cash statistic.

data_shape

Shape of the counts or background data (tuple)

excess

Excess (counts - alpha * counts_off)

likelihood_type = 'wstat'

mask

Combined fit and safe mask

models

Models (SkyModels).

parameters

List of parameters (Parameters)

tag = 'MapDatasetOnOff'

Methods Documentation

copy(self)

A deep copy.

classmethod create(geom, energy_axis_true=None, migra_axis=None, rad_axis=None, binsz_irf=None, reference_time='2000-01-01', name='', **kwargs)

Create a MapDataset object with zero filled maps.

Parameters
geom : WcsGeom

Reference target geometry in reco energy, used for the counts and background maps.

energy_axis_true : MapAxis

True energy axis used for the IRF maps.

migra_axis : MapAxis

Migration axis for the energy dispersion map.

rad_axis : MapAxis

Rad axis for the PSF map.

binsz_irf : float

IRF map pixel size in degrees.

reference_time : Time

The reference time to use in the GTI definition.

name : str

Name of the dataset.

Returns
empty_maps : MapDataset

A MapDataset containing zero-filled maps.
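
Examples

A sketch; the geometry values and the dataset name are illustrative:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    # Reconstructed-energy axis used for the counts and acceptance maps.
    energy_axis = MapAxis.from_bounds(0.1, 10, nbin=10, unit="TeV", name="energy", interp="log")

    # 4 deg x 4 deg geometry centred on the Crab nebula.
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=4.0, binsz=0.02, axes=[energy_axis])

    # Zero-filled dataset; IRF axes (energy_axis_true, migra_axis, rad_axis)
    # can be passed explicitly, otherwise defaults are used.
    empty = MapDatasetOnOff.create(geom, name="empty-onoff")
    print(empty.data_shape)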

cutout(self, position, width, mode='trim')

Cutout map dataset.

Parameters
position : SkyCoord

Center position of the cutout region.

width : tuple of Angle

Angular sizes of the region in (lon, lat) in that specific order. If only one value is passed, a square region is extracted.

mode : {‘trim’, ‘partial’, ‘strict’}

Mode option for Cutout2D, for details see Cutout2D.

Returns
cutout : MapDataset

Cutout map dataset.
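
Examples

A sketch using a zero-filled dataset as created above; position and width are illustrative:

    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=4.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom)

    # Extract a 1 deg x 1 deg sub-dataset around a position of interest.
    position = SkyCoord(83.63, 22.01, unit="deg")
    cutout = dataset.cutout(position=position, width=1 * u.deg, mode="trim")
    print(cutout.counts.data.shape)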

fake(self, background_model, random_state='random-seed')[source]

Simulate fake counts (on and off) for the current model and reduced IRFs.

This method overwrites the counts defined on the dataset object.

Parameters
random_state : {int, ‘random-seed’, ‘global-rng’, RandomState}

Defines random number generator initialisation. Passed to get_random_state.
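
Examples

A sketch that simulates an empty field (no source model). Here background_model is assumed to be a map of expected background counts in the ON region, and the acceptance values are toy numbers:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom)

    # Toy ON/OFF acceptances, giving alpha = 0.1 everywhere.
    dataset.acceptance.data += 1.0
    dataset.acceptance_off.data += 10.0

    # Assumed expected background counts in the ON region (flat, 2 per bin).
    background = dataset.counts.copy()
    background.data += 2.0

    # Replace counts and counts_off with Poisson realisations.
    dataset.fake(background_model=background, random_state=42)
    print(dataset.counts.data.sum(), dataset.counts_off.data.sum())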

classmethod from_dict(data, components, models)

Create from dicts and models list generated from YAML serialization.

classmethod from_geoms(geom, geom_exposure, geom_psf, geom_edisp, reference_time='2000-01-01', name='', **kwargs)[source]

Create a MapDatasetOnOff object with zero filled maps according to the specified geometries

Parameters
geom : gammapy.maps.WcsGeom

Geometry for the counts, counts_off, acceptance and acceptance_off maps.

geom_exposure : gammapy.maps.WcsGeom

Geometry for the exposure map.

geom_psf : gammapy.maps.WcsGeom

Geometry for the PSF map.

geom_edisp : gammapy.maps.WcsGeom

Geometry for the energy dispersion map.

reference_time : Time

The reference time to use in the GTI definition.

name : str

Name of the dataset.

Returns
empty_maps : MapDatasetOnOff

A MapDatasetOnOff containing zero-filled maps.

classmethod from_hdulist(hdulist, name='')[source]

Create map dataset from list of HDUs.

Parameters
hdulist : HDUList

List of HDUs.

Returns
dataset : MapDataset

Map dataset.

npred(self)

Predicted source and background counts (Map).

plot_residuals(self, method='diff', smooth_kernel='gauss', smooth_radius='0.1 deg', region=None, figsize=(12, 4), **kwargs)

Plot spatial and spectral residuals.

The spectral residuals are extracted from the provided region, and the normalization used for the residuals computation can be controlled using the method parameter. If no region is passed, only the spatial residuals are shown.

Parameters
method : {“diff”, “diff/model”, “diff/sqrt(model)”}

Method used to compute the residuals, see MapDataset.residuals()

smooth_kernel : {‘gauss’, ‘box’}

Kernel shape.

smooth_radius : Quantity, str or float

Smoothing width given as quantity or float. If a float is given it is interpreted as smoothing width in pixels.

region : Region

Region (pixel or sky regions accepted)

figsize : tuple

Figure size used for the plotting.

**kwargs : dict

Keyword arguments passed to imshow.

Returns
ax_image, ax_spec : Axes

Image and spectrum axes.
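
Examples

A sketch using a toy dataset with Poisson noise and no source model, so the residual image shown is essentially the smoothed counts; all values are illustrative:

    import numpy as np
    import matplotlib.pyplot as plt
    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=3.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom)
    dataset.counts.data += np.random.poisson(1.0, dataset.counts.data.shape).astype("float32")
    dataset.counts_off.data += 10.0
    dataset.acceptance.data += 1.0
    dataset.acceptance_off.data += 10.0

    # Spatial residuals only (no region passed), smoothed with a 0.1 deg kernel.
    axes = dataset.plot_residuals(method="diff", smooth_radius="0.1 deg")
    plt.show()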

classmethod read(filename, name='')

Read map dataset from file.

Parameters
filename : str

Filename to read from.

Returns
dataset : MapDataset

Map dataset.

residuals(self, method='diff')

Compute residuals map.

Parameters
method : {“diff”, “diff/model”, “diff/sqrt(model)”}

Method used to compute the residuals. Available options are:

• “diff” (default): data - model

• “diff/model”: (data - model) / model

• “diff/sqrt(model)”: (data - model) / sqrt(model)

Returns
residuals : gammapy.maps.WcsNDMap

Residual map.
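
Examples

A sketch on a toy dataset (illustrative values); with no source model the prediction is zero and the “diff” residuals reduce to the counts:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom)
    dataset.counts.data += 2.0

    # data - model per bin, returned as a map.
    residuals = dataset.residuals(method="diff")
    print(residuals.data.mean())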

stack(self, other)[source]

Stack another dataset in place.

The acceptance of the stacked dataset is normalized to 1, and the stacked acceptance_off is scaled so that:

\[\alpha_\text{stacked} = \frac{1}{a_\text{off}} = \frac{\alpha_1\text{OFF}_1 + \alpha_2\text{OFF}_2}{\text{OFF}_1 + \text{OFF}_2}\]

Parameters
other : MapDatasetOnOff

Other dataset
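
Examples

A sketch stacking two toy per-observation datasets defined on the same geometry; all values are illustrative, and a realistic case would also carry counts, exposure and GTI information:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])

    dataset_1 = MapDatasetOnOff.create(geom, name="obs-1")
    dataset_1.counts_off.data += 10.0
    dataset_1.acceptance.data += 1.0
    dataset_1.acceptance_off.data += 10.0

    dataset_2 = MapDatasetOnOff.create(geom, name="obs-2")
    dataset_2.counts_off.data += 20.0
    dataset_2.acceptance.data += 1.0
    dataset_2.acceptance_off.data += 5.0

    dataset_1.stack(dataset_2)          # modifies dataset_1 in place
    print(dataset_1.alpha.data.mean())  # stacked alpha as in the formula above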

stat_array(self)[source]

Likelihood per bin given the current model parameters

stat_sum(self)[source]

Total likelihood given the current model parameters.

to_dict(self, filename='')

Convert to dict for YAML serialization.

to_hdulist(self)[source]

Convert map dataset to list of HDUs.

Returns
hdulist : HDUList

Map dataset list of HDUs.

to_image(self, spectrum=None)

Create images by summing over the energy axis.

Exposure is weighted with an assumed spectrum, resulting in a weighted mean exposure image.

Currently the PSFMap and EdispMap are dropped from the resulting image dataset.

Parameters
spectrum : SpectralModel

Spectral model to compute the weights. Default is power-law with spectral index of 2.

Returns
dataset : MapDataset

Map dataset containing images.
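
Examples

A sketch on a zero-filled dataset; counts are summed over energy and, by default, exposure is weighted with a power law of index 2:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=5, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom)

    image_dataset = dataset.to_image()
    print(image_dataset.counts.geom)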

to_spectrum_dataset(self, on_region, containment_correction=False)

Return a gammapy.spectrum.SpectrumDataset from on_region.

Counts and background are summed in the on_region.

The effective area is taken as the average exposure divided by the livetime, where the livetime is assumed to be the sum of the GTIs.

The EnergyDispersion is obtained at the on_region center. Only regions with a defined center are supported.

Parameters
on_region : SkyRegion

The input ON region on which to extract the spectrum.

containment_correction : bool

Apply containment correction for point sources and circular ON regions.

Returns
dataset : SpectrumDataset

The resulting reduced dataset.
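
Examples

A sketch. The dataset here is assumed to be a fully reduced MapDatasetOnOff (counts, OFF counts, acceptances, exposure and GTI set), which this method requires; the ON region is an illustrative circle at the Crab position:

    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from regions import CircleSkyRegion

    # `dataset` is assumed to come from a data-reduction chain.
    on_region = CircleSkyRegion(
        center=SkyCoord(83.63, 22.01, unit="deg"), radius=0.2 * u.deg
    )
    spectrum_dataset = dataset.to_spectrum_dataset(on_region, containment_correction=False)
    print(spectrum_dataset)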

write(self, filename, overwrite=False)

Write map dataset to file.

Parameters
filename : str

Filename to write to.

overwrite : bool

Overwrite file if it exists.
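
Examples

A sketch of a FITS round trip on a zero-filled dataset; the filename is arbitrary:

    from gammapy.maps import WcsGeom, MapAxis
    from gammapy.cube import MapDatasetOnOff

    energy_axis = MapAxis.from_bounds(1, 10, nbin=3, unit="TeV", name="energy", interp="log")
    geom = WcsGeom.create(skydir=(83.63, 22.01), width=2.0, binsz=0.05, axes=[energy_axis])
    dataset = MapDatasetOnOff.create(geom, name="my-dataset")

    dataset.write("map_dataset_onoff.fits.gz", overwrite=True)
    restored = MapDatasetOnOff.read("map_dataset_onoff.fits.gz", name="my-dataset")
    print(restored.counts.data.shape)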