MapFit

class gammapy.cube.MapFit(model, counts, exposure, background=None, mask=None, psf=None, edisp=None)[source]

Bases: gammapy.utils.fitting.Fit

Perform sky model likelihood fit on maps.

This is a first implementation of such a class, modelled after the SpectrumFit class which performs the 1D spectrum fit.

Parameters:

model : SkyModel

Fit model

counts : WcsNDMap

Counts cube

exposure : WcsNDMap

Exposure cube

background : WcsNDMap

Background cube

mask : WcsNDMap

Mask to apply to the fit. Pixels containing 1 or True are included in the fit; all others are ignored.

psf : PSFKernel

PSF kernel

edisp : EnergyDispersion

Energy dispersion
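
For illustration, a minimal construction sketch. It assumes the counts, exposure and background maps, the PSF kernel and the energy dispersion have been prepared earlier in the analysis; the SkyModel / SkyPointSource / PowerLaw imports and parameter values are assumptions for the example, not part of this page.

    from gammapy.cube import MapFit
    from gammapy.cube.models import SkyModel
    from gammapy.image.models import SkyPointSource
    from gammapy.spectrum.models import PowerLaw

    # Source model: point source with a power-law spectrum (example values)
    spatial_model = SkyPointSource(lon_0="0.01 deg", lat_0="0.01 deg")
    spectral_model = PowerLaw(index=2.2, amplitude="3e-12 cm-2 s-1 TeV-1", reference="1 TeV")
    model = SkyModel(spatial_model=spatial_model, spectral_model=spectral_model)

    # counts, exposure, background (WcsNDMap), psf (PSFKernel) and
    # edisp (EnergyDispersion) are assumed to exist already
    fit = MapFit(
        model=model,
        counts=counts,
        exposure=exposure,
        background=background,
        psf=psf,
        edisp=edisp,
    )
    result = fit.run()
    print(result)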

Attributes Summary

stat Likelihood per bin given the current model parameters

Methods Summary

confidence(parameter[, backend, sigma]) Estimate confidence interval.
covariance([backend]) Estimate the covariance matrix.
likelihood_contour() Compute likelihood contour.
likelihood_profile(parameter[, values, …]) Compute likelihood profile.
minos_contour(x, y[, numpoints, sigma]) Compute MINOS contour.
minos_profile() Compute MINOS profile.
optimize([backend]) Run the optimization.
run([optimize_opts, covariance_opts]) Run all fitting steps.
total_stat(parameters) Total likelihood given the current model parameters

Attributes Documentation

stat

Likelihood per bin given the current model parameters

Methods Documentation

confidence(parameter, backend='minuit', sigma=1, **kwargs)

Estimate confidence interval.

Extra kwargs are passed to the backend. E.g. iminuit.Minuit.minos supports a maxcall option.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

parameter : Parameter

Parameter of interest

sigma : float

Number of standard deviations for the confidence level

Returns:

result : dict

Dictionary with keys "errp", "errn", "success" and "nfev".
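
A short usage sketch, assuming fit has already been optimised and that the spectral index parameter can be looked up as model.parameters["index"] (an assumption about the model object):

    # 1-sigma confidence interval on the spectral index
    par = model.parameters["index"]   # assumed parameter lookup
    result = fit.confidence(parameter=par, sigma=1)
    print(result["errn"], result["errp"], result["success"], result["nfev"])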

covariance(backend='minuit')

Estimate the covariance matrix.

Assumes that the model parameters are already optimised.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

Returns:

result : CovarianceResult

Results
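
A short usage sketch, assuming fit is the MapFit instance from the construction example above:

    fit.optimize(backend="minuit")                 # parameters must be optimised first
    cov_result = fit.covariance(backend="minuit")  # returns a CovarianceResult
    print(cov_result)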

likelihood_contour()

Compute likelihood contour.

The method used is to vary two parameters, keeping all others fixed, i.e. to take a “slice” or “scan” of the likelihood.

See also: Fit.minos_contour

Parameters:

TODO

Returns:

TODO

likelihood_profile(parameter, values=None, bounds=2, nvalues=11)

Compute likelihood profile.

The method used is to vary one parameter, keeping all others fixed, i.e. to take a “slice” or “scan” of the likelihood.

See also: Fit.minos_profile.

Parameters:

parameter : Parameter

Parameter of interest

values : Quantity (optional)

Parameter values to evaluate the likelihood for.

bounds : int or tuple of float

If an int is passed, the bounds are computed as bounds * sigma around the best-fit value of the parameter, where sigma is the one-sigma error on the parameter. If a tuple of floats is given, those are taken as the minimum and maximum values, and nvalues points are linearly spaced between them.

nvalues : int

Number of parameter grid points to use.

Returns:

results : dict

Dictionary with keys “values” and “likelihood”.
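
For illustration, a sketch of a one-parameter likelihood scan; the amplitude parameter lookup and the matplotlib plotting are assumptions for the example:

    import matplotlib.pyplot as plt

    profile = fit.likelihood_profile(
        parameter=model.parameters["amplitude"],  # assumed parameter lookup
        bounds=3,                                 # scan +/- 3 sigma around the best-fit value
        nvalues=21,
    )
    plt.plot(profile["values"], profile["likelihood"])
    plt.show()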

minos_contour(x, y, numpoints=10, sigma=1.0)

Compute MINOS contour.

Calls iminuit.Minuit.mncontour.

This is a contouring algorithm for a 2D function which is not simply the likelihood function. That 2D function is given at each point (par_1, par_2) by re-optimising all other free parameters, and taking the likelihood at that point.

Very compute-intensive and slow.

Parameters:

x, y : Parameter

Parameters of interest

numpoints : int

Number of contour points

sigma : float

Number of standard deviations for the confidence level

Returns:

result : dict

Dictionary with keys “x”, “y” (Numpy arrays with contour points) and a boolean flag “success”. The result objects from mncontour are in the additional keys “x_info” and “y_info”.
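
A sketch of how a contour might be computed and drawn (the parameter lookups and plotting are assumptions for the example; every contour point re-optimises all other free parameters, so this is slow):

    contour = fit.minos_contour(
        x=model.parameters["index"],       # assumed parameter lookups
        y=model.parameters["amplitude"],
        numpoints=10,
        sigma=1.0,
    )
    if contour["success"]:
        import matplotlib.pyplot as plt
        plt.plot(contour["x"], contour["y"])
        plt.show()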

minos_profile()

Compute MINOS profile.

The method used is to vary one parameter, then re-optimise all other free parameters and take the likelihood at that point.

See also: Fit.likelihood_profile

Calls iminuit.Minuit.mnprofile

optimize(backend='minuit', **kwargs)

Run the optimization.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

**kwargs : dict

Keyword arguments passed to the optimizer. For the "minuit" backend, see https://iminuit.readthedocs.io/en/latest/api.html#iminuit.Minuit for a detailed description of the available options. For the "sherpa" backend, you can choose the optimization method via method = {"simplex", "levmar", "moncar", "gridsearch"}. These methods and their available options are described and compared in detail at http://cxc.cfa.harvard.edu/sherpa/methods/index.html.

Returns:

fit_result : FitResult

Results
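
A usage sketch; the print_level keyword is shown only as an example of an iminuit option forwarded through **kwargs and is an assumption here:

    # Run the optimisation step only (no covariance estimation)
    opt_result = fit.optimize(backend="minuit", print_level=1)
    print(opt_result)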

run(optimize_opts=None, covariance_opts=None)

Run all fitting steps.

Parameters:

optimize_opts : dict

Options passed to Fit.optimize.

covariance_opts : dict

Options passed to Fit.covariance.

Returns:

fit_result : FitResult

Results
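
A sketch of a full fit, assuming the option dicts are forwarded as keyword arguments to Fit.optimize and Fit.covariance:

    result = fit.run(
        optimize_opts={"backend": "minuit"},
        covariance_opts={"backend": "minuit"},
    )
    print(result)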

total_stat(parameters)[source]

Total likelihood given the current model parameters
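
A minimal sketch, assuming fit and model from the construction example above and that model.parameters is the parameter collection expected here:

    # Evaluate the total likelihood for the current parameter values
    total = fit.total_stat(model.parameters)
    print(total)
    # Per-bin values are available through the `stat` attribute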