Fit

class gammapy.modeling.Fit(datasets)[source]

Bases: object

Fit class.

The Fit class provides a uniform interface to multiple fitting backends. Currently available: “minuit”, “sherpa” and “scipy”.

Parameters:
datasets : Dataset, list of Dataset or Datasets

Dataset or joint datasets to be fitted.
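
A minimal usage sketch, assuming dataset is an already prepared gammapy Dataset with a model assigned (the dataset preparation is not shown here):

    from gammapy.modeling import Fit

    # dataset is assumed to be a prepared Dataset with a model attached
    fit = Fit([dataset])   # a single Dataset, a list of Dataset, or a Datasets object
    result = fit.run()     # optimization followed by covariance estimation
    print(result)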

Methods Summary

confidence(self, parameter[, backend, …]) Estimate confidence interval.
covariance(self[, backend]) Estimate the covariance matrix.
likelihood_contour(self) Compute likelihood contour.
likelihood_profile(self, parameter[, …]) Compute likelihood profile.
minos_contour(self, x, y[, numpoints, sigma]) Compute MINOS contour.
optimize(self[, backend]) Run the optimization.
run(self[, optimize_opts, covariance_opts]) Run all fitting steps.

Methods Documentation

confidence(self, parameter, backend='minuit', sigma=1, reoptimize=True, **kwargs)[source]

Estimate confidence interval.

Extra kwargs are passed to the backend. E.g. iminuit.Minuit.minos supports a maxcall option.

For the scipy backend kwargs are forwarded to brentq. If the confidence estimation fails, the bracketing interval can be adapted by modifying the upper bound of the interval (the b value).

Parameters:
backend : str

Which backend to use (see gammapy.modeling.registry)

parameter : Parameter

Parameter of interest

sigma : float

Number of standard deviations for the confidence level

reoptimize : bool

Re-optimize other parameters, when computing the confidence region.

**kwargs : dict

Keyword arguments passed to the confidence estimation method.

Returns:
result : dict

Dictionary with keys “errp”, “errn”, “success” and “nfev”.
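
A sketch of a confidence estimate, assuming fit is a Fit instance that has already been run and par is a Parameter of the fitted model (for example a spectral index); both are assumed to exist:

    # par is assumed to be a Parameter object of the fitted model
    result = fit.confidence(parameter=par, sigma=2, reoptimize=True)
    print(result["errn"], result["errp"], result["success"], result["nfev"])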

covariance(self, backend='minuit')[source]

Estimate the covariance matrix.

Assumes that the model parameters are already optimised.

Parameters:
backend : str

Which backend to use (see gammapy.modeling.registry)

Returns:
result : CovarianceResult

Results
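
A sketch, assuming fit is a Fit instance whose parameters were already optimized via Fit.optimize or Fit.run:

    covariance_result = fit.covariance(backend="minuit")
    print(covariance_result)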

likelihood_contour(self)[source]

Compute likelihood contour.

The method used is to vary two parameters, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.minos_contour

Parameters:
TODO
Returns:
TODO
likelihood_profile(self, parameter, values=None, bounds=2, nvalues=11, reoptimize=False, optimize_opts=None)[source]

Compute likelihood profile.

The method used is to vary one parameter, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.confidence.

Parameters:
parameter : Parameter

Parameter of interest

values : Quantity (optional)

Parameter values to evaluate the likelihood for.

bounds : int or tuple of float

When an int is passed, the bounds are computed as bounds * sigma from the best-fit value of the parameter, where sigma corresponds to the one-sigma error on the parameter. If a tuple of floats is given, those are taken as the min and max values, and nvalues points are linearly spaced between them.

nvalues : int

Number of parameter grid points to use.

reoptimize : bool

Re-optimize other parameters, when computing the likelihood profile.

Returns:
results : dict

Dictionary with keys “values” and “likelihood”.
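
A sketch of a profile scan, assuming fit has already been run and par is a Parameter of the fitted model:

    import matplotlib.pyplot as plt

    # scan 25 points within +/- 3 sigma around the best-fit value of par
    profile = fit.likelihood_profile(parameter=par, bounds=3, nvalues=25)

    # "values" holds the scanned parameter values,
    # "likelihood" the corresponding fit statistic values
    plt.plot(profile["values"], profile["likelihood"])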

minos_contour(self, x, y, numpoints=10, sigma=1.0)[source]

Compute MINOS contour.

Calls iminuit.Minuit.mncontour.

This is a contouring algorithm for a 2D function which is not simply the likelihood function. That 2D function is given at each point (par_1, par_2) by re-optimising all other free parameters, and taking the likelihood at that point.

Very compute-intensive and slow.

Parameters:
x, y : Parameter

Parameters of interest

numpoints : int

Number of contour points

sigma : float

Number of standard deviations for the confidence level

Returns:
result : dict

Dictionary with keys “x”, “y” (Numpy arrays with contour points) and a boolean flag “success”. The result objects from mncontour are in the additional keys “x_info” and “y_info”.
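
A sketch, assuming par_x and par_y are two free Parameters of the fitted model and fit has already been run:

    import matplotlib.pyplot as plt

    contour = fit.minos_contour(x=par_x, y=par_y, numpoints=20, sigma=2.0)
    if contour["success"]:
        # "x" and "y" are numpy arrays tracing the contour points
        plt.plot(contour["x"], contour["y"])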

optimize(self, backend='minuit', **kwargs)[source]

Run the optimization.

Parameters:
backend : str

Which backend to use (see gammapy.modeling.registry)

**kwargs : dict

Keyword arguments passed to the optimizer. For the "minuit" backend see https://iminuit.readthedocs.io/en/latest/api.html#iminuit.Minuit for a detailed description of the available options. If there is an entry ‘migrad_opts’, those options will be passed to iminuit.Minuit.migrad().

For the "sherpa" backend you can from the options method = {"simplex",  "levmar", "moncar", "gridsearch"} Those methods are described and compared in detail on http://cxc.cfa.harvard.edu/sherpa/methods/index.html. The available options of the optimization methods are described on the following pages in detail:

For the "scipy" backend the available options are desribed in detail here: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html

Returns:
fit_result : FitResult

Results
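
A sketch for the "minuit" backend; the option values are illustrative only:

    # keyword arguments are forwarded to iminuit.Minuit,
    # and the migrad_opts entry to iminuit.Minuit.migrad()
    result = fit.optimize(backend="minuit", migrad_opts={"ncall": 10000})
    print(result)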

run(self, optimize_opts=None, covariance_opts=None)[source]

Run all fitting steps.

Parameters:
optimize_opts : dict

Options passed to Fit.optimize.

covariance_opts : dict

Options passed to Fit.covariance.

Returns:
fit_result : FitResult

Results
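
A sketch forwarding options to the individual steps; the specific option values are illustrative only:

    result = fit.run(
        optimize_opts={"backend": "minuit", "migrad_opts": {"ncall": 10000}},
        covariance_opts={"backend": "minuit"},
    )
    print(result)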