Fit

class gammapy.utils.fitting.Fit(datasets)[source]

Bases: object

Fit class.

The Fit class provides a uniform interface to multiple fitting backends. Currently available: "minuit", "sherpa" and "scipy".

Parameters:
datasets : Dataset, list of Dataset or Datasets

Dataset or joint datasets to be fitted.
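A minimal usage sketch (names like dataset are illustrative assumptions, not part of this class):

    from gammapy.utils.fitting import Fit

    # `dataset` is assumed to be a prepared Dataset with a model attached
    fit = Fit(dataset)
    result = fit.run()   # run all fitting steps: optimization, then covariance estimation
    print(result)        # summary of the fit result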

Methods Summary

confidence(parameter[, backend, sigma]) Estimate confidence interval.
covariance([backend]) Estimate the covariance matrix.
likelihood_contour() Compute likelihood contour.
likelihood_profile(parameter[, values, …]) Compute likelihood profile.
minos_contour(x, y[, numpoints, sigma]) Compute MINOS contour.
optimize([backend]) Run the optimization.
run([optimize_opts, covariance_opts]) Run all fitting steps.
total_stat(parameters) Total likelihood given the current model parameters.

Methods Documentation

confidence(parameter, backend='minuit', sigma=1, **kwargs)[source]

Estimate confidence interval.

Extra kwargs are passed to the backend. E.g. iminuit.Minuit.minos supports a maxcall option.

Parameters:
backend : str

Which backend to use (see gammapy.utils.fitting.registry)

parameter : Parameter

Parameter of interest

sigma : float

Number of standard deviations for the confidence level

Returns:
result : dict

Dictionary with keys "errp", "errn", "success" and "nfev".
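An illustrative call, assuming the fit has already converged and model is the model attached to the dataset (names are examples only):

    par = model.parameters["index"]            # Parameter of interest
    result = fit.confidence(parameter=par, sigma=1)
    print(result["errn"], result["errp"])      # negative / positive error estimates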

covariance(backend='minuit')[source]

Estimate the covariance matrix.

Assumes that the model parameters are already optimised.

Parameters:
backend : str

Which backend to use (see gammapy.utils.fitting.registry)

Returns:
result : CovarianceResult

Results
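A short sketch; the optimization step must have been run first:

    fit.optimize()                  # bring the parameters to their best-fit values
    cov_result = fit.covariance()   # default backend is "minuit"
    print(cov_result)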

likelihood_contour()[source]

Compute likelihood contour.

The method used is to vary two parameters, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.minos_contour

Parameters:
TODO
Returns:
TODO
likelihood_profile(parameter, values=None, bounds=2, nvalues=11, reoptimize=False, optimize_opts=None)[source]

Compute likelihood profile.

The method used is to vary one parameter, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.minos_profile.

Parameters:
parameter : Parameter

Parameter of interest

values : Quantity (optional)

Parameter values to evaluate the likelihood for.

bounds : int or tuple of float

When an int is passed, the profile is computed within bounds * sigma of the best-fit value of the parameter, where sigma is the one-sigma error on the parameter. If a tuple of floats is given, those are taken as the min and max values, and nvalues points are linearly spaced between them.

nvalues : int

Number of parameter grid points to use.

reoptimize : bool

Re-optimize the other parameters while computing the likelihood profile.

Returns:
results : dict

Dictionary with keys “values” and “likelihood”.
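An illustrative use, assuming a fitted fit object and a Parameter par of interest; the plotting step is optional:

    import matplotlib.pyplot as plt

    profile = fit.likelihood_profile(parameter=par, bounds=3, nvalues=21)
    # "values" is the parameter grid, "likelihood" the fit statistic at each grid point
    plt.plot(profile["values"], profile["likelihood"])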

minos_contour(x, y, numpoints=10, sigma=1.0)[source]

Compute MINOS contour.

Calls iminuit.Minuit.mncontour.

This is a contouring algorithm for a 2D function which is not simply the likelihood function. That 2D function is given at each point (par_1, par_2) by re-optimising all other free parameters, and taking the likelihood at that point.

Very compute-intensive and slow.

Parameters:
x, y : Parameter

Parameters of interest

numpoints : int

Number of contour points

sigma : float

Number of standard deviations for the confidence level

Returns:
result : dict

Dictionary with keys “x”, “y” (Numpy arrays with contour points) and a boolean flag “success”. The result objects from mncontour are in the additional keys “x_info” and “y_info”.
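A hedged sketch; par_x and par_y stand for two free Parameter objects of the fitted model:

    import matplotlib.pyplot as plt

    contour = fit.minos_contour(par_x, par_y, numpoints=20, sigma=2.0)
    if contour["success"]:
        plt.plot(contour["x"], contour["y"])   # contour points at the 2-sigma level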

optimize(backend='minuit', **kwargs)[source]

Run the optimization.

Parameters:
backend : str

Which backend to use (see gammapy.utils.fitting.registry)

**kwargs : dict

Keyword arguments passed to the optimizer. For the "minuit" backend see https://iminuit.readthedocs.io/en/latest/api.html#iminuit.Minuit for a detailed description of the available options. If there is an entry ‘migrad_opts’, those options will be passed to iminuit.Minuit.migrad().

For the "sherpa" backend you can from the options method = {"simplex",  "levmar", "moncar", "gridsearch"} Those methods are described and compared in detail on http://cxc.cfa.harvard.edu/sherpa/methods/index.html. The available options of the optimization methods are described on the following pages in detail:

For the "scipy" backend the available options are desribed in detail here: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html

Returns:
fit_result : FitResult

Results
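Illustrative calls showing how backend-specific options might be passed; the option values are examples only:

    # "minuit": entries in 'migrad_opts' are forwarded to iminuit.Minuit.migrad()
    result = fit.optimize(backend="minuit", migrad_opts={"ncall": 10000})

    # "sherpa": pick one of the documented methods
    result = fit.optimize(backend="sherpa", method="simplex")

    # "scipy": options as accepted by scipy.optimize.minimize
    result = fit.optimize(backend="scipy", method="L-BFGS-B")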

run(optimize_opts=None, covariance_opts=None)[source]

Run all fitting steps.

Parameters:
optimize_opts : dict

Options passed to Fit.optimize.

covariance_opts : dict

Options passed to Fit.covariance.

Returns:
fit_result : FitResult

Results
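A sketch of forwarding options to the individual steps (the dictionaries are illustrative):

    result = fit.run(
        optimize_opts={"backend": "minuit", "migrad_opts": {"ncall": 10000}},
        covariance_opts={"backend": "minuit"},
    )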

total_stat(parameters)[source]

Total likelihood given the current model parameters.
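Illustrative only; model.parameters stands for the Parameters collection of the dataset model:

    stat = fit.total_stat(model.parameters)   # fit statistic at the current parameter values
    print(stat)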