
FluxPointFit

class gammapy.spectrum.FluxPointFit(model, data, stat='chi2')[source]

Bases: gammapy.utils.fitting.Fit

Fit a set of flux points with a parametric model.

Parameters:

model : SpectralModel

Spectral model

data : FluxPoints

Flux points.

Examples

Load flux points from file and fit with a power-law model:

from astropy import units as u
from gammapy.spectrum import FluxPoints, FluxPointFit
from gammapy.spectrum.models import PowerLaw

# Read the flux points from a FITS file
filename = '$GAMMAPY_EXTRA/test_datasets/spectrum/flux_points/diff_flux_points.fits'
flux_points = FluxPoints.read(filename)

# Define the spectral model to fit
model = PowerLaw()

# Set up the fit and run all fitting steps
fit = FluxPointFit(model, flux_points)
result = fit.run()
print(result)
print(result.model)

Attributes Summary

stat

Methods Summary

confidence(parameter[, backend, sigma]) Estimate confidence interval.
covariance([backend]) Estimate the covariance matrix.
likelihood_contour() Compute likelihood contour.
likelihood_profile(parameter[, values, …]) Compute likelihood profile.
minos_contour(x, y[, numpoints, sigma]) Compute MINOS contour.
minos_profile() Compute MINOS profile.
optimize([backend]) Run the optimization.
run([optimize_opts, covariance_opts]) Run all fitting steps.
total_stat(parameters) Total likelihood given the current model parameters.

Attributes Documentation

stat

Methods Documentation

confidence(parameter, backend='minuit', sigma=1, **kwargs)

Estimate confidence interval.

Extra kwargs are passed to the backend. E.g. iminuit.Minuit.minos supports a maxcall option.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

parameter : Parameter

Parameter of interest

sigma : float

Number of standard deviations for the confidence level

Returns:

result : dict

Dictionary with keys “errp”, “errn”, “success” and “nfev”.
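
For example, a confidence interval on the power-law index from the fit above might be estimated as follows (a minimal sketch; the parameter lookup model.parameters["index"] and the parameter name "index" are assumptions for illustration):

# Assumes the fit from the example above has already been run
index = model.parameters["index"]
conf = fit.confidence(index, sigma=2)
print(conf["errp"], conf["errn"])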

covariance(backend='minuit')

Estimate the covariance matrix.

Assumes that the model parameters are already optimised.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

Returns:

result : CovarianceResult

Results
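
A minimal sketch of estimating the covariance matrix after an explicit optimization step (assuming the fit object from the example above):

# Optimize first, since covariance() assumes the parameters are already optimised
fit.optimize()
covariance_result = fit.covariance()
print(covariance_result)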

likelihood_contour()

Compute likelihood contour.

The method used is to vary two parameters, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.minos_contour

Parameters:

TODO

Returns:

TODO

likelihood_profile(parameter, values=None, bounds=2, nvalues=11)

Compute likelihood profile.

The method used is to vary one parameter, keeping all others fixed. So this is taking a “slice” or “scan” of the likelihood.

See also: Fit.minos_profile.

Parameters:

parameter : Parameter

Parameter of interest

values : Quantity (optional)

Parameter values to evaluate the likelihood for.

bounds : int or tuple of float

When an int is passed, the bounds are computed as bounds * sigma from the best-fit value of the parameter, where sigma corresponds to the one-sigma error on the parameter. If a tuple of floats is given, those are taken as the min and max values, and nvalues points are linearly spaced between them.

nvalues : int

Number of parameter grid points to use.

Returns:

results : dict

Dictionary with keys “values” and “likelihood”.
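
Continuing the example above, a likelihood profile over the power-law index could be computed like this (a sketch; the parameter name "index" and the lookup by name are assumptions):

# Scan the likelihood over index, within +/- 3 sigma of the best-fit value
profile = fit.likelihood_profile(model.parameters["index"], bounds=3, nvalues=21)
print(profile["values"])
print(profile["likelihood"])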

minos_contour(x, y, numpoints=10, sigma=1.0)

Compute MINOS contour.

Calls iminuit.Minuit.mncontour.

This is a contouring algorithm for a 2D function which is not simply the likelihood function. That 2D function is given at each point (par_1, par_2) by re-optimising all other free parameters, and taking the likelihood at that point.

Very compute-intensive and slow.

Parameters:

x, y : Parameter

Parameters of interest

numpoints : int

Number of contour points

sigma : float

Number of standard deviations for the confidence level

Returns:

result : dict

Dictionary with keys “x”, “y” (Numpy arrays with contour points) and a boolean flag “success”. The result objects from mncontour are in the additional keys “x_info” and “y_info”.
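
As a sketch (assuming the fit from the example above and the PowerLaw parameter names "index" and "amplitude"), a 1-sigma MINOS contour could be computed like this:

# Re-optimises the remaining free parameters at each contour point, so this is slow
contour = fit.minos_contour(model.parameters["index"], model.parameters["amplitude"], numpoints=20)
print(contour["success"])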

minos_profile()

Compute MINOS profile.

The method used is to vary one parameter, then re-optimise all other free parameters and to take the likelihood at that point.

See also: Fit.likelihood_profile

Calls iminuit.Minuit.mnprofile.

optimize(backend='minuit', **kwargs)

Run the optimization.

Parameters:

backend : str

Which backend to use (see gammapy.utils.fitting.registry)

**kwargs : dict

Keyword arguments passed to the optimizer. For the "minuit" backend see https://iminuit.readthedocs.io/en/latest/api.html#iminuit.Minuit for a detailed description of the available options. For the "sherpa" backend you can choose from the options method = {"simplex", "levmar", "moncar", "gridsearch"}. Those methods, and their available options, are described and compared in detail at http://cxc.cfa.harvard.edu/sherpa/methods/index.html.

Returns:

fit_result : FitResult

Results
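
For example, to run only the optimization step with the "sherpa" backend and its simplex method (a sketch that assumes Sherpa is installed and reuses the fit from the example above):

# Run the optimizer without the covariance estimation step
result = fit.optimize(backend="sherpa", method="simplex")
print(result)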

run(optimize_opts=None, covariance_opts=None)

Run all fitting steps.

Parameters:

optimize_opts : dict

Options passed to Fit.optimize.

covariance_opts : dict

Options passed to Fit.covariance.

Returns:

fit_result : FitResult

Results
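
A minimal sketch of forwarding options to the optimization step (it assumes that optimize_opts is expanded into keyword arguments of Fit.optimize, as described above):

# Select the backend for the optimization step explicitly
result = fit.run(optimize_opts={"backend": "minuit"})
print(result)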

total_stat(parameters)[source]

Total likelihood given the current model parameters.
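
A sketch of evaluating the statistic directly for the current parameter values (assuming the model and fit from the example above, and that model.parameters is the Parameters object expected here):

# Fit statistic of the flux points for the current PowerLaw parameters
stat = fit.total_stat(model.parameters)
print(stat)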