.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "tutorials/starting/analysis_1.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code, or run this example in your browser via Binder. .. rst-class:: sphx-glr-example-title .. _sphx_glr_tutorials_starting_analysis_1.py: High level interface ==================== Introduction to 3D analysis using the Gammapy high level interface. Prerequisites ------------- - Understanding the Gammapy data workflow, in particular what DL3 events and instrument response functions (IRFs) are. Context ------- This notebook is an introduction to Gammapy analysis using the high level interface. Gammapy analysis consists of two main steps. The first one is data reduction: user-selected observations are reduced to a geometry defined by the user. It can be 1D (spectrum from a given extraction region) or 3D (with a sky projection and an energy axis). The resulting reduced data and instrument response functions (IRFs) are called datasets in Gammapy. The second step consists of setting a physical model on the datasets and fitting it to obtain relevant physical information. **Objective: Create a 3D dataset of the Crab using the H.E.S.S. DL3 data release 1 and perform a simple model fitting of the Crab nebula.** Proposed approach ----------------- This notebook uses the high level `~gammapy.analysis.Analysis` class to orchestrate data reduction. In its current state, `~gammapy.analysis.Analysis` supports the standard analysis cases of joint or stacked 3D and 1D analyses. It is instantiated with an `~gammapy.analysis.AnalysisConfig` object that gives access to analysis parameters either directly or via a YAML config file.
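Because the configuration is plain YAML, the same settings can also be written by hand in a text file. As a minimal sketch of what such a config fragment looks like, here it is parsed with the third-party PyYAML package (the ``yaml`` package listed in the setup check below) purely to illustrate the structure; in a real session the YAML would be consumed by `~gammapy.analysis.AnalysisConfig` rather than parsed manually:

```python
import yaml  # PyYAML, used here only to illustrate the config structure

# A minimal configuration fragment mirroring the settings used in this tutorial
CONFIG_TEXT = """
observations:
  datastore: $GAMMAPY_DATA/hess-dl3-dr1
  obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg}
datasets:
  type: 3d
  stack: true
"""

config_dict = yaml.safe_load(CONFIG_TEXT)
# Quantities are plain strings ("5 deg"); AnalysisConfig validates them on load
print(config_dict["datasets"]["type"])  # prints "3d"
```

The nested keys (``observations``, ``datasets``, ...) match the sections of the ``AnalysisConfig`` printout shown later in this tutorial.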
To see what is happening under the hood and to get an idea of the internal API, a second notebook performs the same analysis without using the `~gammapy.analysis.Analysis` class. In summary, we have to: - Create an `~gammapy.analysis.AnalysisConfig` object and edit it to define the analysis configuration: - Define what observations to use - Define the geometry of the dataset (data and IRFs) - Define the model we want to fit on the dataset. - Instantiate a `~gammapy.analysis.Analysis` from this configuration and run the different analysis steps: - Observation selection - Data reduction - Model fitting - Flux point estimation Finally, we will compare the results against a reference model. .. GENERATED FROM PYTHON SOURCE LINES 68-71 Setup ----- .. GENERATED FROM PYTHON SOURCE LINES 71-79 .. code-block:: Python from pathlib import Path from astropy import units as u # %matplotlib inline import matplotlib.pyplot as plt from gammapy.analysis import Analysis, AnalysisConfig .. GENERATED FROM PYTHON SOURCE LINES 80-82 Check setup ----------- .. GENERATED FROM PYTHON SOURCE LINES 82-86 .. code-block:: Python from gammapy.utils.check import check_tutorials_setup check_tutorials_setup() .. rst-class:: sphx-glr-script-out .. code-block:: none System: python_executable : /home/runner/work/gammapy-docs/gammapy-docs/gammapy/.tox/build_docs/bin/python python_version : 3.9.19 machine : x86_64 system : Linux Gammapy package: version : 1.3.dev283+g0ce322691 path : /home/runner/work/gammapy-docs/gammapy-docs/gammapy/.tox/build_docs/lib/python3.9/site-packages/gammapy Other packages: numpy : 1.26.4 scipy : 1.13.0 astropy : 5.2.2 regions : 0.8 click : 8.1.7 yaml : 6.0.1 IPython : 8.18.1 jupyterlab : not installed matplotlib : 3.8.4 pandas : not installed healpy : 1.16.6 iminuit : 2.25.2 sherpa : 4.16.0 naima : 0.10.0 emcee : 3.1.6 corner : 2.2.2 ray : 2.20.0 Gammapy environment variables: GAMMAPY_DATA : /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/dev .. 
GENERATED FROM PYTHON SOURCE LINES 87-99 Analysis configuration ---------------------- For configuration of the analysis we use the `YAML `__ data format. YAML is a machine-readable serialisation format that is also easy for humans to read. In this tutorial we will write the configuration just using Python strings, but of course the file can be created and modified with any text editor of your choice. Here is what the configuration for our analysis looks like: .. GENERATED FROM PYTHON SOURCE LINES 99-105 .. code-block:: Python config = AnalysisConfig() # the AnalysisConfig gives access to the various parameters, from logging to reduced dataset geometries print(config) .. rst-class:: sphx-glr-script-out .. code-block:: none AnalysisConfig general: log: {level: info, filename: null, filemode: null, format: null, datefmt: null} outdir: . n_jobs: 1 datasets_file: null models_file: null observations: datastore: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/dev/hess-dl3-dr1 obs_ids: [] obs_file: null obs_cone: {frame: null, lon: null, lat: null, radius: null} obs_time: {start: null, stop: null} required_irf: [aeff, edisp, psf, bkg] datasets: type: 1d stack: true geom: wcs: skydir: {frame: null, lon: null, lat: null} binsize: 0.02 deg width: {width: 5.0 deg, height: 5.0 deg} binsize_irf: 0.2 deg selection: {offset_max: 2.5 deg} axes: energy: {min: 1.0 TeV, max: 10.0 TeV, nbins: 5} energy_true: {min: 0.5 TeV, max: 20.0 TeV, nbins: 16} map_selection: [counts, exposure, background, psf, edisp] background: method: null exclusion: null parameters: {} safe_mask: methods: [aeff-default] parameters: {} on_region: {frame: null, lon: null, lat: null, radius: null} containment_correction: true fit: fit_range: {min: null, max: null} flux_points: energy: {min: null, max: null, nbins: null} source: source parameters: {selection_optional: all} excess_map: correlation_radius: 0.1 deg parameters: {} energy_edges: {min: null, max: null, nbins: null} light_curve: time_intervals: {start: null, stop: null} energy_edges: {min: null, max: null, nbins: null} source: source parameters: {selection_optional: all} .. GENERATED FROM PYTHON SOURCE LINES 106-109 Setting the data to use ~~~~~~~~~~~~~~~~~~~~~~~ .. GENERATED FROM PYTHON SOURCE LINES 112-120 We want to use Crab runs from the H.E.S.S. DL3-DR1. Here we define the datastore and a cone search of observations pointing within 5 degrees of the Crab nebula. Parameters can be set directly or as a Python dict. PS: do not forget to set up your environment variable `$GAMMAPY_DATA` to point to your local directory containing the H.E.S.S. DL3-DR1 as described in :ref:`quickstart-setup`. .. GENERATED FROM PYTHON SOURCE LINES 120-134 .. code-block:: Python # We define the datastore containing the data config.observations.datastore = "$GAMMAPY_DATA/hess-dl3-dr1" # We define the cone search parameters config.observations.obs_cone.frame = "icrs" config.observations.obs_cone.lon = "83.633 deg" config.observations.obs_cone.lat = "22.014 deg" config.observations.obs_cone.radius = "5 deg" # Equivalently we could have set parameters with a Python dict # config.observations.obs_cone = {"frame": "icrs", "lon": "83.633 deg", "lat": "22.014 deg", "radius": "5 deg"} .. GENERATED FROM PYTHON SOURCE LINES 135-138 Setting the reduced datasets geometry ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. GENERATED FROM PYTHON SOURCE LINES 138-164 .. 
code-block:: Python # We want to perform a 3D analysis config.datasets.type = "3d" # We want to stack the data into a single reduced dataset config.datasets.stack = True # We fix the WCS geometry of the datasets config.datasets.geom.wcs.skydir = { "lon": "83.633 deg", "lat": "22.014 deg", "frame": "icrs", } config.datasets.geom.wcs.width = {"width": "2 deg", "height": "2 deg"} config.datasets.geom.wcs.binsize = "0.02 deg" # We now fix the energy axis for the counts map config.datasets.geom.axes.energy.min = "1 TeV" config.datasets.geom.axes.energy.max = "10 TeV" config.datasets.geom.axes.energy.nbins = 10 # We now fix the energy axis for the IRF maps (exposure, etc.) config.datasets.geom.axes.energy_true.min = "0.5 TeV" config.datasets.geom.axes.energy_true.max = "20 TeV" config.datasets.geom.axes.energy_true.nbins = 20 .. GENERATED FROM PYTHON SOURCE LINES 165-168 Setting the background normalization maker ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. GENERATED FROM PYTHON SOURCE LINES 168-173 .. code-block:: Python config.datasets.background.method = "fov_background" config.datasets.background.parameters = {"method": "scale"} .. GENERATED FROM PYTHON SOURCE LINES 174-177 Setting the exclusion mask ~~~~~~~~~~~~~~~~~~~~~~~~~~ .. GENERATED FROM PYTHON SOURCE LINES 180-185 In order to properly adjust the background normalisation on regions without gamma-ray signal, one needs to define an exclusion mask. For this tutorial, we use the following one: ``$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz`` .. GENERATED FROM PYTHON SOURCE LINES 185-191 .. code-block:: Python config.datasets.background.exclusion = ( "$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz" ) .. GENERATED FROM PYTHON SOURCE LINES 192-203 Setting modeling and fitting parameters ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ ``Analysis`` can perform a few modeling and fitting tasks besides data reduction.
Parameters then have to be passed to the configuration object. Here we define the energy range on which to perform the fit. We also set the energy edges used for flux point computation, as well as the correlation radius used to compute excess and significance maps. .. GENERATED FROM PYTHON SOURCE LINES 203-210 .. code-block:: Python config.fit.fit_range.min = 1 * u.TeV config.fit.fit_range.max = 10 * u.TeV config.flux_points.energy = {"min": "1 TeV", "max": "10 TeV", "nbins": 4} config.excess_map.correlation_radius = 0.1 * u.deg .. GENERATED FROM PYTHON SOURCE LINES 211-214 We’re all set. But before we go on, let’s see how to save or import ``AnalysisConfig`` objects through YAML files. .. GENERATED FROM PYTHON SOURCE LINES 217-222 Using YAML configuration files ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ One can export/import the `~gammapy.analysis.AnalysisConfig` to/from a YAML file. .. GENERATED FROM PYTHON SOURCE LINES 222-229 .. code-block:: Python config.write("config.yaml", overwrite=True) config = AnalysisConfig.read("config.yaml") print(config) .. rst-class:: sphx-glr-script-out .. code-block:: none AnalysisConfig general: log: {level: info, filename: null, filemode: null, format: null, datefmt: null} outdir: . 
n_jobs: 1 datasets_file: null models_file: null observations: datastore: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/dev/hess-dl3-dr1 obs_ids: [] obs_file: null obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5.0 deg} obs_time: {start: null, stop: null} required_irf: [aeff, edisp, psf, bkg] datasets: type: 3d stack: true geom: wcs: skydir: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg} binsize: 0.02 deg width: {width: 2.0 deg, height: 2.0 deg} binsize_irf: 0.2 deg selection: {offset_max: 2.5 deg} axes: energy: {min: 1.0 TeV, max: 10.0 TeV, nbins: 10} energy_true: {min: 0.5 TeV, max: 20.0 TeV, nbins: 20} map_selection: [counts, exposure, background, psf, edisp] background: method: fov_background exclusion: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/dev/joint-crab/exclusion/exclusion_mask_crab.fits.gz parameters: {method: scale} safe_mask: methods: [aeff-default] parameters: {} on_region: {frame: null, lon: null, lat: null, radius: null} containment_correction: true fit: fit_range: {min: 1.0 TeV, max: 10.0 TeV} flux_points: energy: {min: 1.0 TeV, max: 10.0 TeV, nbins: 4} source: source parameters: {selection_optional: all} excess_map: correlation_radius: 0.1 deg parameters: {} energy_edges: {min: null, max: null, nbins: null} light_curve: time_intervals: {start: null, stop: null} energy_edges: {min: null, max: null, nbins: null} source: source parameters: {selection_optional: all} .. GENERATED FROM PYTHON SOURCE LINES 230-236 Running the analysis -------------------- We first create an `~gammapy.analysis.Analysis` object from our configuration. .. GENERATED FROM PYTHON SOURCE LINES 236-240 .. code-block:: Python analysis = Analysis(config) .. rst-class:: sphx-glr-script-out .. code-block:: none Setting logging config: {'level': 'INFO', 'filename': None, 'filemode': None, 'format': None, 'datefmt': None} .. 
GENERATED FROM PYTHON SOURCE LINES 241-247 Observation selection ~~~~~~~~~~~~~~~~~~~~~ We can directly select and load the observations from disk using `~gammapy.analysis.Analysis.get_observations()`: .. GENERATED FROM PYTHON SOURCE LINES 247-251 .. code-block:: Python analysis.get_observations() .. rst-class:: sphx-glr-script-out .. code-block:: none Fetching observations. Observations selected: 4 out of 4. Number of selected observations: 4 .. GENERATED FROM PYTHON SOURCE LINES 252-255 The observations are now available on the `~gammapy.analysis.Analysis` object. The selection corresponds to the following ids: .. GENERATED FROM PYTHON SOURCE LINES 255-259 .. code-block:: Python print(analysis.observations.ids) .. rst-class:: sphx-glr-script-out .. code-block:: none ['23523', '23526', '23559', '23592'] .. GENERATED FROM PYTHON SOURCE LINES 260-264 To see how to explore observations, please refer to the following notebook: :doc:`CTA with Gammapy ` or :doc:`HESS with Gammapy ` .. GENERATED FROM PYTHON SOURCE LINES 267-274 Data reduction -------------- Now we proceed to the data reduction. In the config file we have chosen a WCS map geometry and energy axis, and decided to stack the maps. We can run the reduction using `~gammapy.analysis.Analysis.get_datasets()`: .. GENERATED FROM PYTHON SOURCE LINES 276-279 .. code-block:: Python analysis.get_datasets() .. rst-class:: sphx-glr-script-out .. code-block:: none Creating reference dataset and makers. Creating the background Maker. Start the data reduction loop. Computing dataset for observation 23523 Running MapDatasetMaker Running SafeMaskMaker Running FoVBackgroundMaker Computing dataset for observation 23526 Running MapDatasetMaker Running SafeMaskMaker Running FoVBackgroundMaker Computing dataset for observation 23559 Running MapDatasetMaker Running SafeMaskMaker Running FoVBackgroundMaker Computing dataset for observation 23592 Running MapDatasetMaker Running SafeMaskMaker Running FoVBackgroundMaker .. 
GENERATED FROM PYTHON SOURCE LINES 280-283 As we have chosen to stack the data, we end up with a single dataset, which we can print: .. GENERATED FROM PYTHON SOURCE LINES 283-287 .. code-block:: Python print(analysis.datasets["stacked"]) .. rst-class:: sphx-glr-script-out .. code-block:: none MapDataset ---------- Name : stacked Total counts : 2485 Total background counts : 1997.49 Total excess counts : 487.51 Predicted counts : 1997.49 Predicted background counts : 1997.49 Predicted excess counts : nan Exposure min : 2.73e+08 m2 s Exposure max : 3.52e+09 m2 s Number of total bins : 100000 Number of fit bins : 100000 Fit statistic type : cash Fit statistic value (-2 log(L)) : nan Number of models : 0 Number of parameters : 0 Number of free parameters : 0 .. GENERATED FROM PYTHON SOURCE LINES 288-294 As you can see, the dataset comes with a predefined background model out of the data reduction, but no source model has been set yet. The counts, exposure and background model maps are directly available on the dataset and can be printed and plotted: .. GENERATED FROM PYTHON SOURCE LINES 294-299 .. code-block:: Python counts = analysis.datasets["stacked"].counts counts.smooth("0.05 deg").plot_interactive() .. image-sg:: /tutorials/starting/images/sphx_glr_analysis_1_001.png :alt: analysis 1 :srcset: /tutorials/starting/images/sphx_glr_analysis_1_001.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. 
code-block:: none interactive(children=(SelectionSlider(continuous_update=False, description='Select energy:', layout=Layout(width='50%'), options=('1.00 TeV - 1.26 TeV', '1.26 TeV - 1.58 TeV', '1.58 TeV - 2.00 TeV', '2.00 TeV - 2.51 TeV', '2.51 TeV - 3.16 TeV', '3.16 TeV - 3.98 TeV', '3.98 TeV - 5.01 TeV', '5.01 TeV - 6.31 TeV', '6.31 TeV - 7.94 TeV', '7.94 TeV - 10.0 TeV'), style=SliderStyle(description_width='initial'), value='1.00 TeV - 1.26 TeV'), RadioButtons(description='Select stretch:', index=1, options=('linear', 'sqrt', 'log'), style=DescriptionStyle(description_width='initial'), value='sqrt'), Output()), _dom_classes=('widget-interact',)) .. GENERATED FROM PYTHON SOURCE LINES 300-304 We can also compute the map of the sqrt_ts (significance) of the excess counts above the background. The correlation radius to sum counts is defined in the config file. .. GENERATED FROM PYTHON SOURCE LINES 304-309 .. code-block:: Python analysis.get_excess_map() analysis.excess_map["sqrt_ts"].plot(add_cbar=True) plt.show() .. image-sg:: /tutorials/starting/images/sphx_glr_analysis_1_002.png :alt: analysis 1 :srcset: /tutorials/starting/images/sphx_glr_analysis_1_002.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none Computing excess maps. .. GENERATED FROM PYTHON SOURCE LINES 310-317 Save dataset to disk -------------------- It is common to run the preparation step independently of the likelihood fit, because the preparation of maps, PSF and energy dispersion is often slow if you have a lot of data. We first create a folder: .. GENERATED FROM PYTHON SOURCE LINES 317-322 .. code-block:: Python path = Path("analysis_1") path.mkdir(exist_ok=True) .. GENERATED FROM PYTHON SOURCE LINES 323-326 And then write the maps and IRFs to disk by calling the dedicated `~gammapy.datasets.Datasets.write` method: .. GENERATED FROM PYTHON SOURCE LINES 326-331 .. 
code-block:: Python filename = path / "crab-stacked-dataset.fits.gz" analysis.datasets[0].write(filename, overwrite=True) .. GENERATED FROM PYTHON SOURCE LINES 332-338 Model fitting ------------- Now we define a model to be fitted to the dataset. Here we define it using its YAML representation: .. GENERATED FROM PYTHON SOURCE LINES 338-369 .. code-block:: Python model_config = """ components: - name: crab type: SkyModel spatial: type: PointSpatialModel frame: icrs parameters: - name: lon_0 value: 83.63 unit: deg - name: lat_0 value: 22.014 unit: deg spectral: type: PowerLawSpectralModel parameters: - name: amplitude value: 1.0e-12 unit: cm-2 s-1 TeV-1 - name: index value: 2.0 unit: '' - name: reference value: 1.0 unit: TeV frozen: true """ .. GENERATED FROM PYTHON SOURCE LINES 370-372 Now we set the model on the analysis object: .. GENERATED FROM PYTHON SOURCE LINES 372-376 .. code-block:: Python analysis.set_models(model_config) .. rst-class:: sphx-glr-script-out .. code-block:: none Reading model. Models Component 0: SkyModel Name : crab Datasets names : None Spectral model type : PowerLawSpectralModel Spatial model type : PointSpatialModel Temporal model type : Parameters: index : 2.000 +/- 0.00 amplitude : 1.00e-12 +/- 0.0e+00 1 / (cm2 s TeV) reference (frozen): 1.000 TeV lon_0 : 83.630 +/- 0.00 deg lat_0 : 22.014 +/- 0.00 deg Component 1: FoVBackgroundModel Name : stacked-bkg Datasets names : ['stacked'] Spectral model type : PowerLawNormSpectralModel Parameters: norm : 1.000 +/- 0.00 tilt (frozen): 0.000 reference (frozen): 1.000 TeV .. GENERATED FROM PYTHON SOURCE LINES 377-379 Finally, we run the fit: .. GENERATED FROM PYTHON SOURCE LINES 381-386 .. code-block:: Python analysis.run_fit() print(analysis.fit_result) .. rst-class:: sphx-glr-script-out .. code-block:: none Fitting datasets. OptimizeResult backend : minuit method : migrad success : True message : Optimization terminated successfully.
nfev : 259 total stat : 19991.97 CovarianceResult backend : minuit method : hesse success : True message : Hesse terminated successfully. OptimizeResult backend : minuit method : migrad success : True message : Optimization terminated successfully. nfev : 259 total stat : 19991.97 CovarianceResult backend : minuit method : hesse success : True message : Hesse terminated successfully. .. GENERATED FROM PYTHON SOURCE LINES 387-389 This is how we can write the model back to file again: .. GENERATED FROM PYTHON SOURCE LINES 389-397 .. code-block:: Python filename = path / "model-best-fit.yaml" analysis.models.write(filename, overwrite=True) with filename.open("r") as f: print(f.read()) .. rst-class:: sphx-glr-script-out .. code-block:: none components: - name: crab type: SkyModel spectral: type: PowerLawSpectralModel parameters: - name: index value: 2.5563052715230388 error: 0.10317325958321796 - name: amplitude value: 4.55026401977972e-11 unit: cm-2 s-1 TeV-1 error: 3.733360190578827e-12 - name: reference value: 1.0 unit: TeV spatial: type: PointSpatialModel frame: icrs parameters: - name: lon_0 value: 83.619794257232 unit: deg error: 0.003131946268913696 - name: lat_0 value: 22.024551293715163 unit: deg error: 0.0029490222317034775 - type: FoVBackgroundModel datasets_names: - stacked spectral: type: PowerLawNormSpectralModel parameters: - name: norm value: 0.9864771491419372 error: 0.023473560586078527 - name: tilt value: 0.0 - name: reference value: 1.0 unit: TeV covariance: model-best-fit_covariance.dat metadata: creator: Gammapy 1.3.dev283+g0ce322691 date: '2024-05-06T16:05:36.207337' origin: null .. GENERATED FROM PYTHON SOURCE LINES 398-401 Flux points ~~~~~~~~~~~ .. GENERATED FROM PYTHON SOURCE LINES 401-421 .. 
code-block:: Python analysis.config.flux_points.source = "crab" # Example showing how to change the FluxPointsEstimator parameters: analysis.config.flux_points.energy.nbins = 5 config_dict = { "selection_optional": "all", "n_sigma": 2, # Number of sigma to use for asymmetric error computation "n_sigma_ul": 3, # Number of sigma to use for upper limit computation } analysis.config.flux_points.parameters = config_dict analysis.get_flux_points() # Example showing how to change just before plotting the threshold on the signal significance # (points vs upper limits), even if this has no effect with this data set. fp = analysis.flux_points.data fp.sqrt_ts_threshold_ul = 5 ax_sed, ax_residuals = analysis.flux_points.plot_fit() plt.show() .. image-sg:: /tutorials/starting/images/sphx_glr_analysis_1_003.png :alt: analysis 1 :srcset: /tutorials/starting/images/sphx_glr_analysis_1_003.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none Calculating flux points. Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Reoptimize = False ignored for iminuit backend Inferred format: gadf-sed e_ref dnde ... sqrt_ts TeV 1 / (cm2 s TeV) ... ------------------ ---------------------- ... ------------------ 1.2589254117941673 2.356263170943497e-11 ... 24.264516271942103 1.9952623149688797 8.855887646638273e-12 ... 22.400023002295445 3.1622776601683795 2.4888969467051004e-12 ... 16.7660365375951 5.011872336272724 6.15228678433255e-13 ... 11.908700583573673 7.943282347242818 2.439306769641669e-13 ... 8.300451666228708 .. 
GENERATED FROM PYTHON SOURCE LINES 422-426 The flux points can be exported to a FITS table following the format defined `here `_. .. GENERATED FROM PYTHON SOURCE LINES 426-431 .. code-block:: Python filename = path / "flux-points.fits" analysis.flux_points.write(filename, overwrite=True) .. rst-class:: sphx-glr-script-out .. code-block:: none Inferred format: gadf-sed .. GENERATED FROM PYTHON SOURCE LINES 432-435 To check that the fit is correct, we compute the map of the sqrt_ts of the excess counts above the current model. .. GENERATED FROM PYTHON SOURCE LINES 435-441 .. code-block:: Python analysis.get_excess_map() analysis.excess_map["sqrt_ts"].plot(add_cbar=True, cmap="RdBu", vmin=-5, vmax=5) plt.show() .. image-sg:: /tutorials/starting/images/sphx_glr_analysis_1_004.png :alt: analysis 1 :srcset: /tutorials/starting/images/sphx_glr_analysis_1_004.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none Computing excess maps. .. GENERATED FROM PYTHON SOURCE LINES 442-451 What’s next ----------- You can look at the same analysis without the high level interface in :doc:`/tutorials/starting/analysis_2`. You can see how to perform a 1D spectral analysis of the same data in :doc:`/tutorials/analysis-1d/spectral_analysis`. .. rst-class:: sphx-glr-timing **Total running time of the script:** (0 minutes 13.133 seconds) .. _sphx_glr_download_tutorials_starting_analysis_1.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: binder-badge .. image:: images/binder_badge_logo.svg :target: https://mybinder.org/v2/gh/gammapy/gammapy-webpage/main?urlpath=lab/tree/notebooks/dev/tutorials/starting/analysis_1.ipynb :alt: Launch binder :width: 150 px .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: analysis_1.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: analysis_1.py ` .. only:: html .. 
rst-class:: sphx-glr-signature `Gallery generated by Sphinx-Gallery `_