.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "tutorials/analysis-1d/spectral_analysis_hli.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_tutorials_analysis-1d_spectral_analysis_hli.py>`
        to download the full example code, or to run this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_tutorials_analysis-1d_spectral_analysis_hli.py:

Spectral analysis with the HLI
==============================

Introduction to 1D analysis using the Gammapy high level interface.

Prerequisites
-------------

- Understanding the Gammapy data workflow, in particular what DL3 events and
  instrument response functions (IRFs) are.

Context
-------

This notebook is an introduction to Gammapy analysis using the high level
interface. A Gammapy analysis consists of two main steps.

The first step is data reduction: user-selected observations are reduced to a
geometry defined by the user. The geometry can be 1D (a spectrum extracted
from a given region) or 3D (with a sky projection and an energy axis). The
resulting reduced data and instrument response functions (IRFs) are called
datasets in Gammapy.

The second step consists of setting a physical model on the datasets and
fitting it to obtain the relevant physical information.

**Objective: Create a 1D dataset of the Crab using the H.E.S.S. DL3 data
release 1 and perform a simple model fitting of the Crab nebula.**

Proposed approach
-----------------

This notebook uses the high level `~gammapy.analysis.Analysis` class to
orchestrate the data reduction and run the fits. In its current state,
`Analysis` supports the standard analysis cases of joint or stacked 3D and 1D
analyses. It is instantiated with an `~gammapy.analysis.AnalysisConfig` object
that gives access to the analysis parameters either directly or via a YAML
config file.

To see what is happening under the hood and to get an idea of the internal
API, a second notebook performs the same analysis without using the
`~gammapy.analysis.Analysis` class.

In summary, we have to:

- Create an `~gammapy.analysis.AnalysisConfig` object and set the analysis
  configuration:

  - Define which observations to use
  - Define the geometry of the dataset (data and IRFs)
  - Define the model we want to fit to the dataset.

- Instantiate an `~gammapy.analysis.Analysis` object from this configuration
  and run the different analysis steps:

  - Observation selection
  - Data reduction
  - Model fitting
  - Estimating flux points

.. GENERATED FROM PYTHON SOURCE LINES 65-71

.. code-block:: Python

    from pathlib import Path

    # %matplotlib inline
    import matplotlib.pyplot as plt

.. GENERATED FROM PYTHON SOURCE LINES 72-75

Setup
-----

.. GENERATED FROM PYTHON SOURCE LINES 75-79

.. code-block:: Python

    from IPython.display import display

    from gammapy.analysis import Analysis, AnalysisConfig
    from gammapy.modeling.models import Models

.. GENERATED FROM PYTHON SOURCE LINES 80-82

Check setup
-----------

.. GENERATED FROM PYTHON SOURCE LINES 82-86

.. code-block:: Python

    from gammapy.utils.check import check_tutorials_setup

    check_tutorials_setup()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    System:
        python_executable : /home/runner/work/gammapy-docs/gammapy-docs/gammapy/.tox/build_docs/bin/python
        python_version    : 3.9.20
        machine           : x86_64
        system            : Linux

    Gammapy package:
        version : 1.3
        path    : /home/runner/work/gammapy-docs/gammapy-docs/gammapy/.tox/build_docs/lib/python3.9/site-packages/gammapy

    Other packages:
        numpy      : 1.26.4
        scipy      : 1.13.1
        astropy    : 5.2.2
        regions    : 0.8
        click      : 8.1.7
        yaml       : 6.0.2
        IPython    : 8.18.1
        jupyterlab : not installed
        matplotlib : 3.9.2
        pandas     : not installed
        healpy     : 1.17.3
        iminuit    : 2.30.1
        sherpa     : 4.16.1
        naima      : 0.10.0
        emcee      : 3.1.6
        corner     : 2.2.3
        ray        : 2.39.0

    Gammapy environment variables:
        GAMMAPY_DATA : /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/1.3
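Before configuring the analysis, it can be useful to verify that the data are
actually available. Here is a minimal sketch, assuming the H.E.S.S. DL3-DR1
files sit under `$GAMMAPY_DATA/hess-dl3-dr1` as in the configuration used
below:

.. code-block:: Python

    # Sketch: list the observations shipped with the H.E.S.S. DL3-DR1 files
    from gammapy.data import DataStore

    # DataStore.from_dir expands environment variables such as $GAMMAPY_DATA
    data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1")
    print(data_store)  # summary of the available observations and IRFs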
.. GENERATED FROM PYTHON SOURCE LINES 87-99

Analysis configuration
----------------------

To configure the analysis we use the YAML data format. YAML is a
machine-readable serialisation format that is also easy for humans to read.
In this tutorial we will write the configuration simply as a Python string,
but of course the file can be created and modified with any text editor of
your choice.

Here is what the configuration for our analysis looks like:

.. GENERATED FROM PYTHON SOURCE LINES 99-131

.. code-block:: Python

    yaml_str = """
    observations:
        datastore: $GAMMAPY_DATA/hess-dl3-dr1
        obs_cone: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 5 deg}

    datasets:
        type: 1d
        stack: true
        geom:
            axes:
                energy: {min: 0.5 TeV, max: 30 TeV, nbins: 20}
                energy_true: {min: 0.1 TeV, max: 50 TeV, nbins: 40}
        on_region: {frame: icrs, lon: 83.633 deg, lat: 22.014 deg, radius: 0.11 deg}
        containment_correction: true
        safe_mask:
            methods: ['aeff-default', 'aeff-max']
            parameters: {aeff_percent: 0.1}
        background:
            method: reflected

    fit:
        fit_range: {min: 1 TeV, max: 20 TeV}

    flux_points:
        energy: {min: 1 TeV, max: 20 TeV, nbins: 8}
        source: 'crab'
    """

    config = AnalysisConfig.from_yaml(yaml_str)
    print(config)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    AnalysisConfig

        general:
            log:
                level: info
                filename: null
                filemode: null
                format: null
                datefmt: null
            outdir: .
            n_jobs: 1
            datasets_file: null
            models_file: null
        observations:
            datastore: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/1.3/hess-dl3-dr1
            obs_ids: []
            obs_file: null
            obs_cone:
                frame: icrs
                lon: 83.633 deg
                lat: 22.014 deg
                radius: 5.0 deg
            obs_time:
                start: null
                stop: null
            required_irf:
            - aeff
            - edisp
            - psf
            - bkg
        datasets:
            type: 1d
            stack: true
            geom:
                wcs:
                    skydir:
                        frame: null
                        lon: null
                        lat: null
                    binsize: 0.02 deg
                    width:
                        width: 5.0 deg
                        height: 5.0 deg
                    binsize_irf: 0.2 deg
                selection:
                    offset_max: 2.5 deg
                axes:
                    energy:
                        min: 0.5 TeV
                        max: 30.0 TeV
                        nbins: 20
                    energy_true:
                        min: 0.1 TeV
                        max: 50.0 TeV
                        nbins: 40
            map_selection:
            - counts
            - exposure
            - background
            - psf
            - edisp
            background:
                method: reflected
                exclusion: null
                parameters: {}
            safe_mask:
                methods:
                - aeff-default
                - aeff-max
                parameters:
                    aeff_percent: 0.1
            on_region:
                frame: icrs
                lon: 83.633 deg
                lat: 22.014 deg
                radius: 0.11 deg
            containment_correction: true
        fit:
            fit_range:
                min: 1.0 TeV
                max: 20.0 TeV
        flux_points:
            energy:
                min: 1.0 TeV
                max: 20.0 TeV
                nbins: 8
            source: crab
            parameters:
                selection_optional: all
        excess_map:
            correlation_radius: 0.1 deg
            parameters: {}
            energy_edges:
                min: null
                max: null
                nbins: null
        light_curve:
            time_intervals:
                start: null
                stop: null
            energy_edges:
                min: null
                max: null
                nbins: null
            source: source
            parameters:
                selection_optional: all
        metadata:
            creator: Gammapy 1.3
            date: '2024-11-26T10:10:19.746294'
            origin: null
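The configuration is a nested object whose attribute names mirror the YAML
keys, so individual parameters can also be changed programmatically. A minimal
sketch, assuming values assigned to config attributes are validated the same
way as values read from YAML (the specific tweaks below are hypothetical, not
used in this tutorial):

.. code-block:: Python

    # Sketch: override single configuration values after creation
    config.datasets.geom.axes.energy.nbins = 25  # finer reconstructed-energy binning
    config.fit.fit_range.min = "1 TeV"  # strings with units should be parsed on assignment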
.. GENERATED FROM PYTHON SOURCE LINES 132-135

Note that you can save this string into a YAML file and load it as follows:

.. GENERATED FROM PYTHON SOURCE LINES 135-141

.. code-block:: Python

    # config = AnalysisConfig.read("config-1d.yaml")
    # # the AnalysisConfig gives access to the various parameters used, from logging to reduced dataset geometries
    # print(config)

.. GENERATED FROM PYTHON SOURCE LINES 142-145

Using data stored on your computer
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 148-157

Here, we want to use Crab runs from the H.E.S.S. DL3-DR1. We have defined the
datastore and a cone search selecting observations pointing within 5 degrees
of the Crab nebula. Parameters can be set directly or as a Python dict.

Do not forget to set up your environment variable `$GAMMAPY_DATA` to point to
your local directory containing the H.E.S.S. DL3-DR1, as described in
:ref:`quickstart-setup`.

.. GENERATED FROM PYTHON SOURCE LINES 160-163

Setting the exclusion mask
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 166-171

In order to properly adjust the background normalisation on regions free of
gamma-ray signal, one needs to define an exclusion mask. For this tutorial, we
use the following one:
`$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz`

.. GENERATED FROM PYTHON SOURCE LINES 171-177

.. code-block:: Python

    config.datasets.background.exclusion = (
        "$GAMMAPY_DATA/joint-crab/exclusion/exclusion_mask_crab.fits.gz"
    )

.. GENERATED FROM PYTHON SOURCE LINES 178-181

We're all set. But before we go on, let's see how to save or import
`~gammapy.analysis.AnalysisConfig` objects through YAML files.

.. GENERATED FROM PYTHON SOURCE LINES 184-189

Using YAML configuration files for setting/writing the Data Reduction parameters
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

One can export/import the `~gammapy.analysis.AnalysisConfig` to/from a YAML file.

.. GENERATED FROM PYTHON SOURCE LINES 189-196

.. code-block:: Python

    config.write("config.yaml", overwrite=True)

    config = AnalysisConfig.read("config.yaml")
    print(config)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    AnalysisConfig

        general:
            log:
                level: info
                filename: null
                filemode: null
                format: null
                datefmt: null
            outdir: .
            n_jobs: 1
            datasets_file: null
            models_file: null
        observations:
            datastore: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/1.3/hess-dl3-dr1
            obs_ids: []
            obs_file: null
            obs_cone:
                frame: icrs
                lon: 83.633 deg
                lat: 22.014 deg
                radius: 5.0 deg
            obs_time:
                start: null
                stop: null
            required_irf:
            - aeff
            - edisp
            - psf
            - bkg
        datasets:
            type: 1d
            stack: true
            geom:
                wcs:
                    skydir:
                        frame: null
                        lon: null
                        lat: null
                    binsize: 0.02 deg
                    width:
                        width: 5.0 deg
                        height: 5.0 deg
                    binsize_irf: 0.2 deg
                selection:
                    offset_max: 2.5 deg
                axes:
                    energy:
                        min: 0.5 TeV
                        max: 30.0 TeV
                        nbins: 20
                    energy_true:
                        min: 0.1 TeV
                        max: 50.0 TeV
                        nbins: 40
            map_selection:
            - counts
            - exposure
            - background
            - psf
            - edisp
            background:
                method: reflected
                exclusion: /home/runner/work/gammapy-docs/gammapy-docs/gammapy-datasets/1.3/joint-crab/exclusion/exclusion_mask_crab.fits.gz
                parameters: {}
            safe_mask:
                methods:
                - aeff-default
                - aeff-max
                parameters:
                    aeff_percent: 0.1
            on_region:
                frame: icrs
                lon: 83.633 deg
                lat: 22.014 deg
                radius: 0.11 deg
            containment_correction: true
        fit:
            fit_range:
                min: 1.0 TeV
                max: 20.0 TeV
        flux_points:
            energy:
                min: 1.0 TeV
                max: 20.0 TeV
                nbins: 8
            source: crab
            parameters:
                selection_optional: all
        excess_map:
            correlation_radius: 0.1 deg
            parameters: {}
            energy_edges:
                min: null
                max: null
                nbins: null
        light_curve:
            time_intervals:
                start: null
                stop: null
            energy_edges:
                min: null
                max: null
                nbins: null
            source: source
            parameters:
                selection_optional: all
        metadata:
            creator: Gammapy 1.3
            date: '2024-11-26T10:10:19.771508'
            origin: null
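Since the configuration round-trips through YAML, a stored file can be
re-read and partially overridden before the analysis is run. A minimal sketch
(the run selection below is a hypothetical choice, not part of this
tutorial):

.. code-block:: Python

    # Sketch: re-read the stored configuration and restrict the run list
    config = AnalysisConfig.read("config.yaml")
    # explicit obs_ids take the place of the cone search (hypothetical subset)
    config.observations.obs_ids = [23523, 23526]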
.. GENERATED FROM PYTHON SOURCE LINES 197-200

Running the first step of the analysis: the Data Reduction
----------------------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 203-209

Configuration of the analysis
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We first create an `~gammapy.analysis.Analysis` object from our configuration.

.. GENERATED FROM PYTHON SOURCE LINES 209-213

.. code-block:: Python

    analysis = Analysis(config)

.. GENERATED FROM PYTHON SOURCE LINES 214-220

Observation selection
~~~~~~~~~~~~~~~~~~~~~

We can directly select and load the observations from disk using
`~gammapy.analysis.Analysis.get_observations()`:

.. GENERATED FROM PYTHON SOURCE LINES 220-224

.. code-block:: Python

    analysis.get_observations()

.. GENERATED FROM PYTHON SOURCE LINES 225-228

The observations are now available on the `Analysis` object. The selection
corresponds to the following ids:

.. GENERATED FROM PYTHON SOURCE LINES 228-232

.. code-block:: Python

    print(analysis.observations.ids)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    ['23523', '23526', '23559', '23592']

.. GENERATED FROM PYTHON SOURCE LINES 233-236

To see how to explore observations, please refer to the following notebooks:
:doc:`/tutorials/data/cta` or :doc:`/tutorials/data/hess`

.. GENERATED FROM PYTHON SOURCE LINES 239-246

Running the Data Reduction
~~~~~~~~~~~~~~~~~~~~~~~~~~

Now we proceed to the data reduction. In the config file we have chosen a 1D
geometry with an ON region and reconstructed and true energy axes, and decided
to stack the resulting datasets. We can run the reduction using
`.get_datasets()`:

.. GENERATED FROM PYTHON SOURCE LINES 248-251

.. code-block:: Python

    analysis.get_datasets()
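Before printing anything, a quick visual check of the reduced dataset is often
worthwhile. A short sketch, assuming an interactive matplotlib backend and
that the stacked dataset supports the usual `peek()` quick-look method:

.. code-block:: Python

    # Sketch: quick-look plots (counts, exposure, energy dispersion)
    # of the stacked spectrum dataset
    analysis.datasets[0].peek()
    plt.show()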
.. GENERATED FROM PYTHON SOURCE LINES 252-258

Results exploration
~~~~~~~~~~~~~~~~~~~

As we have chosen to stack the data, we can print the content of the single
resulting dataset:

.. GENERATED FROM PYTHON SOURCE LINES 258-262

.. code-block:: Python

    print(analysis.datasets[0])

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    SpectrumDatasetOnOff
    --------------------

      Name                            : stacked

      Total counts                    : 427
      Total background counts         : 25.86
      Total excess counts             : 401.14

      Predicted counts                : 43.14
      Predicted background counts     : 43.14
      Predicted excess counts         : nan

      Exposure min                    : 2.90e+07 m2 s
      Exposure max                    : 2.64e+09 m2 s

      Number of total bins            : 20
      Number of fit bins              : 18

      Fit statistic type              : wstat
      Fit statistic value (-2 log(L)) : 1396.10

      Number of models                : 0
      Number of parameters            : 0
      Number of free parameters       : 0

      Total counts_off                : 581
      Acceptance                      : 70
      Acceptance off                  : 1756

.. GENERATED FROM PYTHON SOURCE LINES 263-270

As you can see, the dataset uses WStat, with the background computed with the
reflected background method during the data reduction, but no source model has
been set yet.

The counts, exposure, background, etc. are directly available on the dataset
and can be printed:

.. GENERATED FROM PYTHON SOURCE LINES 270-280

.. code-block:: Python

    info_table = analysis.datasets.info_table()
    info_table

    print(
        f"Tobs={info_table['livetime'].to('h')[0]:.1f} Excess={info_table['excess'].value[0]:.1f} \
    Significance={info_table['sqrt_ts'][0]:.2f}"
    )

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Tobs=1.8 h Excess=401.1 Significance=37.04

.. GENERATED FROM PYTHON SOURCE LINES 281-288

Save dataset to disk
~~~~~~~~~~~~~~~~~~~~

It is common to run the preparation step independently of the likelihood fit,
because the preparation of counts, collection area and energy dispersion is
often slow if you have a lot of data. We first create a folder:

.. GENERATED FROM PYTHON SOURCE LINES 288-293

.. code-block:: Python

    path = Path("hli_spectrum_analysis")
    path.mkdir(exist_ok=True)

.. GENERATED FROM PYTHON SOURCE LINES 294-297

And then write the stacked dataset to disk by calling the dedicated
`write()` method:

.. GENERATED FROM PYTHON SOURCE LINES 297-302

.. code-block:: Python

    filename = path / "crab-stacked-dataset.fits.gz"
    analysis.datasets.write(filename, overwrite=True)

.. GENERATED FROM PYTHON SOURCE LINES 303-306

Model fitting
-------------

.. GENERATED FROM PYTHON SOURCE LINES 309-317

Creation of the model
~~~~~~~~~~~~~~~~~~~~~

First, let's create a model to be adjusted. As we are performing a 1D
analysis, only a spectral model is needed within the `SkyModel` object. Here
is a pre-defined YAML configuration created for this 1D analysis:

.. GENERATED FROM PYTHON SOURCE LINES 317-345

.. code-block:: Python

    model_str = """
    components:
    - name: crab
      type: SkyModel
      spectral:
        type: PowerLawSpectralModel
        parameters:
        - name: index
          frozen: false
          scale: 1.0
          unit: ''
          value: 2.6
        - name: amplitude
          frozen: false
          scale: 1.0
          unit: cm-2 s-1 TeV-1
          value: 5.0e-11
        - name: reference
          frozen: true
          scale: 1.0
          unit: TeV
          value: 1.0
    """

    model_1d = Models.from_yaml(model_str)
    print(model_1d)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Models

    Component 0: SkyModel

      Name                      : crab
      Datasets names            : None
      Spectral model type       : PowerLawSpectralModel
      Spatial model type        :
      Temporal model type       :
      Parameters:
        index                   :      2.600   +/-    0.00
        amplitude               :   5.00e-11   +/- 0.0e+00 1 / (cm2 s TeV)
        reference       (frozen):      1.000   TeV

.. GENERATED FROM PYTHON SOURCE LINES 346-348

Or load it from a YAML file, e.g.:

.. GENERATED FROM PYTHON SOURCE LINES 348-353

.. code-block:: Python

    # model_1d = Models.read("model-1d.yaml")
    # print(model_1d)

.. GENERATED FROM PYTHON SOURCE LINES 354-356

Now we set the model on the analysis object:

.. GENERATED FROM PYTHON SOURCE LINES 356-360

.. code-block:: Python

    analysis.set_models(model_1d)
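Equivalently, the same model can be built directly from the model classes
instead of YAML. A short sketch, not used further in this tutorial (the
variable names are illustrative):

.. code-block:: Python

    # Sketch: class-based equivalent of the YAML model definition above
    from gammapy.modeling.models import PowerLawSpectralModel, SkyModel

    pwl = PowerLawSpectralModel(
        index=2.6, amplitude="5e-11 cm-2 s-1 TeV-1", reference="1 TeV"
    )
    model_1d_alt = SkyModel(spectral_model=pwl, name="crab")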
.. GENERATED FROM PYTHON SOURCE LINES 361-368

Setting fitting parameters
~~~~~~~~~~~~~~~~~~~~~~~~~~

`Analysis` can perform a few modeling and fitting tasks besides data
reduction. The fit parameters then have to be passed to the configuration
object.

.. GENERATED FROM PYTHON SOURCE LINES 371-374

Running the fit
~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 376-379

.. code-block:: Python

    analysis.run_fit()

.. GENERATED FROM PYTHON SOURCE LINES 380-383

Exploration of the fit results
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 383-389

.. code-block:: Python

    print(analysis.fit_result)

    display(model_1d.to_parameters_table())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    OptimizeResult

        backend    : minuit
        method     : migrad
        success    : True
        message    : Optimization terminated successfully.
        nfev       : 37
        total stat : 10.29

    CovarianceResult

        backend    : minuit
        method     : hesse
        success    : True
        message    : Hesse terminated successfully.

    model type    name      value         unit      ... min max frozen link prior
    ----- ---- --------- ---------- -------------- ... --- --- ------ ---- -----
     crab      index     2.6768e+00                ... nan nan  False
     crab      amplitude 4.6795e-11 cm-2 s-1 TeV-1 ... nan nan  False
     crab      reference 1.0000e+00            TeV ... nan nan   True

.. GENERATED FROM PYTHON SOURCE LINES 390-393

To check that the fit is correct, we plot the excess spectrum together with
the predicted counts.

.. GENERATED FROM PYTHON SOURCE LINES 393-401

.. code-block:: Python

    ax_spectrum, ax_residuals = analysis.datasets[0].plot_fit()
    ax_spectrum.set_ylim(0.1, 200)
    ax_spectrum.set_xlim(0.2, 60)
    ax_residuals.set_xlim(0.2, 60)
    plt.show()

.. image-sg:: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_001.png
   :alt: spectral analysis hli
   :srcset: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 402-407

Serialisation of the fit result
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This is how we can write the model back to file again:

.. GENERATED FROM PYTHON SOURCE LINES 407-415

.. code-block:: Python

    filename = path / "model-best-fit.yaml"
    analysis.models.write(filename, overwrite=True)

    with filename.open("r") as f:
        print(f.read())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    components:
    -   name: crab
        type: SkyModel
        spectral:
            type: PowerLawSpectralModel
            parameters:
            -   name: index
                value: 2.6768369882260035
                error: 0.1035002116788955
            -   name: amplitude
                value: 4.679478008080033e-11
                unit: cm-2 s-1 TeV-1
                error: 4.678683717532289e-12
            -   name: reference
                value: 1.0
                unit: TeV
        covariance: model-best-fit_covariance.dat
    metadata:
        creator: Gammapy 1.3
        date: '2024-11-26T10:10:23.327335'
        origin: null

.. GENERATED FROM PYTHON SOURCE LINES 416-419

Creation of the Flux points
---------------------------

.. GENERATED FROM PYTHON SOURCE LINES 422-425

Running the estimation
~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 425-433

.. code-block:: Python

    analysis.get_flux_points()

    crab_fp = analysis.flux_points.data
    crab_fp_table = crab_fp.to_table(sed_type="dnde", formatted=True)
    display(crab_fp_table)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

     e_ref  e_min  e_max       dnde       ... is_ul counts success    norm_scan
      TeV    TeV    TeV  1 / (cm2 s TeV)  ...
    ------ ------ ------ ---------------- ... ----- ------ ------- --------------
     1.134  0.924  1.392        2.835e-11 ... False   56.0    True 0.200 .. 5.000
     1.708  1.392  2.096        1.193e-11 ... False  102.0    True 0.200 .. 5.000
     2.572  2.096  3.156        4.325e-12 ... False   71.0    True 0.200 .. 5.000
     3.873  3.156  4.753        1.002e-12 ... False   31.0    True 0.200 .. 5.000
     5.833  4.753  7.158        4.655e-13 ... False   24.0    True 0.200 .. 5.000
     7.929  7.158  8.784        1.527e-13 ... False    6.0    True 0.200 .. 5.000
    10.779  8.784 13.228        9.692e-14 ... False   11.0    True 0.200 .. 5.000
    16.233 13.228 19.921        1.523e-14 ... False    3.0    True 0.200 .. 5.000
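The ``is_ul`` flags above follow the estimator's default significance
threshold. A minimal sketch of tightening it, assuming the flux points object
exposes the ``sqrt_ts_threshold_ul`` attribute (an assumption, not shown in
this tutorial):

.. code-block:: Python

    # Sketch: flag every point below sqrt(TS)=3 as an upper limit instead;
    # this only changes the is_ul flags used by tables and plots
    crab_fp.sqrt_ts_threshold_ul = 3
    display(crab_fp.to_table(sed_type="dnde", formatted=True))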
.. GENERATED FROM PYTHON SOURCE LINES 434-436

Let's plot the flux points with their likelihood profiles:

.. GENERATED FROM PYTHON SOURCE LINES 436-444

.. code-block:: Python

    fig, ax_sed = plt.subplots()
    crab_fp.plot(ax=ax_sed, sed_type="e2dnde", color="darkorange")
    ax_sed.set_ylim(1.0e-12, 2.0e-10)
    ax_sed.set_xlim(0.5, 40)
    crab_fp.plot_ts_profiles(ax=ax_sed, sed_type="e2dnde")
    plt.show()

.. image-sg:: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_002.png
   :alt: spectral analysis hli
   :srcset: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 445-452

Serialisation of the results
~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The flux points can be exported to a FITS table following the flux points
format defined in the gamma-astro-data-formats specifications.

.. GENERATED FROM PYTHON SOURCE LINES 452-457

.. code-block:: Python

    filename = path / "flux-points.fits"
    analysis.flux_points.write(filename, overwrite=True)

.. GENERATED FROM PYTHON SOURCE LINES 458-461

Plotting the final results of the 1D Analysis
---------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 464-467

We can plot the spectral fit with its error band, overlaid with the flux
points:

.. GENERATED FROM PYTHON SOURCE LINES 467-473

.. code-block:: Python

    ax_sed, ax_residuals = analysis.flux_points.plot_fit()
    ax_sed.set_ylim(1.0e-12, 1.0e-9)
    ax_sed.set_xlim(0.5, 40)
    plt.show()

.. image-sg:: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_003.png
   :alt: spectral analysis hli
   :srcset: /tutorials/analysis-1d/images/sphx_glr_spectral_analysis_hli_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 474-483

What's next?
------------

You can look at the same analysis without the high level interface in
:doc:`/tutorials/analysis-1d/spectral_analysis`.

Since the best-fit model can be stored, you can overlay the fit results of
both methods on a single plot.

.. _sphx_glr_download_tutorials_analysis-1d_spectral_analysis_hli.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/gammapy/gammapy-webpage/v1.3?urlpath=lab/tree/notebooks/1.3/tutorials/analysis-1d/spectral_analysis_hli.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: spectral_analysis_hli.ipynb <spectral_analysis_hli.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: spectral_analysis_hli.py <spectral_analysis_hli.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: spectral_analysis_hli.zip <spectral_analysis_hli.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_