.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "tutorials/analysis-time/light_curve_flare.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_tutorials_analysis-time_light_curve_flare.py>`
        to download the full example code, or to run this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_tutorials_analysis-time_light_curve_flare.py:

Light curves for flares
=======================

Compute the light curve of a PKS 2155-304 flare on 10 minute time intervals.

Prerequisites
-------------

- Understanding of how the light curve estimator works; please refer to the
  :doc:`light curve notebook `.

Context
-------

Frequently, especially when studying flares of bright sources, it is necessary
to explore the time behaviour of a source on short time scales, in particular
on time scales shorter than the duration of individual observing runs. A typical
example is the flare of PKS 2155-304 during the night from July 29 to 30, 2006.
See the `following article `__.

**Objective: Compute the light curve of a PKS 2155-304 flare on 10 minute time
intervals, i.e. smaller than the duration of individual observations.**

Proposed approach
-----------------

As seen in the general presentation of the light curve estimator (see the
:doc:`light curve notebook `), Gammapy produces datasets for a given time
interval, by default that of the parent observation. To produce datasets on
smaller time steps, it is necessary to split the observations into the
required time intervals.

This is easily performed with the `~gammapy.data.Observations.select_time`
method of `~gammapy.data.Observations`. If you pass it a list of time
intervals, it will produce a new `~gammapy.data.Observations` object
containing one time-filtered observation per interval. Data reduction can
then be performed as usual, resulting in datasets defined on the required
time intervals, and light curve estimation can proceed directly.

In summary, we have to:

- Select relevant `~gammapy.data.Observations` from the `~gammapy.data.DataStore`
- Apply the time selection in our predefined time intervals to obtain a new
  `~gammapy.data.Observations`
- Perform the data reduction (in 1D or 3D)
- Define the source model
- Extract the light curve from the reduced dataset

Here, we will use the PKS 2155-304 observations from the
`H.E.S.S. first public test data release `__.
We will use time intervals of 10 minutes duration. The tutorial is implemented
with the intermediate level API.

Setup
-----

As usual, we'll start with some general imports...

.. GENERATED FROM PYTHON SOURCE LINES 65-92

.. code-block:: Python

    import logging

    import numpy as np
    import astropy.units as u
    from astropy.coordinates import Angle, SkyCoord
    from astropy.time import Time
    from regions import CircleSkyRegion

    # %matplotlib inline
    import matplotlib.pyplot as plt

    log = logging.getLogger(__name__)

    from gammapy.data import DataStore
    from gammapy.datasets import Datasets, SpectrumDataset
    from gammapy.estimators import LightCurveEstimator
    from gammapy.estimators.utils import get_rebinned_axis
    from gammapy.makers import (
        ReflectedRegionsBackgroundMaker,
        SafeMaskMaker,
        SpectrumDatasetMaker,
    )
    from gammapy.maps import MapAxis, RegionGeom
    from gammapy.modeling.models import PowerLawSpectralModel, SkyModel
    from gammapy.modeling import Fit

.. GENERATED FROM PYTHON SOURCE LINES 93-98

Select the data
---------------

We first set up the data store.
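The data store is created from the ``$GAMMAPY_DATA`` environment variable. As an
optional check (not part of the original tutorial), you can verify that it points
to the downloaded datasets before proceeding:

.. code-block:: Python

    import os

    # Optional sanity check: ``DataStore.from_dir`` below expands this variable,
    # so it must point to the folder containing ``hess-dl3-dr1``.
    print(os.environ.get("GAMMAPY_DATA", "GAMMAPY_DATA is not set!"))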
.. GENERATED FROM PYTHON SOURCE LINES 98-102

.. code-block:: Python

    data_store = DataStore.from_dir("$GAMMAPY_DATA/hess-dl3-dr1/")

.. GENERATED FROM PYTHON SOURCE LINES 103-105

Now we select observations within 2 degrees of PKS 2155-304.

.. GENERATED FROM PYTHON SOURCE LINES 105-119

.. code-block:: Python

    target_position = SkyCoord(329.71693826 * u.deg, -30.2255890 * u.deg, frame="icrs")
    selection = dict(
        type="sky_circle",
        frame="icrs",
        lon=target_position.ra,
        lat=target_position.dec,
        radius=2 * u.deg,
    )
    obs_ids = data_store.obs_table.select_observations(selection)["OBS_ID"]
    observations = data_store.get_observations(obs_ids)
    print(f"Number of selected observations : {len(observations)}")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Number of selected observations : 21

.. GENERATED FROM PYTHON SOURCE LINES 120-125

Define time intervals
---------------------

We create the list of time intervals, each of duration 10 minutes. Each time
interval is an `astropy.time.Time` object containing a start and a stop time.

.. GENERATED FROM PYTHON SOURCE LINES 125-134

.. code-block:: Python

    t0 = Time("2006-07-29T20:30")
    duration = 10 * u.min
    n_time_bins = 35
    times = t0 + np.arange(n_time_bins) * duration

    time_intervals = [Time([tstart, tstop]) for tstart, tstop in zip(times[:-1], times[1:])]
    print(time_intervals[0].mjd)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [53945.85416667 53945.86111111]

.. GENERATED FROM PYTHON SOURCE LINES 135-145

Filter the observations list in time intervals
----------------------------------------------

Here we apply the list of time intervals to the observations with
`~gammapy.data.Observations.select_time()`.

This will return a new list of observations filtered by ``time_intervals``.
For each time interval, a new observation is created that covers the
intersection of the GTIs with the time interval.

.. GENERATED FROM PYTHON SOURCE LINES 145-152

.. code-block:: Python

    short_observations = observations.select_time(time_intervals)

    # check that observations have been filtered
    print(f"Number of observations after time filtering: {len(short_observations)}\n")
    print(short_observations[1].gti)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Number of observations after time filtering: 44

    GTI info:
    - Number of GTIs: 1
    - Duration: 599.9999999999978 s
    - Start: 207520865.18400002 s MET
    - Start: 2006-07-29T20:40:00.000 (time standard: UTC)
    - Stop: 207521465.184 s MET
    - Stop: 2006-07-29T20:50:00.000 (time standard: UTC)

.. GENERATED FROM PYTHON SOURCE LINES 153-159

As we can see, we now have observations whose duration equals the chosen time
step. Data reduction and light curve extraction can now proceed exactly as
before.
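As an optional check (not part of the original tutorial), one can sum the good
time intervals of the filtered observations to confirm that the total live time
matches the selected time slices:

.. code-block:: Python

    # Illustrative check: total good time covered by the time-filtered observations.
    # Each 10 minute interval overlapping a run contributes at most 600 s.
    total_gti = sum(obs.gti.time_sum.to_value("min") for obs in short_observations)
    print(f"Total good time after filtering: {total_gti:.1f} min")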
.. GENERATED FROM PYTHON SOURCE LINES 162-170

Building 1D datasets from the new observations
----------------------------------------------

Here we will perform the data reduction in 1D with reflected regions.

*Beware: with small time intervals, the background normalization with OFF
regions might become problematic.*

.. GENERATED FROM PYTHON SOURCE LINES 173-183

Defining the geometry
~~~~~~~~~~~~~~~~~~~~~

We define the energy axes. As usual, the true energy axis has to cover a wider
range to ensure a good coverage of the chosen measured energy range.

We need to define the ON extraction region. Its size follows typical spectral
extraction regions for H.E.S.S. analyses.

.. GENERATED FROM PYTHON SOURCE LINES 183-196

.. code-block:: Python

    # Target definition
    energy_axis = MapAxis.from_energy_bounds("0.4 TeV", "20 TeV", nbin=10)
    energy_axis_true = MapAxis.from_energy_bounds(
        "0.1 TeV", "40 TeV", nbin=20, name="energy_true"
    )

    on_region_radius = Angle("0.11 deg")
    on_region = CircleSkyRegion(center=target_position, radius=on_region_radius)

    geom = RegionGeom.create(region=on_region, axes=[energy_axis])

.. GENERATED FROM PYTHON SOURCE LINES 197-203

Creation of the data reduction makers
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We now create the dataset and background makers for the selected geometry.

.. GENERATED FROM PYTHON SOURCE LINES 203-211

.. code-block:: Python

    dataset_maker = SpectrumDatasetMaker(
        containment_correction=True, selection=["counts", "exposure", "edisp"]
    )
    bkg_maker = ReflectedRegionsBackgroundMaker()
    safe_mask_masker = SafeMaskMaker(methods=["aeff-max"], aeff_percent=10)

.. GENERATED FROM PYTHON SOURCE LINES 212-217

Creation of the datasets
~~~~~~~~~~~~~~~~~~~~~~~~

Now we perform the actual data reduction in the ``time_intervals``.

.. GENERATED FROM PYTHON SOURCE LINES 217-230

.. code-block:: Python

    datasets = Datasets()

    dataset_empty = SpectrumDataset.create(geom=geom, energy_axis_true=energy_axis_true)

    for obs in short_observations:
        dataset = dataset_maker.run(dataset_empty.copy(), obs)

        dataset_on_off = bkg_maker.run(dataset, obs)
        dataset_on_off = safe_mask_masker.run(dataset_on_off, obs)
        datasets.append(dataset_on_off)
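Before defining the model and fitting, it can be useful to inspect the statistics
of the reduced datasets: with 10 minute slices, the ON and OFF counts per dataset
can be low, which is exactly when the background normalization warning above
matters. A minimal check (not part of the original tutorial) is:

.. code-block:: Python

    # Illustrative check: per-dataset statistics (counts, OFF counts, livetime, ...)
    # of the reduced ON/OFF datasets. Low OFF counts flag time bins where the
    # reflected-regions background normalization is poorly constrained.
    info = datasets.info_table(cumulative=False)
    print(info)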
.. GENERATED FROM PYTHON SOURCE LINES 231-242

Define underlying model
-----------------------

Since we use forward folding to obtain the flux points in each bin, the exact
values will depend on the underlying model. In this example, we use a power
law, as in the `reference paper `__.

As we are only using spectral datasets, we do not need any spatial model.

**Note**: All time bins must share the same spectral model. To see how to
investigate spectral variability, see the
:doc:`time resolved spectroscopy notebook `.

.. GENERATED FROM PYTHON SOURCE LINES 242-246

.. code-block:: Python

    spectral_model = PowerLawSpectralModel(amplitude=1e-10 * u.Unit("1 / (cm2 s TeV)"))
    sky_model = SkyModel(spatial_model=None, spectral_model=spectral_model, name="pks2155")

.. GENERATED FROM PYTHON SOURCE LINES 247-252

Assign the model to all datasets
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

We assign the spectral model to each dataset and run a joint fit.

.. GENERATED FROM PYTHON SOURCE LINES 252-260

.. code-block:: Python

    datasets.models = sky_model

    fit = Fit()
    result = fit.run(datasets)
    print(result.models.to_parameters_table())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

     model  type    name      value          unit      ... min max frozen link prior
    ------- ---- --------- ---------- -------------- ... --- --- ------ ---- -----
    pks2155      index      3.5392e+00                ... nan nan  False
    pks2155      amplitude  9.4675e-11 TeV-1 s-1 cm-2 ... nan nan  False
    pks2155      reference  1.0000e+00            TeV ... nan nan   True

.. GENERATED FROM PYTHON SOURCE LINES 261-272

Extract the light curve
-----------------------

We first create the `~gammapy.estimators.LightCurveEstimator` for the list of
datasets we just produced. We give the estimator the name of the source
component to be fitted. We can directly compute the light curve in multiple
energy bins by supplying a list of ``energy_edges``.

By default, the norm of the likelihood profile is scanned from 0.2 to 5.0.
Here, we increase the maximum value to 10.0 because we are dealing with a
large flare.

.. GENERATED FROM PYTHON SOURCE LINES 272-283

.. code-block:: Python

    lc_maker_1d = LightCurveEstimator(
        energy_edges=[0.7, 1, 20] * u.TeV,
        source="pks2155",
        time_intervals=time_intervals,
        selection_optional="all",
        n_jobs=4,
    )
    lc_maker_1d.norm.scan_max = 10

.. GENERATED FROM PYTHON SOURCE LINES 284-289

We can now perform the light curve extraction itself. To compare with the
`reference paper `__, we select the 0.7-20 TeV range.

.. GENERATED FROM PYTHON SOURCE LINES 289-293

.. code-block:: Python

    lc_1d = lc_maker_1d.run(datasets)

.. GENERATED FROM PYTHON SOURCE LINES 294-296

Finally, we plot the result for the 1D light curve:

.. GENERATED FROM PYTHON SOURCE LINES 296-303

.. code-block:: Python

    plt.figure(figsize=(8, 6))
    plt.tight_layout()
    plt.subplots_adjust(bottom=0.3)
    lc_1d.plot(marker="o", axis_name="time", sed_type="flux")
    plt.show()

.. image-sg:: /tutorials/analysis-time/images/sphx_glr_light_curve_flare_001.png
    :alt: light curve flare
    :srcset: /tutorials/analysis-time/images/sphx_glr_light_curve_flare_001.png
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 304-309

Once obtained, light curves can be rebinned using the likelihood profiles.
Here, we rebin 3 adjacent bins together to obtain 30 minute bins. We first
slice ``lc_1d`` to obtain the light curve in the first energy bin.

.. GENERATED FROM PYTHON SOURCE LINES 309-328

.. code-block:: Python

    slices = {"energy": slice(0, 1)}
    sliced_lc = lc_1d.slice_by_idx(slices)
    print(sliced_lc)

    axis_new = get_rebinned_axis(
        sliced_lc, method="fixed-bins", group_size=3, axis_name="time"
    )
    print(axis_new)

    lc_new = sliced_lc.resample_axis(axis_new)

    plt.figure(figsize=(8, 6))
    plt.tight_layout()
    plt.subplots_adjust(bottom=0.3)
    ax = sliced_lc.plot(label="original")
    lc_new.plot(ax=ax, label="rebinned")
    plt.legend()
    plt.show()

.. image-sg:: /tutorials/analysis-time/images/sphx_glr_light_curve_flare_002.png
    :alt: light curve flare
    :srcset: /tutorials/analysis-time/images/sphx_glr_light_curve_flare_002.png
    :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    FluxPoints
    ----------

      geom                 : RegionGeom
      axes                 : ['lon', 'lat', 'energy', 'time']
      shape                : (1, 1, 1, 34)
      quantities           : ['norm', 'norm_err', 'norm_errn', 'norm_errp', 'norm_ul', 'norm_sensitivity', 'ts', 'npred', 'npred_excess', 'stat', 'stat_null', 'stat_scan', 'counts', 'success']
      ref. model           : pl
      n_sigma              : 1
      n_sigma_ul           : 2
      sqrt_ts_threshold_ul : 2
      sed type init        : likelihood

    TimeMapAxis
    -----------

        name           : time
        nbins          : 12
        reference time : 2006-07-29 20:31:05.184
        scale          : tt
        time min.      : 2006-07-29 20:31:05.184
        time max.      : 2006-07-30 02:11:05.184
        total time     : 5.6666666666583865 h

.. GENERATED FROM PYTHON SOURCE LINES 329-331

We can use the sliced light curve to study the variability, as shown in the
:doc:`/tutorials/analysis-time/variability_estimation` tutorial.
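As a pointer towards that tutorial, here is a minimal sketch (not part of the
original example; it assumes ``compute_fvar`` is available in
``gammapy.estimators.utils``, as in recent Gammapy versions) of estimating the
fractional variability from the sliced light curve:

.. code-block:: Python

    from gammapy.estimators.utils import compute_fvar

    # Fractional variability of the sliced light curve: returns the value and
    # its uncertainty (one entry per energy bin, here a single bin).
    fvar, fvar_err = compute_fvar(sliced_lc, flux_quantity="flux")
    print(fvar, fvar_err)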
.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 52.327 seconds)

.. _sphx_glr_download_tutorials_analysis-time_light_curve_flare.py:

.. only:: html

    .. container:: sphx-glr-footer sphx-glr-footer-example

        .. container:: binder-badge

            .. image:: images/binder_badge_logo.svg
                :target: https://mybinder.org/v2/gh/gammapy/gammapy-webpage/v2.0?urlpath=lab/tree/notebooks/2.0/tutorials/analysis-time/light_curve_flare.ipynb
                :alt: Launch binder
                :width: 150 px

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: light_curve_flare.ipynb <light_curve_flare.ipynb>`

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: light_curve_flare.py <light_curve_flare.py>`

        .. container:: sphx-glr-download sphx-glr-download-zip

            :download:`Download zipped: light_curve_flare.zip <light_curve_flare.zip>`

.. only:: html

    .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_