This is a fixed-text formatted version of a Jupyter notebook

Binned light curve simulation and fitting

Prerequisites:

Context

Frequently, studies of variable sources (e.g. decaying GRB light curves, AGN flares, etc.) require time-variable simulations. For most use cases, generating a full event list is overkill, and it suffices to use binned simulations with a temporal model.

Objective: Simulate and fit a time decaying light curve of a source with CTA using the CTA 1DC response

Proposed approach:

We will simulate 10 spectral datasets within given time intervals (Good Time Intervals, GTIs) following a given spectral model (a power law) and temporal profile (an exponential decay with a decay time of 6 h). These are then analysed using the light curve estimator to obtain flux points. Finally, we re-fit the simulated datasets to recover the injected profiles.

In summary, necessary steps are:

  • Choose observation parameters including a list of gammapy.data.GTI

  • Define temporal and spectral models from the model gallery as per the science case

  • Perform the simulation (in 1D or 3D)

  • Extract the light curve from the reduced dataset as shown in the light curve notebook

  • Optionally, we show how to fit the simulated datasets using a source model

Setup

As usual, we’ll start with some general imports…

[1]:
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord, Angle
from astropy.time import Time
from regions import CircleSkyRegion

import logging

log = logging.getLogger(__name__)

And some gammapy specific imports

[2]:
from gammapy.data import Observation
from gammapy.irf import load_cta_irfs
from gammapy.datasets import SpectrumDataset, Datasets
from gammapy.modeling.models import (
    PowerLawSpectralModel,
    ExpDecayTemporalModel,
    SkyModel,
)
from gammapy.maps import MapAxis
from gammapy.estimators import LightCurveEstimator
from gammapy.makers import SpectrumDatasetMaker
from gammapy.modeling import Fit

Simulating a light curve

We will simulate 10 datasets using a PowerLawSpectralModel and an ExpDecayTemporalModel. The important thing to note here is how to attach a different GTI to each dataset.

[3]:
# Loading IRFs
irfs = load_cta_irfs(
    "$GAMMAPY_DATA/cta-1dc/caldb/data/cta/1dc/bcf/South_z20_50h/irf_file.fits"
)
Invalid unit found in background table! Assuming (s-1 MeV-1 sr-1)
[4]:
# Reconstructed and true energy axis
center = SkyCoord(0.0, 0.0, unit="deg", frame="galactic")
energy_axis = MapAxis.from_edges(
    np.logspace(-0.5, 1.0, 10), unit="TeV", name="energy", interp="log"
)
energy_axis_true = MapAxis.from_edges(
    np.logspace(-1.2, 2.0, 31), unit="TeV", name="energy_true", interp="log"
)

on_region_radius = Angle("0.11 deg")
on_region = CircleSkyRegion(center=center, radius=on_region_radius)
[5]:
# Pointing position
pointing = SkyCoord(0.5, 0.5, unit="deg", frame="galactic")

Note that observations are usually conducted in wobble mode, in which the source is offset from the camera center. This allows one to estimate the background from a symmetric sky position.
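As a quick plain-Python check (not part of the notebook itself), the wobble offset between this pointing and the source position at the Galactic centre can be computed with the spherical law of cosines; it comes out to about 0.7 deg:

```python
import math

# Angular separation between the pointing (l=0.5 deg, b=0.5 deg)
# and the source (l=0, b=0), via the spherical law of cosines.
b1, b2, dl = map(math.radians, (0.0, 0.5, 0.5))
sep = math.degrees(
    math.acos(math.sin(b1) * math.sin(b2) + math.cos(b1) * math.cos(b2) * math.cos(dl))
)
print(round(sep, 4))  # wobble offset of about 0.71 deg
```

The same number is obtained with `pointing.separation(center)` on the SkyCoord objects defined above.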

[6]:
# Define the source model: A combination of spectral and temporal model

gti_t0 = Time("2020-03-01")
spectral_model = PowerLawSpectralModel(
    index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV"
)
temporal_model = ExpDecayTemporalModel(t0="6 h", t_ref=gti_t0.mjd * u.d)

model_simu = SkyModel(
    spectral_model=spectral_model,
    temporal_model=temporal_model,
    name="model-simu",
)
[7]:
# Look at the model
model_simu.parameters.to_table()
[7]:
Table length=5
   name      value        unit        min  max  frozen    error
  index    3.0000e+00                 nan  nan  False   0.000e+00
amplitude  1.0000e-11  cm-2 s-1 TeV-1 nan  nan  False   0.000e+00
reference  1.0000e+00  TeV            nan  nan  True    0.000e+00
     t0    2.5000e-01  d              nan  nan  False   0.000e+00
  t_ref    5.8909e+04  d              nan  nan  True    0.000e+00
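To make the injected profile concrete, here is a small numpy sketch (using the same t0 = 6 h) of the exponential decay factor f(t) = exp(-(t - t_ref) / t0) at a few elapsed times: the flux drops to 1/e after one decay time and to 1/e**2 after two.

```python
import numpy as np

# Injected temporal profile: f(t) = exp(-(t - t_ref) / t0), t0 = 6 h
t0 = 6.0                            # decay time in hours
dt = np.array([0.0, 6.0, 12.0])     # elapsed time since t_ref in hours
norm = np.exp(-dt / t0)
print(norm.round(3))                # [1.    0.368 0.135]
```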

Now, define the start times and observation livetimes with respect to the reference time, gti_t0

[8]:
n_obs = 10
tstart = [1, 2, 3, 5, 8, 10, 20, 22, 23, 24] * u.h
lvtm = [55, 25, 26, 40, 40, 50, 40, 52, 43, 47] * u.min
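
A valid set of GTIs must not overlap; a quick plain-Python sanity check (not part of the notebook) confirms that each observation above ends before the next one starts:

```python
# Sanity check: the chosen start times and livetimes define
# non-overlapping observations, as required for a valid set of GTIs.
tstart_h = [1, 2, 3, 5, 8, 10, 20, 22, 23, 24]                       # hours
lvtm_h = [m / 60 for m in [55, 25, 26, 40, 40, 50, 40, 52, 43, 47]]  # hours
tstop_h = [t + l for t, l in zip(tstart_h, lvtm_h)]
overlap = any(stop > start for stop, start in zip(tstop_h[:-1], tstart_h[1:]))
print(overlap)  # False: each observation ends before the next one starts
```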

Now perform the simulations

[9]:
datasets = Datasets()

empty = SpectrumDataset.create(
    e_reco=energy_axis, e_true=energy_axis_true, region=on_region, name="empty"
)

maker = SpectrumDatasetMaker(selection=["exposure", "background", "edisp"])

for idx in range(n_obs):
    obs = Observation.create(
        pointing=pointing,
        livetime=lvtm[idx],
        tstart=tstart[idx],
        irfs=irfs,
        reference_time=gti_t0,
        obs_id=idx,
    )
    empty_i = empty.copy(name=f"dataset-{idx}")
    dataset = maker.run(empty_i, obs)
    dataset.models = model_simu
    dataset.fake()
    datasets.append(dataset)

The reduced datasets have been successfully simulated. Let’s take a quick look into our datasets.
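Under the hood, dataset.fake() replaces the counts in each bin by one Poisson realisation of the predicted counts. A minimal numpy sketch of that idea, with made-up npred values (the real ones live in dataset.npred()):

```python
import numpy as np

# One Poisson realisation of predicted counts, as dataset.fake() draws.
rng = np.random.default_rng(42)
npred = np.array([805.6, 20.3, 5.1])   # hypothetical predicted counts per bin
counts = rng.poisson(npred)            # simulated integer counts
```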

[10]:
datasets.info_table()
[10]:
Table length=10
   name    counts background  excess  sqrt_ts   npred  npred_background npred_signal exposure_min exposure_max livetime  ontime counts_rate background_rate excess_rate n_bins n_fit_bins stat_type  stat_sum
                                                                                         m2 s        m2 s        s        s        1 / s        1 / s         1 / s
dataset-0    805    20.304   784.696   65.996  825.925      20.304        805.622     2.161e+08    1.603e+10   3300.0   3300.0   2.439e-01    6.153e-03     2.378e-01     9       9        cash    -6456.367
dataset-1    298     9.229   288.771   38.645  332.143       9.229        322.914     9.824e+07    7.284e+09   1500.0   1500.0   1.987e-01    6.153e-03     1.925e-01     9       9        cash    -1797.471
dataset-2    276     9.598   266.402   36.349  293.483       9.598        283.884     1.022e+08    7.576e+09   1560.0   1560.0   1.769e-01    6.153e-03     1.708e-01     9       9        cash    -1559.703
dataset-3    311    14.766   296.234   36.098  321.773      14.766        307.007     1.572e+08    1.165e+10   2400.0   2400.0   1.296e-01    6.153e-03     1.234e-01     9       9        cash    -1883.947
dataset-4    204    14.766   189.234   26.322  200.975      14.766        186.209     1.572e+08    1.165e+10   2400.0   2400.0   8.500e-02    6.153e-03     7.885e-02     9       9        cash    -1084.572
dataset-5    212    18.458   193.542   25.455  182.986      18.458        164.528     1.965e+08    1.457e+10   3000.0   3000.0   7.067e-02    6.153e-03     6.451e-02     9       9        cash    -1148.755
dataset-6     40    14.766    25.234    5.409   39.967      14.766         25.201     1.572e+08    1.165e+10   2400.0   2400.0   1.667e-02    6.153e-03     1.051e-02     9       9        cash      -79.459
dataset-7     31    19.196    11.804    2.471   42.291      19.196         23.094     2.043e+08    1.515e+10   3120.0   3120.0   9.936e-03    6.153e-03     3.783e-03     9       9        cash      -47.659
dataset-8     29    15.874    13.126    2.950   32.238      15.874         16.364     1.690e+08    1.253e+10   2580.0   2580.0   1.124e-02    6.153e-03     5.088e-03     9       9        cash      -56.278
dataset-9     38    17.350    20.650    4.276   32.409      17.350         15.059     1.847e+08    1.369e+10   2820.0   2820.0   1.348e-02    6.153e-03     7.323e-03     9       9        cash      -81.451

Extract the light curve

This section uses standard light curve estimation tools for a 1D extraction. Only a spectral model needs to be defined in this case. Since the estimator returns the integrated flux separately for each time bin, the temporal model need not be accounted for at this stage.
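For reference, the integrated flux of the injected power law over the band used by the estimator can be computed analytically. A self-contained sketch (the function name is ours, not a gammapy API):

```python
# Integral of a power law dN/dE = amplitude * (E / reference)**(-index)
# between emin and emax (valid for index != 1).
def integral_flux(amplitude, index, reference, emin, emax):
    p = 1.0 - index
    return amplitude * reference / p * ((emax / reference) ** p - (emin / reference) ** p)

# Injected flux at t = t_ref over the band used above, 10**-0.5 to 10 TeV:
f = integral_flux(1e-11, 3.0, 1.0, 10 ** -0.5, 10.0)
print(f)  # about 5e-11 cm-2 s-1
```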

[11]:
# Define the model:
spectral_model = PowerLawSpectralModel(
    index=3, amplitude="1e-11 cm-2 s-1 TeV-1", reference="1 TeV"
)
model_fit = SkyModel(spectral_model=spectral_model, name="model-fit")
[12]:
# Attach model to each dataset
for dataset in datasets:
    dataset.models = model_fit
[13]:
%%time
lc_maker_1d = LightCurveEstimator(
    energy_edges=[energy_axis.edges[0], energy_axis.edges[-1]],
    source="model-fit",
)
lc_1d = lc_maker_1d.run(datasets)
CPU times: user 4.93 s, sys: 50.7 ms, total: 4.98 s
Wall time: 5.05 s
[14]:
lc_1d.table["is_ul"] = lc_1d.table["ts"] < 1
[15]:
ax = lc_1d.plot(marker="o", label="1D")
../_images/tutorials_light_curve_simulation_24_0.png

We have the reconstructed light curve at this point. Further standard analysis might involve modelling the temporal profiles with an analytical or theoretical model. You may do this using your favourite fitting package, one possible option being curve_fit inside scipy.optimize.
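As an illustration of the curve_fit route, here is a sketch fitting an exponential decay to hypothetical, noise-free flux points that follow the injected 6 h decay; in practice you would pass the flux values and errors from lc_1d.table instead:

```python
import numpy as np
from scipy.optimize import curve_fit

def expdecay(t, f0, t0):
    """Exponential decay: flux f0 at t = 0, decay time t0 (hours)."""
    return f0 * np.exp(-t / t0)

# Hypothetical (noise-free) flux points following the injected 6 h decay
t = np.array([1.5, 2.2, 3.2, 5.3, 8.3, 10.4, 20.3, 22.4, 23.4, 24.4])
flux = 5e-11 * np.exp(-t / 6.0)

popt, pcov = curve_fit(expdecay, t, flux, p0=(1e-11, 10.0))
```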

In the next section, we show how to simultaneously fit all the datasets using a given temporal model. This performs a joint fit across the different datasets, minimising over the spectral and temporal model parameters together. We will fit the amplitude, the spectral index and the decay time scale. Note that t_ref is frozen by default for the ExpDecayTemporalModel.

For modelling and fitting more complex flares, you should attach the relevant model to each group of datasets. The parameters of a model in a given group of datasets will be tied. For more details on joint fitting in gammapy, see here.

Fit the datasets

[16]:
# Define the model:
spectral_model1 = PowerLawSpectralModel(
    index=2.0, amplitude="1e-12 cm-2 s-1 TeV-1", reference="1 TeV"
)
temporal_model1 = ExpDecayTemporalModel(t0="10 h", t_ref=gti_t0.mjd * u.d)

model = SkyModel(
    spectral_model=spectral_model1,
    temporal_model=temporal_model1,
    name="model-test",
)
[17]:
model.parameters.to_table()
[17]:
Table length=5
   name      value        unit        min  max  frozen    error
  index    2.0000e+00                 nan  nan  False   0.000e+00
amplitude  1.0000e-12  cm-2 s-1 TeV-1 nan  nan  False   0.000e+00
reference  1.0000e+00  TeV            nan  nan  True    0.000e+00
     t0    4.1667e-01  d              nan  nan  False   0.000e+00
  t_ref    5.8909e+04  d              nan  nan  True    0.000e+00
[18]:
datasets.models = model
[19]:
%%time
# Do a joint fit
fit = Fit(datasets)
result = fit.run()
CPU times: user 8.73 s, sys: 83.1 ms, total: 8.82 s
Wall time: 9.11 s
[20]:
result.parameters.to_table()
[20]:
Table length=5
   name      value        unit        min  max  frozen    error
  index    2.9993e+00                 nan  nan  False   3.228e-02
amplitude  9.4745e-12  cm-2 s-1 TeV-1 nan  nan  False   3.295e-13
reference  1.0000e+00  TeV            nan  nan  True    0.000e+00
     t0    2.5955e-01  d              nan  nan  False   8.852e-03
  t_ref    5.8909e+04  d              nan  nan  True    0.000e+00

We see that the fitted parameters match well with the simulated ones!
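One way to quantify the agreement, using the numbers from the fit table above, is the pull of each free parameter, i.e. the deviation from the injected value in units of the fit error:

```python
# Pull = (fitted - injected) / error for each free parameter
injected = {"index": 3.0, "amplitude": 1e-11, "t0": 0.25}           # t0 in days
fitted = {"index": 2.9993, "amplitude": 9.4745e-12, "t0": 0.25955}
error = {"index": 3.228e-2, "amplitude": 3.295e-13, "t0": 8.852e-3}

pulls = {name: (fitted[name] - injected[name]) / error[name] for name in injected}
print(pulls)  # all parameters recovered within about 2 sigma
```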

Exercises

  1. Re-do the analysis with MapDataset instead of SpectrumDataset

  2. Model the flare of PKS 2155-304, which you obtained using the light curve flare tutorial. Use a combination of Gaussian and exponential flare profiles, and fit using scipy.optimize.curve_fit

  3. Do a joint fitting of the datasets.

[ ]: