{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "
\n", "\n", "**This is a fixed-text formatted version of a Jupyter notebook**\n", "\n", "- Try online [![Binder](https://static.mybinder.org/badge.svg)](https://mybinder.org/v2/gh/gammapy/gammapy-webpage/v0.17?urlpath=lab/tree/modeling_2D.ipynb)\n", "- You can contribute with your own notebooks in this\n", "[GitHub repository](https://github.com/gammapy/gammapy/tree/master/tutorials).\n", "- **Source files:**\n", "[modeling_2D.ipynb](../_static/notebooks/modeling_2D.ipynb) |\n", "[modeling_2D.py](../_static/notebooks/modeling_2D.py)\n", "
\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Modeling and fitting 2D images using Gammapy\n", "\n", "## Prerequisites:\n", " - To understand how a generel modelling and fiiting works in gammapy, please refer to the [analysis_3d tutorial](analysis_3d.ipynb)\n", "\n", "## Context:\n", "We often want the determine the position and morphology of an object. To do so, we don't necessarily have to resort to a full 3D fitting but can perform a simple image fitting, in particular, in an energy range where the PSF does not vary strongly, or if we want to explore a possible energy dependance of the morphology.\n", "\n", "\n", "## Objective: \n", "To localize a source and/or constrain its morphology.\n", "\n", "## Proposed approach:\n", "\n", "The first step here, as in most analysis with DL3 data, is to create reduced datasets. For this, we will use the `Analysis` class to create a single set of stacked maps with a single bin in energy (thus, an *image* which behaves as a *cube*). This, we will then model with a spatial model of our choice, while keeping the spectral model fixed to an integrated power law." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Setup\n", "As usual, we'll start with some general imports..." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", "import astropy.units as u\n", "import numpy as np\n", "from astropy.coordinates import SkyCoord\n", "from astropy.time import Time\n", "from regions import CircleSkyRegion\n", "from astropy.coordinates import Angle\n", "\n", "import logging\n", "\n", "log = logging.getLogger(__name__)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's import gammapy specific classes and functions" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "from gammapy.analysis import Analysis, AnalysisConfig" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Creating the config file" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, we create a config file for out analysis. You may load this from disc if you have a pre-defined config file.\n", "\n", "Here, we use 3 simulated CTA runs of the galactic center." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "config = AnalysisConfig()\n", "# Selecting the observations\n", "config.observations.datastore = \"$GAMMAPY_DATA/cta-1dc/index/gps/\"\n", "config.observations.obs_ids = [110380, 111140, 111159]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Technically, gammapy implements 2D analysis as a special case of 3D analysis (one one bin in energy). So, we must specify the type of analysis as *3D*, and define the geometry of the analysis." 
] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "config.datasets.type = \"3d\"\n", "config.datasets.geom.wcs.skydir = {\n", " \"lon\": \"0 deg\",\n", " \"lat\": \"0 deg\",\n", " \"frame\": \"galactic\",\n", "} # The WCS geometry - centered on the galactic center\n", "config.datasets.geom.wcs.fov = {\"width\": \"10 deg\", \"height\": \"8 deg\"}\n", "config.datasets.geom.wcs.binsize = \"0.02 deg\"\n", "\n", "# The FoV radius to use for cutouts\n", "config.datasets.geom.selection.offset_max = 3.5 * u.deg\n", "\n", "# We now fix the energy axis for the counts map - (the reconstructed energy binning)\n", "config.datasets.geom.axes.energy.min = \"0.1 TeV\"\n", "config.datasets.geom.axes.energy.max = \"10 TeV\"\n", "config.datasets.geom.axes.energy.nbins = 1\n", "\n", "config.datasets.geom.wcs.binsize_irf = 0.2 * u.deg" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "AnalysisConfig\n", "\n", " general:\n", " log: {level: info, filename: null, filemode: null, format: null, datefmt: null}\n", " outdir: .\n", " observations:\n", " datastore: $GAMMAPY_DATA/cta-1dc/index/gps\n", " obs_ids: [110380, 111140, 111159]\n", " obs_file: null\n", " obs_cone: {frame: null, lon: null, lat: null, radius: null}\n", " obs_time: {start: null, stop: null}\n", " datasets:\n", " type: 3d\n", " stack: true\n", " geom:\n", " wcs:\n", " skydir: {frame: galactic, lon: 0.0 deg, lat: 0.0 deg}\n", " binsize: 0.02 deg\n", " fov: {width: 10.0 deg, height: 8.0 deg}\n", " binsize_irf: 0.2 deg\n", " selection: {offset_max: 3.5 deg}\n", " axes:\n", " energy: {min: 0.1 TeV, max: 10.0 TeV, nbins: 1}\n", " energy_true: {min: 0.1 TeV, max: 10.0 TeV, nbins: 30}\n", " map_selection: [counts, exposure, background, psf, edisp]\n", " background:\n", " method: null\n", " exclusion: null\n", " parameters: {}\n", " safe_mask:\n", " methods: [aeff-default]\n", " parameters: {}\n", " on_region: {frame: null, lon: null, lat: null, radius: null}\n", " containment_correction: true\n", " fit:\n", " fit_range: {min: 0.1 TeV, max: 10.0 TeV}\n", " flux_points:\n", " energy: {min: 0.1 TeV, max: 10.0 TeV, nbins: 30}\n", " source: source\n", " parameters: {}\n", " \n" ] } ], "source": [ "print(config)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Getting the reduced dataset" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We now use the config file and create a single `MapDataset` containing `counts`, `background`, `exposure`, `psf` and `edisp` maps." ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Setting logging config: {'level': 'INFO', 'filename': None, 'filemode': None, 'format': None, 'datefmt': None}\n", "Fetching observations.\n", "Number of selected observations: 3\n", "Creating geometry.\n", "Creating datasets.\n", "No background maker set for 3d analysis. 
Check configuration.\n", "Processing observation 110380\n", "No thresholds defined for obs Observation\n", "\n", "\tobs id : 110380 \n", " \ttstart : 59235.50\n", "\ttstop : 59235.52\n", "\tduration : 1800.00 s\n", "\tpointing (icrs) : 267.7 deg, -29.6 deg\n", "\n", "\tdeadtime fraction : 2.0%\n", "\n", "Processing observation 111140\n", "No thresholds defined for obs Observation\n", "\n", "\tobs id : 111140 \n", " \ttstart : 59275.50\n", "\ttstop : 59275.52\n", "\tduration : 1800.00 s\n", "\tpointing (icrs) : 264.2 deg, -29.5 deg\n", "\n", "\tdeadtime fraction : 2.0%\n", "\n", "Processing observation 111159\n", "No thresholds defined for obs Observation\n", "\n", "\tobs id : 111159 \n", " \ttstart : 59276.50\n", "\ttstop : 59276.52\n", "\tduration : 1800.00 s\n", "\tpointing (icrs) : 266.0 deg, -27.0 deg\n", "\n", "\tdeadtime fraction : 2.0%\n", "\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "CPU times: user 9.32 s, sys: 3.22 s, total: 12.5 s\n", "Wall time: 12.8 s\n" ] } ], "source": [ "%%time\n", "analysis = Analysis(config)\n", "analysis.get_observations()\n", "analysis.get_datasets()" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "MapDataset\n", "----------\n", "\n", " Name : stacked \n", "\n", " Total counts : 143121 \n", " Total predicted counts : 125042.02\n", " Total background counts : 125042.02\n", "\n", " Exposure min : 0.00e+00 m2 s\n", " Exposure max : 1.87e+10 m2 s\n", "\n", " Number of total bins : 200000 \n", " Number of fit bins : 186500 \n", "\n", " Fit statistic type : cash\n", " Fit statistic value (-2 log(L)) : 294443.11\n", "\n", " Number of models : 1 \n", " Number of parameters : 3\n", " Number of free parameters : 1\n", "\n", " Component 0: BackgroundModel\n", " \n", " Name : stacked-bkg\n", " Datasets names : ['stacked']\n", " Parameters:\n", " norm : 1.000 \n", " tilt (frozen) : 0.000 \n", " reference (frozen) : 1.000 TeV \n", " \n", " \n" ] } ], "source": [ "print(analysis.datasets[\"stacked\"])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The counts and background maps have only one bin in reconstructed energy. The exposure and IRF maps are in true energy, and hence, have a different binning based upon the binning of the IRFs. We need not bother about them presently." 
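, "\n", "\n", "If you want to check this explicitly, you can compare the energy axes of the maps. A minimal sketch (assuming, as in the printouts below, that the single non-spatial axis of each geometry is the energy axis):\n", "\n", "```python\n", "# Reconstructed-energy axis of the counts map: a single bin from 0.1 to 10 TeV\n", "counts_axis = analysis.datasets[\"stacked\"].counts.geom.axes[0]\n", "print(counts_axis.nbin, counts_axis.edges)\n", "\n", "# True-energy axis of the exposure map: the finer binning used for the IRFs\n", "exposure_axis = analysis.datasets[\"stacked\"].exposure.geom.axes[0]\n", "print(exposure_axis.nbin, exposure_axis.edges)\n", "```"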
] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "WcsNDMap\n", "\n", "\tgeom : WcsGeom \n", " \taxes : ['lon', 'lat', 'energy']\n", "\tshape : (500, 400, 1)\n", "\tndim : 3\n", "\tunit : \n", "\tdtype : float32\n", "\n" ] } ], "source": [ "print(analysis.datasets[\"stacked\"].counts)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "WcsNDMap\n", "\n", "\tgeom : WcsGeom \n", " \taxes : ['lon', 'lat', 'energy']\n", "\tshape : (500, 400, 1)\n", "\tndim : 3\n", "\tunit : \n", "\tdtype : float64\n", "\n" ] } ], "source": [ "print(analysis.datasets[\"stacked\"].background_model.map)" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "WcsNDMap\n", "\n", "\tgeom : WcsGeom \n", " \taxes : ['lon', 'lat', 'energy_true']\n", "\tshape : (500, 400, 30)\n", "\tndim : 3\n", "\tunit : m2 s\n", "\tdtype : float32\n", "\n" ] } ], "source": [ "print(analysis.datasets[\"stacked\"].exposure)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Modelling\n", "\n", "Now, we define a model to be fitted to the dataset. **The important thing to note here is the dummy spectral model - an integrated powerlaw with only free normalisation**. Here, we use its YAML definition to load it:" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [], "source": [ "model_config = \"\"\"\n", "components:\n", "- name: GC-1\n", " type: SkyModel\n", " spatial:\n", " type: PointSpatialModel\n", " frame: galactic\n", " parameters:\n", " - name: lon_0\n", " value: 0.02\n", " unit: deg\n", " - name: lat_0 \n", " value: 0.01 \n", " unit: deg\n", " spectral:\n", " type: PowerLaw2SpectralModel\n", " parameters:\n", " - name: amplitude \n", " value: 1.0e-12\n", " unit: cm-2 s-1 \n", " - name: index\n", " value: 2.0\n", " unit: ''\n", " frozen: true\n", " - name: emin\n", " value: 0.1\n", " unit: TeV\n", " frozen: true\n", " - name: emax\n", " value: 10.0\n", " unit: TeV\n", " frozen: true \n", "\"\"\"" ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Reading model.\n", "Models\n", "\n", "Component 0: SkyModel\n", "\n", " Name : GC-1\n", " Datasets names : None\n", " Spectral model type : PowerLaw2SpectralModel\n", " Spatial model type : PointSpatialModel\n", " Temporal model type : None\n", " Parameters:\n", " amplitude : 1.00e-12 1 / (cm2 s) \n", " index (frozen) : 2.000 \n", " emin (frozen) : 0.100 TeV \n", " emax (frozen) : 10.000 TeV \n", " lon_0 : 0.020 deg \n", " lat_0 : 0.010 deg \n", "\n", "\n" ] } ], "source": [ "analysis.set_models(model_config)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We will freeze the parameters of the background" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "analysis.datasets[\"stacked\"].background_model.parameters[\"norm\"].frozen = True\n", "analysis.datasets[\"stacked\"].background_model.parameters[\"tilt\"].frozen = True" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "Fitting datasets.\n", "OptimizeResult\n", "\n", "\tbackend : minuit\n", "\tmethod : minuit\n", "\tsuccess : False\n", "\tmessage : Optimization failed.\n", "\tnfev : 24\n", "\ttotal stat : 0.00\n", "\n" ] } ], 
"source": [ "# To run the fit\n", "analysis.run_fit()" ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "data": { "text/html": [ "Table length=9\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "
<table><thead><tr><th>name</th><th>value</th><th>unit</th><th>min</th><th>max</th><th>frozen</th><th>error</th></tr></thead><thead><tr><th>str9</th><th>float64</th><th>str8</th><th>float64</th><th>float64</th><th>bool</th><th>float64</th></tr></thead><tr><td>norm</td><td>1.000e+00</td><td></td><td>0.000e+00</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>tilt</td><td>0.000e+00</td><td></td><td>nan</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>reference</td><td>1.000e+00</td><td>TeV</td><td>nan</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>amplitude</td><td>1.000e-12</td><td>cm-2 s-1</td><td>nan</td><td>nan</td><td>False</td><td>nan</td></tr><tr><td>index</td><td>2.000e+00</td><td></td><td>nan</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>emin</td><td>1.000e-01</td><td>TeV</td><td>nan</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>emax</td><td>1.000e+01</td><td>TeV</td><td>nan</td><td>nan</td><td>True</td><td>0.000e+00</td></tr><tr><td>lon_0</td><td>2.000e-02</td><td>deg</td><td>nan</td><td>nan</td><td>False</td><td>nan</td></tr><tr><td>lat_0</td><td>1.000e-02</td><td>deg</td><td>nan</td><td>nan</td><td>False</td><td>nan</td></tr></table>
" ], "text/plain": [ "\n", " name value unit min max frozen error \n", " str9 float64 str8 float64 float64 bool float64 \n", "--------- --------- -------- --------- ------- ------ ---------\n", " norm 1.000e+00 0.000e+00 nan True 0.000e+00\n", " tilt 0.000e+00 nan nan True 0.000e+00\n", "reference 1.000e+00 TeV nan nan True 0.000e+00\n", "amplitude 1.000e-12 cm-2 s-1 nan nan False nan\n", " index 2.000e+00 nan nan True 0.000e+00\n", " emin 1.000e-01 TeV nan nan True 0.000e+00\n", " emax 1.000e+01 TeV nan nan True 0.000e+00\n", " lon_0 2.000e-02 deg nan nan False nan\n", " lat_0 1.000e-02 deg nan nan False nan" ] }, "execution_count": 15, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# To see the best fit values along with the errors\n", "analysis.fit_result.parameters.to_table()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.0" }, "nbsphinx": { "orphan": true } }, "nbformat": 4, "nbformat_minor": 2 }