Working with conda?
Gammapy can be installed with Anaconda or Miniconda:
$ conda install -c conda-forge gammapy
Gammapy can be installed via pip from PyPI.
$ pip install gammapy
Update existing version? Working with virtual environments? Installing a specific version? Check the advanced installation page.
The best way to get started and learn Gammapy is the Tutorials. For convenience we provide a pre-defined conda environment file, so you can get additional useful packages together with Gammapy in an isolated virtual environment. First install Miniconda and then just execute the following commands in the terminal:
$ curl -O https://gammapy.org/download/install/gammapy-X.Y.Z-environment.yml
$ conda env create -f gammapy-X.Y.Z-environment.yml
On Windows, you have to open up the conda environment file and delete the healpy entry, as this optional dependency is currently not available on Windows.
For Apple silicon M1 (arm64) architectures you also have to open the environment file and delete the sherpa entry, as currently there are no conda packages available for this platform. However, you can later install Sherpa in the environment using python -m pip install sherpa.
Once the environment has been created you can activate it using:
$ conda activate gammapy-X.Y.Z
You can now proceed to download the Gammapy tutorial notebooks and the example datasets. The total download size is ~180 MB. Select the location where you want to install the datasets and proceed with the following commands:
$ gammapy download notebooks
$ gammapy download datasets
$ conda env config vars set GAMMAPY_DATA=$PWD/gammapy-datasets/X.Y.Z
$ conda activate gammapy-X.Y.Z
The last two conda commands define the GAMMAPY_DATA environment variable within the conda environment. Alternatively, you can define the variable directly in your shell with:
$ export GAMMAPY_DATA=$PWD/gammapy-datasets/X.Y.Z
If you are not using the bash shell, handling of shell environment variables might be different; e.g. in some shells the command to use is set or something else instead of export, and the profile setup file will also be different.
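Gammapy reads the GAMMAPY_DATA environment variable at runtime to locate the example datasets. As a quick sanity check you can verify the variable is set and points at an existing directory; the helper below is a minimal sketch using only the standard library (the function name check_gammapy_data is ours, not a Gammapy API):

```python
import os
from pathlib import Path


def check_gammapy_data() -> Path:
    """Return the dataset path from GAMMAPY_DATA, or raise a helpful error.

    Hypothetical helper for illustration; the tutorials fail with a similar
    message if GAMMAPY_DATA is missing or wrong.
    """
    value = os.environ.get("GAMMAPY_DATA")
    if value is None:
        raise EnvironmentError(
            "GAMMAPY_DATA is not set. Run e.g. "
            "'export GAMMAPY_DATA=$PWD/gammapy-datasets/X.Y.Z' first."
        )
    path = Path(value)
    if not path.is_dir():
        raise FileNotFoundError(
            f"GAMMAPY_DATA points to a missing directory: {path}"
        )
    return path


# Uncomment after setting the variable:
# data_path = check_gammapy_data()
```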
On Windows, you should set the GAMMAPY_DATA environment variable in the “Environment Variables” settings dialog.
Finally, start a notebook server by executing:
$ cd notebooks
$ jupyter notebook
Gammapy can read and access data from multiple gamma-ray instruments. Data from Imaging Atmospheric Cherenkov Telescopes, such as CTA, H.E.S.S., MAGIC and VERITAS, is typically accessed at the event-list data level, called “DL3”. This is most easily done using the DataStore class. Data can also be accessed at the level of binned events and pre-reduced instrument response functions, the so-called “DL4” level. This is typically the case for Fermi-LAT data or data from Water Cherenkov Observatories. Such data can be read directly into Gammapy's dataset objects.
Gammapy lets you create a 1D spectrum by defining an analysis region in the sky and an energy binning. The events and instrument response are binned into a spectrum dataset, with the responses stored as IRFMap objects. In addition you can choose to estimate the background from data using e.g. the reflected-regions method. Flux points can then be computed from the reduced dataset.
Gammapy lets you perform a combined spectral and spatial analysis as well. This is sometimes referred to as a “cube analysis”. Based on the 3D data reduction, Gammapy can also simulate events. Flux points can again be computed from the reduced dataset.
Gammapy allows you to compute light curves in various ways. Light curves can be computed for a 1D or 3D analysis scenario (see above) by either grouping or splitting the DL3 data into multiple time intervals. Grouping multiple observations allows you to compute e.g. monthly or nightly light curves, while splitting a single observation allows you to compute light curves for flares. You can also compute light curves in multiple energy bands. In all cases the light curve is computed from the reduced datasets in the chosen time intervals.
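Splitting a time range into intervals is the core of this workflow. As a toy illustration in plain Python (not a Gammapy API; in practice you would pass astropy Time intervals to the light-curve estimation step), nightly intervals could be generated like this:

```python
from datetime import datetime, timedelta


def nightly_intervals(start: datetime, stop: datetime) -> list[tuple[datetime, datetime]]:
    """Split [start, stop] into consecutive 24-hour intervals.

    Toy helper for illustration only; Gammapy works with astropy Time
    objects and GTIs rather than datetime.
    """
    intervals = []
    t0 = start
    while t0 < stop:
        t1 = min(t0 + timedelta(days=1), stop)
        intervals.append((t0, t1))
        t0 = t1
    return intervals


# Three nights of data -> three time intervals
bins = nightly_intervals(datetime(2024, 1, 1), datetime(2024, 1, 4))
```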
Gammapy offers the possibility to combine data from multiple instruments in a “joint-likelihood” fit. This can be done at multiple data levels and independently of the dimensionality of the data: Gammapy can handle 1D and 3D datasets at the same time, and can also include e.g. flux points in a combined likelihood fit.
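The idea behind a joint-likelihood fit is simply that the total fit statistic is the sum of the per-dataset statistics, so datasets of different dimensionality can constrain the same shared parameters. A toy illustration in plain Python (not the Gammapy API; in Gammapy the datasets are collected in a container and fitted together):

```python
def joint_stat(parameter: float, datasets: list) -> float:
    """Sum the per-dataset fit statistics for a shared parameter value.

    Toy example: each "dataset" contributes its own stat function,
    mimicking how a joint likelihood sums contributions per dataset.
    """
    return sum(ds["stat"](parameter) for ds in datasets)


# Two mock "datasets" whose statistics prefer different parameter values
datasets = [
    {"name": "1d-spectrum", "stat": lambda p: (p - 1.0) ** 2},
    {"name": "3d-map", "stat": lambda p: 2.0 * (p - 2.0) ** 2},
]

# Scan the shared parameter; the joint minimum is a compromise between
# the two individual minima at 1.0 and 2.0
best = min((joint_stat(p / 100, datasets), p / 100) for p in range(0, 300))
```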