Merge pull request #503 from asogaard/update-citation
Update citations and README for v1.0
asogaard authored May 12, 2023
2 parents b420f57 + 8d3e61b commit 32d4f8c
Showing 5 changed files with 133 additions and 74 deletions.
33 changes: 26 additions & 7 deletions CITATION.cff
@@ -1,38 +1,57 @@
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: GraphNeT
doi: 10.5281/zenodo.7152630
doi: 10.5281/zenodo.6720188
authors:
- family-names: "Søgaard"
given-names: "Andreas"
orcid: "https://orcid.org/0000-0002-0823-056X"
- family-names: "F. Ørsøe"
given-names: "Rasmus"
orcid: "https://orcid.org/0000-0001-8890-4124"
- family-names: "Bozianu"
given-names: "Leon"
orcid: "https://orcid.org/0000-0002-1243-9980"
- family-names: "Holm"
given-names: "Morten"
orcid: "https://orcid.org/0000-0003-1383-2810"
- family-names: "Bozianu"
given-names: "Leon"
orcid: "https://orcid.org/0000-0002-1243-9980"
- family-names: "Rosted"
given-names: "Aske"
orcid: "https://orcid.org/0000-0003-2410-400X"
- family-names: "C. Petersen"
given-names: "Troels"
orcid: "https://orcid.org/0000-0003-0221-3037"
- family-names: "E. Iversen"
- family-names: "Endrup Iversen"
given-names: "Kaare"
orcid: "https://orcid.org/0000-0001-6533-4085"
- family-names: "Hermansen"
given-names: "Andreas"
orcid: "https://orcid.org/0009-0006-1162-9770"
- family-names: "Guggenmos"
given-names: "Tim"
- family-names: "Andresen"
given-names: "Peter"
orcid: "https://orcid.org/0009-0008-5759-0490"
- family-names: "Ha Minh"
given-names: "Martin"
orcid: "https://orcid.org/0000-0001-7776-4875"
- family-names: "Neste"
given-names: "Ludwig"
orcid: "https://orcid.org/0000-0002-4829-3469"
- family-names: "Holmes"
given-names: "Moust"
orcid: "https://orcid.org/0009-0000-8530-7041"
- family-names: "Pontén"
given-names: "Axel"
orcid: "https://orcid.org/0009-0008-2463-2930"
- family-names: "Leonard DeHolton"
given-names: "Kayla"
orcid: "https://orcid.org/0000-0002-8795-0601"
- family-names: "Eller"
given-names: "Philipp"
orcid: "https://orcid.org/0000-0001-6354-5209"
version: 0.2.2
date-released: 2022-10-06
version: 1.0.0
date-released: 2023-05-12
url: "https://github.com/graphnet-team/graphnet"
license: Apache-2.0
type: software
21 changes: 20 additions & 1 deletion CONTRIBUTING.md
@@ -49,4 +49,23 @@ From "Software Best Practices Effective Version Control", Alex Olivas, IceCube B
* Bad commit message: `"blerg"`

Others:
* Keep backward compatibility in mind when you change code.
* Keep backward compatibility in mind when you change code.

## Experiment tracking

We're using [Weights & Biases](https://wandb.ai/) (W&B) to track the results — i.e. losses, metrics, and model artifacts — of training runs as a means to track model experimentation and streamline optimisation. To authenticate with W&B, sign up on the website and run the following in your terminal after having installed this package:
```bash
$ wandb login
```
You can use your own, personal projects on W&B, but for projects of common interest you are encouraged to join the `graphnet-team` team on W&B [here](https://wandb.ai/graphnet-team), create new projects for your specific use cases, and log your runs there. Just ask [@asogaard](https://github.com/asogaard) for an invite to the team!

If you don't want to use W&B and/or only want to log run data locally, you can run:
```bash
$ wandb offline
```
If you change your mind, it's as simple as:
```bash
$ wandb online
```

The [examples/04_training/01_train_model.py](examples/04_training/01_train_model.py) script shows how to train a model and log the results to W&B.
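
A typical offline-first workflow might then look as follows. This is a minimal sketch, not taken from the repository: it assumes W&B's default behaviour of storing offline runs under a local `wandb/` directory, and the training script invocation is only an illustration (it may require command-line arguments).
```bash
$ wandb offline                                  # store run data locally under ./wandb/ only
$ python examples/04_training/01_train_model.py  # illustrative; any script that logs via W&B
$ wandb online                                   # re-enable uploading for future runs
$ wandb sync wandb/offline-run-*                 # upload the locally stored offline runs
```
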
34 changes: 14 additions & 20 deletions README.md
@@ -1,17 +1,30 @@
<center>

![logo](./assets/identity/graphnet-logo-and-wordmark.png)

| Usage | Development |
| --- | --- |
| [![Slack](https://img.shields.io/badge/slack-4A154B.svg?logo=slack)](https://join.slack.com/t/graphnet-team/signup) | ![build](https://github.com/graphnet-team/graphnet/actions/workflows/build.yml/badge.svg) |
| [![status](https://joss.theoj.org/papers/eecab02fb1ecd174a5273750c1ea0baf/status.svg)](https://joss.theoj.org/papers/eecab02fb1ecd174a5273750c1ea0baf) | ![build](https://github.com/graphnet-team/graphnet/actions/workflows/build.yml/badge.svg) |
| [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.6720188.svg)](https://doi.org/10.5281/zenodo.6720188) | ![code-quality](https://github.com/graphnet-team/graphnet/actions/workflows/code-quality.yml/badge.svg) |
| [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) | [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black) |
| ![Supported python versions](https://img.shields.io/badge/python-3.7%20%7C%203.8%20%7C%203.9%20%7C%203.10-blue) | [![Maintainability](https://api.codeclimate.com/v1/badges/b273a774112e32643162/maintainability)](https://codeclimate.com/github/graphnet-team/graphnet/maintainability) |
| [![Docker image](https://img.shields.io/docker/v/asogaard/graphnet?color=blue&logo=docker&sort=semver)](https://hub.docker.com/repository/docker/asogaard/graphnet) | [![Test Coverage](https://api.codeclimate.com/v1/badges/b273a774112e32643162/test_coverage)](https://codeclimate.com/github/graphnet-team/graphnet/test_coverage) |

</center>

## :rocket: About

**GraphNeT** is an open-source Python framework aimed at providing high quality, user friendly, end-to-end functionality to perform reconstruction tasks at neutrino telescopes using graph neural networks (GNNs). GraphNeT makes it fast and easy to train complex models that can provide event reconstruction with state-of-the-art performance, for arbitrary detector configurations, with inference times that are orders of magnitude faster than traditional reconstruction techniques.

Feel free to join the [GraphNeT Slack group](https://join.slack.com/t/graphnet-team/signup)!

### Publications using GraphNeT

| Type | Title | DOI |
| --- | --- | --- |
| Paper | GraphNeT: Graph neural networks for neutrino telescope event reconstruction | [![status](https://joss.theoj.org/papers/eecab02fb1ecd174a5273750c1ea0baf/status.svg)](https://joss.theoj.org/papers/eecab02fb1ecd174a5273750c1ea0baf) |
| Paper | Graph Neural Networks for low-energy event classification & reconstruction in IceCube | [![JINST](https://img.shields.io/badge/JINST-10.1088%2F1748--0221%2F17%2F11%2FP11003-blue)](https://doi.org/10.1088/1748-0221/17/11/P11003) |

## :gear: Install

We recommend installing `graphnet` in a separate environment, e.g. using a Python virtual environment or Anaconda (see details on installation [here](https://www.anaconda.com/products/individual)). Below we provide installation instructions for different setups.
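
As an illustration, a minimal from-source setup in a virtual environment might look like the sketch below. It is only a sketch: the detailed instructions in the remainder of the README (collapsed in this diff, e.g. hardware-specific PyTorch requirements and optional extras) are authoritative and should be followed where they differ.
```bash
# Illustrative sketch: install graphnet from source into a fresh virtual environment.
# Dependency resolution (in particular PyTorch and PyTorch Geometric) may require
# the hardware-specific requirement files described in the full install instructions.
$ git clone https://github.com/graphnet-team/graphnet.git
$ cd graphnet
$ python -m venv graphnet-env
$ source graphnet-env/bin/activate
$ pip install --upgrade pip
$ pip install -e .
```
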
@@ -169,25 +182,6 @@ To make sure that the process of contributing is as smooth and effective as poss
In short, everyone who wants to contribute to this project is more than welcome to do so! Contributions are handled through pull requests, which should be linked to a [GitHub issue](https://github.com/graphnet-team/graphnet/issues) describing the feature to be added or bug to be fixed. Pull requests will be reviewed by the project maintainers and merged into the main branch when accepted.


## :test_tube: Experiment tracking

We're using [Weights & Biases](https://wandb.ai/) (W&B) to track the results — i.e. losses, metrics, and model artifacts — of training runs as a means to track model experimentation and streamline optimisation. To authenticate with W&B, sign up on the website and run the following in your terminal after having installed this package:
```bash
$ wandb login
```
You can use your own, personal projects on W&B, but for projects of common interest you are encouraged to join the `graphnet-team` team on W&B [here](https://wandb.ai/graphnet-team), create new projects for your specific use cases, and log your runs there. Just ask [@asogaard](https://github.com/asogaard) for an invite to the team!

If you don't want to use W&B and/or only want to log run data locally, you can run:
```bash
$ wandb offline
```
If you change your mind, it's as simple as:
```bash
$ wandb online
```

The [examples/04_training/01_train_model.py](examples/04_training/01_train_model.py) script shows how to train a model and log the results to W&B.

## :memo: License

GraphNeT has an Apache 2.0 license, as found in the [LICENSE](LICENSE) file.
25 changes: 10 additions & 15 deletions paper/paper.bib
@@ -1,19 +1,13 @@
// GraphNeT Zenodo
@software{graphnet_zenodo:2022,
author = {Andreas Søgaard and
Rasmus F. Ørsøe and
Leon Bozianu and
Morten Holm and
Kaare Endrup Iversen and
Tim Guggenmos and
Martin Ha Minh and
Philipp Eller},
title = {GraphNeT},
month = jun,
year = 2022,
publisher = {Zenodo},
doi = {10.5281/zenodo.6720188},
url = {https://doi.org/10.5281/zenodo.6720188}
@software{Sogaard_GraphNeT_2023,
author = {Søgaard, Andreas and F. Ørsøe, Rasmus and Holm, Morten and Bozianu, Leon and Rosted, Aske and C. Petersen, Troels and Endrup Iversen, Kaare and Hermansen, Andreas and Guggenmos, Tim and Andresen, Peter and Ha Minh, Martin and Neste, Ludwig and Holmes, Moust and Pontén, Axel and Leonard DeHolton, Kayla and Eller, Philipp},
doi = {10.5281/zenodo.6720188},
license = {Apache-2.0},
month = may,
title = {{GraphNeT}},
url = {https://github.com/graphnet-team/graphnet},
version = {1.0.0},
year = {2023}
}

@incollection{NEURIPS2019_9015,
@@ -23,6 +17,7 @@ @incollection{NEURIPS2019_9015
editor = {H. Wallach and H. Larochelle and A. Beygelzimer and F. d\textquotesingle Alch\'{e}-Buc and E. Fox and R. Garnett},
pages = {8024--8035},
year = {2019},
doi = {10.48550/arXiv.1912.01703},
publisher = {Curran Associates, Inc.},
url = {http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf}
}
94 changes: 63 additions & 31 deletions paper/paper.md
@@ -12,38 +12,70 @@ tags:
- neutrinos

authors:
- name: Andreas Søgaard
orcid: 0000-0002-0823-056X
affiliation: 1 # "1, 2" (Multiple affiliations must be quoted)
corresponding: true
- name: Rasmus F. Ørsøe
orcid: 0000-0001-8890-4124
affiliation: 2
- name: Leon Bozianu
affiliation: 1
- name: Morten Holm
affiliation: 1
- name: Kaare Endrup Iversen
affiliation: 1
- name: Tim Guggenmos
affiliation: 2
- name: Martin Ha Minh
orcid: 0000-0001-7776-4875
affiliation: 2
- name: Philipp Eller
orcid: 0000-0001-6354-5209
affiliation: 2
- name: Troels C. Petersen
orcid: 0000-0003-0221-3037
affiliation: 1
- name: Andreas Søgaard
affiliation: 1
orcid: 0000-0002-0823-056X
corresponding: true
- name: Rasmus F. Ørsøe
affiliation: "1, 2"
orcid: 0000-0001-8890-4124
- name: Morten Holm
affiliation: 1
orcid: 0000-0003-1383-2810
- name: Leon Bozianu
affiliation: 1
orcid: 0000-0002-1243-9980
- name: Aske Rosted
affiliation: 3
orcid: 0000-0003-2410-400X
- name: Troels C. Petersen
affiliation: 1
orcid: 0000-0003-0221-3037
- name: Kaare Endrup Iversen
affiliation: 1
orcid: 0000-0001-6533-4085
- name: Andreas Hermansen
affiliation: 1
orcid: 0009-0006-1162-9770
- name: Tim Guggenmos
affiliation: 2
- name: Peter Andresen
affiliation: 1
orcid: 0009-0008-5759-0490
- name: Martin Ha Minh
affiliation: 2
orcid: 0000-0001-7776-4875
- name: Ludwig Neste
affiliation: 4
orcid: 0000-0002-4829-3469
- name: Moust Holmes
affiliation: 1
orcid: 0009-0000-8530-7041
- name: Axel Pontén
affiliation: 5
orcid: 0009-0008-2463-2930
- name: Kayla Leonard DeHolton
affiliation: 6
orcid: 0000-0002-8795-0601
- name: Philipp Eller
affiliation: 2
orcid: 0000-0001-6354-5209

affiliations:
- name: Niels Bohr Institute, University of Copenhagen, Denmark
index: 1
- name: Technical University of Munich, Germany
index: 2

date: 16 September 2022
- name: Niels Bohr Institute, University of Copenhagen, Denmark
index: 1
- name: Technical University of Munich, Germany
index: 2
- name: Chiba University, Japan
index: 3
- name: Technical University of Dortmund, Germany
index: 4
- name: Uppsala University, Sweden
index: 5
- name: Pennsylvania State University, USA
index: 6

date: 12 May 2023

bibliography: paper.bib

@@ -53,7 +85,7 @@

Neutrino telescopes, such as ANTARES [@ANTARES:2011hfw], IceCube [@Aartsen:2016nxy; @DeepCore], KM3NeT [@KM3Net:2016zxf], and Baikal-GVD [@Baikal-GVD:2018isr], have the science goal of detecting neutrinos and measuring their properties and origins. Reconstruction at these experiments is concerned with classifying the type of event or estimating properties of the interaction.

`GraphNeT` [@graphnet_zenodo:2022] is an open-source Python framework aimed at providing high quality, user friendly, end-to-end functionality to perform reconstruction tasks at neutrino telescopes using graph neural networks (GNNs). `GraphNeT` makes it fast and easy to train complex models that can provide event reconstruction with state-of-the-art performance, for arbitrary detector configurations, with inference times that are orders of magnitude faster than traditional reconstruction techniques [@gnn_icecube].
`GraphNeT` [@Sogaard_GraphNeT_2023] is an open-source Python framework aimed at providing high quality, user friendly, end-to-end functionality to perform reconstruction tasks at neutrino telescopes using graph neural networks (GNNs). `GraphNeT` makes it fast and easy to train complex models that can provide event reconstruction with state-of-the-art performance, for arbitrary detector configurations, with inference times that are orders of magnitude faster than traditional reconstruction techniques [@gnn_icecube].

GNNs from `GraphNeT` are flexible enough to be applied to data from all neutrino telescopes, including future projects such as IceCube extensions [@IceCube-PINGU:2014okk; @IceCube:2016xxt; @IceCube-Gen2:2020qha] or P-ONE [@P-ONE:2020ljt]. This means that GNN-based reconstruction can be used to provide state-of-the-art performance on most reconstruction tasks in neutrino telescopes, at real-time event rates, across experiments and physics analyses, with vast potential impact for neutrino and astro-particle physics.

