neos

Upstream optimisation for downstream inference

https://github.com/gradhep/neos

Science Score: 85.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 3 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, zenodo.org
  • Committers with academic emails
    1 of 9 committers (11.1%) from academic institutions
  • Institutional organization owner
    Organization gradhep has institutional domain (mattermost.web.cern.ch)
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.9%) to scientific vocabulary

Keywords from Contributors

jax asymptotic-formulas closember cls frequentist-statistics hep hep-ex high-energy-physics histfactory scikit-hep
Last synced: 6 months ago

Repository

Upstream optimisation for downstream inference

Basic Info
  • Host: GitHub
  • Owner: gradhep
  • License: bsd-3-clause
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 64.7 MB
Statistics
  • Stars: 69
  • Watchers: 2
  • Forks: 8
  • Open Issues: 12
  • Releases: 11
Created about 6 years ago · Last pushed 7 months ago
Metadata Files
Readme Contributing License Citation

README.md

neural end-to-end-optimised summary statistics
arxiv.org/abs/2203.05570

About

Leverages the shoulders of giants (jax and pyhf) to differentiate through a high-energy physics analysis workflow, including the construction of the frequentist profile likelihood.
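The key property being exploited here is that an expected significance (and hence a p-value) is a smooth function of the event yields. As a hand-rolled illustration of that idea -- this is a standard asymptotic formula, not neos's actual code, which relies on pyhf's profile likelihood -- consider the median expected discovery significance for a simple counting experiment:

```python
from math import erfc, log, sqrt

def asimov_significance(s, b):
    """Median expected discovery significance for s signal and b
    background events (standard asymptotic/Asimov formula)."""
    return sqrt(2.0 * ((s + b) * log(1.0 + s / b) - s))

def p_value(z):
    """One-sided Gaussian tail probability for a significance z."""
    return 0.5 * erfc(z / sqrt(2.0))

z = asimov_significance(s=10.0, b=100.0)
p = p_value(z)
```

Both `z` and `p` vary smoothly with `s` and `b`, so if the yields themselves come from a differentiable chain of operations (as in neos), the whole pipeline can be optimised by gradient descent.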

If you're more of a video person, see this talk given by Nathan on the broader topic of differentiable programming in high-energy physics, which also covers neos.

You want to apply this to your analysis?

Some things need to happen first. Click here for more info -- I wrote them up!

Have questions?

Do you want to chat about neos? Join us in Mattermost!

Cite

Please cite our newly released paper:

@article{neos,
  Author = {Nathan Simpson and Lukas Heinrich},
  Title = {neos: End-to-End-Optimised Summary Statistics for High Energy Physics},
  Year = {2022},
  Eprint = {arXiv:2203.05570},
  doi = {10.48550/arXiv.2203.05570},
  url = {https://doi.org/10.48550/arXiv.2203.05570}
}

Example usage -- train a neural network to optimize an expected p-value

setup

In a Python 3 environment, run the following:

pip install --upgrade pip setuptools wheel
pip install neos
pip install git+https://github.com/scikit-hep/pyhf.git@make_difffable_model_ctor

With this, you should be able to run the demo notebook (demo.ipynb) on your own machine :)

This workflow is as follows:
  • From a set of normal distributions with different means, we'll generate four blobs of (x, y) points, corresponding to a signal process, a nominal background process, and two variations of the background from varying the background distribution's mean up and down.
  • We'll then feed these points into the previously defined neural network for each blob, and construct a histogram of the output using kernel density estimation. The difference between the two background variations is used as a systematic uncertainty on the nominal background.
  • We can then leverage the magic of pyhf to construct an event-counting statistical model from the histogram yields.
  • Finally, we calculate the p-value of a test between the nominal signal and background-only hypotheses. This uses the familiar profile likelihood-based test statistic.

This counts as one forward pass of the workflow -- we then optimize the neural network by gradient descent, backpropagating through the whole analysis!
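The kernel density estimation step is what keeps the histogram differentiable: a hard histogram has zero gradient almost everywhere, whereas a KDE-based one responds smoothly as points move between bins. A minimal pure-Python sketch of that idea (a hypothetical helper for illustration, not the neos API): each point contributes the mass of a Gaussian kernel integrated over each bin.

```python
from math import erf, sqrt

def gaussian_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def kde_histogram(points, edges, bandwidth):
    """Soft histogram: each point spreads a Gaussian kernel of the given
    bandwidth, and each bin collects the kernel mass between its edges."""
    yields = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        y = sum(
            gaussian_cdf((hi - x) / bandwidth) - gaussian_cdf((lo - x) / bandwidth)
            for x in points
        )
        yields.append(y)
    return yields

# Two points, two bins: nearly all of each kernel's mass lands in its own
# bin, but the yields change smoothly if a point drifts towards an edge.
y = kde_histogram(points=[-1.0, 1.0], edges=[-5.0, 0.0, 5.0], bandwidth=0.5)
```

In neos the inputs to this step are the neural network outputs, so the gradient of the downstream p-value flows back through the bin yields into the network weights.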

Thanks

A big thanks to the teams behind jax, fax, jaxopt and pyhf for their software and support.

Owner

  • Name: gradHEP
  • Login: gradhep
  • Kind: organization

Applying differentiable programming to high-energy physics. Join our mattermost chat with the link below, where we discuss + have irregular meetings!

Citation (CITATION.cff)

cff-version: 1.2.0
message: "Thanks for being interested in neos! If you use this software in a project, please cite it as below."
authors:
  - family-names: Simpson
    given-names: Nathan
    orcid: https://orcid.org/0000-0003-4188-829
  - family-names: Heinrich
    given-names: Lukas
    orcid: https://orcid.org/0000-0002-4048-7584
title: "neos: version 0.2.0"
version: v0.2.0
date-released: 2021-01-12
url: "https://github.com/gradhep/neos"
doi: 10.5281/zenodo.6351423
references:
  - type: article
    authors:
    - family-names: Simpson
      given-names: Nathan
      orcid: https://orcid.org/0000-0003-4188-829
    - family-names: "Heinrich"
      given-names: "Lukas"
      orcid: "https://orcid.org/0000-0002-4048-7584"
      affiliation: "TU Munich"
    title: "neos: End-to-End-Optimised Summary Statistics for High Energy Physics"
    doi: 10.48550/arXiv.2203.05570
    url: "https://doi.org/10.48550/arXiv.2203.05570"
    year: 2022

GitHub Events

Total
  • Watch event: 1
  • Push event: 18
  • Pull request event: 1
  • Fork event: 3
Last Year
  • Watch event: 1
  • Push event: 18
  • Pull request event: 1
  • Fork event: 3

Committers

Last synced: almost 3 years ago

All Time
  • Total Commits: 181
  • Total Committers: 9
  • Avg Commits per committer: 20.111
  • Development Distribution Score (DDS): 0.343
Top Committers
Name Email Commits
Nathan Simpson e****n@g****m 119
Nathan Simpson n****n@h****e 35
Matthew Feickert m****t@c****h 5
pre-commit-ci[bot] 6****]@u****m 4
andrzejnovak n****j@g****m 4
Lukas Heinrich l****h@g****m 4
Nathan Simpson p****e@p****m 4
dependabot[bot] 4****]@u****m 3
gehring c****g@g****m 3
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 10
  • Total pull requests: 27
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 29 days
  • Total issue authors: 5
  • Total pull request authors: 6
  • Average comments per issue: 1.6
  • Average comments per pull request: 0.96
  • Merged pull requests: 21
  • Bot issues: 0
  • Bot pull requests: 11
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • phinate (5)
  • andrzejnovak (2)
  • alexander-held (1)
  • lukasheinrich (1)
  • gehring (1)
Pull Request Authors
  • phinate (11)
  • dependabot[bot] (7)
  • pre-commit-ci[bot] (4)
  • matthewfeickert (2)
  • gehring (2)
  • andrzejnovak (1)
Top Labels
Issue Labels
enhancement (5) help wanted (3) dependencies (1) good first issue (1)
Pull Request Labels
dependencies (7) documentation (3)

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 66 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 7
  • Total maintainers: 1
pypi.org: neos

Upstream optimization of a neural net summary statistic with respect to downstream inference goals.

  • Versions: 7
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 66 Last month
Rankings
Stargazers count: 8.4%
Dependent packages count: 10.1%
Forks count: 14.2%
Average: 16.4%
Dependent repos count: 21.6%
Downloads: 27.5%
Maintainers (1)
Last synced: 7 months ago

Dependencies

examples/requirements.txt pypi
  • celluloid *
  • plothelp *
.github/workflows/ci.yml actions
  • actions/checkout v1 composite
  • actions/upload-artifact v2 composite
  • pypa/gh-action-pypi-publish v1.4.2 composite
pyproject.toml pypi
setup.py pypi