final-state-transformer

Machine learning development toolkit built on Transformer encoder architectures and tailored to high-energy physics and particle-collision event analysis.

https://github.com/dev-geof/final-state-transformer

Science Score: 44.0%

This score indicates how likely this project is to be science-related, based on the following indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (6.4%) to scientific vocabulary

Keywords

deep-learning machine-learning multi-head-attention particle-physics science-research toolkit transformer
Last synced: 6 months ago

Repository


Basic Info
  • Host: GitHub
  • Owner: dev-geof
  • License: MIT
  • Language: Python
  • Default Branch: main
  • Size: 4.9 MB
Statistics
  • Stars: 3
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 2
Topics
deep-learning machine-learning multi-head-attention particle-physics science-research toolkit transformer
Created almost 2 years ago · Last pushed over 1 year ago
Metadata Files
Readme · License · Citation

README.md

FINAL STATE TRANSFORMER

A machine learning development toolkit built on Transformer encoder architectures and designed for high-energy physics applications. It uses the multi-head attention mechanism to capture long-range dependencies and contextual information in sequences of particle-collision final-state objects, supporting the design of models that perform well on classification and regression tasks. A user-friendly interface eases the integration of Transformer networks into research workflows, letting scientists apply state-of-the-art machine learning techniques to their analyses.
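To make the core idea concrete, here is a minimal NumPy sketch of multi-head self-attention over an event's final-state objects. This is an illustration of the general mechanism, not code from this toolkit (which builds on TensorFlow per its dependencies); the feature layout of the objects and the random weights standing in for trained parameters are assumptions.

```python
import numpy as np

def multi_head_attention(x, num_heads, rng):
    """Toy multi-head self-attention over a sequence of final-state
    objects, each encoded as a feature vector (hypothetical layout)."""
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    # Random projections stand in for the learned Q/K/V/output weights.
    w_q, w_k, w_v, w_o = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )

    # Project, then split features across heads: (heads, seq, d_head).
    q = (x @ w_q).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ w_k).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ w_v).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    # Scaled dot-product attention scores between every pair of objects.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax per query

    # Weighted sum of values, then concatenate heads and project out.
    heads = weights @ v
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

rng = np.random.default_rng(0)
# A toy event: 6 final-state objects, 8 features each.
event = rng.standard_normal((6, 8))
out = multi_head_attention(event, num_heads=2, rng=rng)
print(out.shape)  # (6, 8): one contextualized vector per object
```

Because every object attends to every other object in the event, the output representation of each final-state object depends on the full event context, which is what makes this architecture attractive for event-level classification and regression.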

Documentation

Owner

  • Name: Geoffrey Gilles
  • Login: dev-geof
  • Kind: user

Particle Physicist & Data Scientist

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
- family-names: "Gilles"
  given-names: "Geoffrey"
  orcid: "https://orcid.org/0000-0000-0000-0000"
title: "Final State Transformer"
version: 1.0.1
date-released: 2024-06-03
url: "https://github.com/dev-geof/final-state-transformer"
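For use in LaTeX documents, a CITATION.cff file like the one above can be converted to BibTeX (for example with the `cffconvert` tool). A hand-written entry corresponding to these fields might look like the following; the entry key is illustrative:

```bibtex
@software{gilles_final_state_transformer_2024,
  author  = {Gilles, Geoffrey},
  title   = {Final State Transformer},
  version = {1.0.1},
  date    = {2024-06-03},
  url     = {https://github.com/dev-geof/final-state-transformer},
}
```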

GitHub Events

Total
  • Watch event: 1
  • Push event: 4
Last Year
  • Watch event: 1
  • Push event: 4

Dependencies

.github/workflows/python-package.yml actions
  • actions/checkout v4 composite
  • actions/setup-python v3 composite
setup.py pypi
  • PyYAML >=6.0
  • cuda-python >=12.4.0
  • graphviz >=0.20.1
  • h5py >=3.8.0
  • matplotlib >=3.5.3
  • numpy >=1.24.2
  • puma >=0.0.0rc1
  • puma_hep >=0.2.2
  • pydot >=1.4.2
  • scikit_learn >=1.2.2
  • tensorflow >=2.11.0
  • termcolor >=1.1.0
  • tf2onnx >=1.12.0
  • tqdm >=4.62.3
requirements.txt pypi
  • PyYAML >=6.0
  • cuda-python >=12.4.0
  • graphviz >=0.20.1
  • h5py >=3.8.0
  • matplotlib >=3.5.1
  • numpy >=1.24.2
  • puma-hep >=0.2.2
  • pydot >=1.4.2
  • scikit-learn >=1.1.2
  • scikit_learn >=1.2.2
  • tensorflow >=2.11.0
  • termcolor >=1.1.0
  • tf2onnx >=1.12.0
  • tqdm >=4.62.3