celeritas

Celeritas is a new Monte Carlo transport code designed to accelerate scientific discovery in high energy physics by improving detector simulation throughput and energy efficiency using GPUs.

https://github.com/celeritas-project/celeritas

Science Score: 59.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 9 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Committers with academic emails
    9 of 24 committers (37.5%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.2%) to scientific vocabulary

Keywords

computational-physics cuda detector-simulation gpu hep high-energy-physics hip monte-carlo particle-transport

Keywords from Contributors

build-tools hpsf radiuss spack
Last synced: 6 months ago

Repository

Basic Info
Statistics
  • Stars: 84
  • Watchers: 11
  • Forks: 43
  • Open Issues: 107
  • Releases: 23
Topics
computational-physics cuda detector-simulation gpu hep high-energy-physics hip monte-carlo particle-transport
Created almost 6 years ago · Last pushed 6 months ago
Metadata Files
Readme Contributing Citation Copyright

README.md

Celeritas

The Celeritas project implements HEP detector physics on GPU accelerator hardware with the ultimate goal of supporting the massive computational requirements of the HL-LHC upgrade.

Documentation

Most of the Celeritas documentation lives in the codebase as a combination of static RST documentation and Doxygen-markup comments in the source code itself. The full Celeritas user documentation (including selected code documentation incorporated by Breathe) and the Celeritas code documentation are mirrored on our GitHub Pages site. You can generate these yourself (if the necessary prerequisites are installed) by setting the `CELERITAS_BUILD_DOCS=ON` configuration option and running `ninja doc` (user) or `ninja doxygen` (developer).
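As a sketch (assuming a Ninja generator and an existing build directory named `build`, with the Sphinx/Breathe/Doxygen prerequisites installed), building the docs might look like:

```console
# Enable the documentation targets in an existing build directory
$ cmake -DCELERITAS_BUILD_DOCS=ON build
# Build the user documentation
$ ninja -C build doc
# Build the developer (Doxygen) documentation
$ ninja -C build doxygen
```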

Installation for applications

The easiest way to install Celeritas as a library/app is with Spack:

- Follow these steps to install Spack:

  ```console
  # Install Spack
  $ git clone --depth=2 https://github.com/spack/spack.git
  # Add Spack to the shell environment
  # (for bash/zsh/sh; see spack-start for other shells)
  $ . spack/share/spack/setup-env.sh
  ```

- Install Celeritas:

  ```console
  $ spack install celeritas
  ```

- Add the Celeritas installation to your `PATH` with:

  ```console
  $ spack load celeritas
  ```

To install a GPU-enabled Celeritas build, you may need to ensure that VecGeom is also built with CUDA support if installing `celeritas +vecgeom`, which is the default geometry. To do so, set up Spack's CUDA usage:

```console
$ . spack/share/spack/setup-env.sh
# Set up CUDA
$ spack external find cuda
# Optionally set the default configuration. Replace "cuda_arch=80"
# with your target architecture
$ spack config add packages:all:variants:"cxxstd=17 +cuda cuda_arch=80"
```

and install Celeritas with this configuration:

```console
$ spack install celeritas
```

If Celeritas was installed with a different configuration, do:

```console
$ spack install --fresh celeritas
```

If you did not set a default configuration, specify the variants directly:

```console
$ spack install celeritas +cuda cuda_arch=80
```

Integrating into a Geant4 app

In the simplest case, integration requires a few small changes to your user application, with many more details described in the integration overview.

You first need to find Celeritas in your project's CMake file and change library calls to support VecGeom's use of CUDA RDC:

```diff
+find_package(Celeritas 0.6 REQUIRED)
 find_package(Geant4 REQUIRED)
@@ -36,3 +37,4 @@ else()
 add_executable(trackingmanager-offload trackingmanager-offload.cc)
-target_link_libraries(trackingmanager-offload
+celeritas_target_link_libraries(trackingmanager-offload
+  Celeritas::accel
   ${Geant4_LIBRARIES}
```
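Put together in a minimal project, the CMake file might look like the following sketch (the project and target names here are illustrative, not prescribed by Celeritas):

```cmake
cmake_minimum_required(VERSION 3.18)
project(offload-example LANGUAGES CXX)

find_package(Celeritas 0.6 REQUIRED)
find_package(Geant4 REQUIRED)

add_executable(trackingmanager-offload trackingmanager-offload.cc)
# Use the Celeritas wrapper instead of plain target_link_libraries
# so that VecGeom's CUDA relocatable device code links correctly
celeritas_target_link_libraries(trackingmanager-offload
  Celeritas::accel
  ${Geant4_LIBRARIES}
)
```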

One catch-all include exposes the Celeritas high-level offload classes to user code:

```diff
--- example/geant4/trackingmanager-offload.cc
+++ example/geant4/trackingmanager-offload.cc
@@ -31,2 +31,4 @@
+// Celeritas
+#include
```

Celeritas uses the run action to set up and tear down cleanly:

```diff
--- example/accel/trackingmanager-offload.cc
+++ example/accel/trackingmanager-offload.cc
@@ -133,2 +138,3 @@ class RunAction final : public G4UserRunAction
 {
+    TMI::Instance().BeginOfRunAction(run);
 }
@@ -136,2 +142,3 @@ class RunAction final : public G4UserRunAction
 {
+    TMI::Instance().EndOfRunAction(run);
 }
```

And it integrates into the tracking loop primarily using the G4TrackingManager interface:

```diff
--- example/accel/trackingmanager-offload.cc
+++ example/accel/trackingmanager-offload.cc
@@ -203,4 +235,8 @@ int main()
+    auto& tmi = TMI::Instance();
+    // Use FTFP_BERT, but use Celeritas tracking for e-/e+/g
     auto* physics_list = new FTFP_BERT{/* verbosity = */ 0};
+    physics_list->RegisterPhysics(
+        new celeritas::TrackingManagerConstructor(&tmi));
```

More flexible alternatives to this high-level interface, compatible with other run manager implementations and older versions of Geant4, are described in the manual.

Installation for developers

Since Celeritas is still under very active development, you may want to install it for development purposes. The installation documentation gives a complete description of the code's dependencies and the installation process for development.

As an example, if you have the Spack package manager installed and want to do development on a CUDA system with Volta-class graphics cards, execute the following steps from within the cloned Celeritas source directory:

```console
# Set up CUDA (optional)
$ spack external find cuda

# Install Celeritas dependencies
$ spack env create celeritas scripts/spack.yaml
$ spack env activate celeritas
$ spack config add packages:all:variants:"cxxstd=17 +cuda cuda_arch=80"
$ spack install

# Set up, then configure, build, and test with a default development configure
$ ./scripts/build.sh dev
```

If you don't use Spack but have all the dependencies you want (Geant4, GoogleTest, VecGeom, etc.) in your `CMAKE_PREFIX_PATH`, you can configure and build Celeritas as you would any other project:

```console
$ mkdir build && cd build
$ cmake ..
$ make && ctest
```
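As a sketch of what that looks like with manually installed dependencies (the install prefixes below are hypothetical placeholders; adjust them for your system):

```console
# Hypothetical install locations for Geant4, VecGeom, and GoogleTest
$ export CMAKE_PREFIX_PATH=/opt/geant4:/opt/vecgeom:/opt/googletest:$CMAKE_PREFIX_PATH
$ mkdir build && cd build
$ cmake ..
$ make && ctest
```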

> [!NOTE]
> It is highly recommended to use the `build.sh` script to set up your user environment, even when not using Spack. The first time you run it, edit the `CMakeUserPresets.json` symlink it creates, and submit it in your next pull request.

Celeritas guarantees full compatibility and correctness only on the combinations of compilers and dependencies tested under continuous integration. See the configure output from the GitHub runners for the full list of combinations.

  • Compilers:
      • GCC 11, 12, 14
      • Clang 10, 15, 18
      • MSVC 19
      • GCC 11.5 + NVCC 12.6
      • ROCm Clang 18
  • Platforms:
      • Linux x86_64, ARM
      • Windows x86_64
  • C++ standard: C++17 and C++20
  • Dependencies:
      • Geant4 11.0.4
      • VecGeom 1.2.10

Partial compatibility and correctness are available for an extended range of Geant4 versions:

  • 10.5–10.7: no support for tracking manager offload
  • 11.0: no support for fast simulation offload

Note also that navigation bugs in Geant4 and VecGeom older than the versions listed above will cause failures in some geometry-related unit tests. Future behavior changes in external packages may also cause failures.

Since we compile with extra warning flags and avoid non-portable code, most other compilers should work. The full set of configurations is viewable on our CI platform (GitHub Actions). Compatibility fixes that do not cause newer versions to fail are welcome.

Development

See the contribution guide for the contribution process, the development guidelines for further details on coding in Celeritas, and the administration guidelines for community standards and roles.

Directory structure

| Directory | Description                                            |
|-----------|--------------------------------------------------------|
| app       | Source code for installed executable applications      |
| cmake     | Implementation code for CMake build configuration      |
| doc       | Code documentation and manual                          |
| example   | Example applications and input files                   |
| external  | Automatically fetched external CMake dependencies      |
| scripts   | Development and continuous integration helper scripts  |
| src       | Library source code                                    |
| test      | Unit tests                                             |

Citing Celeritas

If using Celeritas in your work, we ask that you cite the following article:

Johnson, Seth R., Amanda Lund, Philippe Canal, Stefano C. Tognini, Julien Esseiva, Soon Yung Jun, Guilherme Lima, et al. 2024. Celeritas: Accelerating Geant4 with GPUs. EPJ Web of Conferences 295:11005. https://doi.org/10.1051/epjconf/202429511005.

See also its DOECode registration:

Johnson, Seth R., Amanda Lund, Soon Yung Jun, Stefano Tognini, Guilherme Lima, Philippe Canal, Ben Morgan, Tom Evans, and Julien Esseiva. 2022. Celeritas. https://doi.org/10.11578/dc.20221011.1.

and its Zenodo release metadata for version 0.6:

Seth R. Johnson, Amanda Lund, Julien Esseiva, Philippe Canal, Elliott Biondo, Hayden Hollenbeck, Stefano Tognini, Lance Bullerwell, Soon Yung Jun, Guilherme Lima, Damien L-G, & Sakib Rahman. (2025). Celeritas 0.6 (0.6.0). Github. https://doi.org/10.5281/zenodo.15281110

A continually evolving list of works authored by (or with content authored by) core team members is maintained on our publications page and displayed on the official project website.

Owner

  • Name: Celeritas Project
  • Login: celeritas-project
  • Kind: organization

A collaboration targeting exascale simulation of high energy particle physics for detector modeling.

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 1,620
  • Total Committers: 24
  • Avg Commits per committer: 67.5
  • Development Distribution Score (DDS): 0.388
Past Year
  • Commits: 451
  • Committers: 15
  • Avg Commits per committer: 30.067
  • Development Distribution Score (DDS): 0.463
Top Committers
| Name | Email | Commits |
|------|-------|---------|
| Seth R. Johnson | j****r@o****v | 992 |
| Amanda Lund | a****d@a****v | 234 |
| Julien Esseiva | e****u | 94 |
| Stefano Tognini | t****s@o****v | 50 |
| Philippe Canal | p****l@f****v | 45 |
| Soon Yung Jun | s****n@f****v | 44 |
| Guilherme Lima | m****a | 38 |
| Elliott Biondo | b****d@o****v | 35 |
| Ben Morgan | d****n | 25 |
| Paul Romano | p****o@g****m | 15 |
| Hayden Hollenbeck | 1****b | 13 |
| Tom Evans | e****m@o****v | 8 |
| Vincent R. Pascuzzi | 3****i | 6 |
| lebuller | 6****r | 4 |
| Vidor Heli Lujan Montiel | v****u@o****v | 4 |
| Soon yung Jun | s****n@l****v | 3 |
| Damien L-G | d****b@g****m | 3 |
| Andrey Prokopenko | a****k@g****m | 1 |
| Clang-format | c****t | 1 |
| DoaaDeeb | 7****b | 1 |
| Peter Heywood | p****d@g****m | 1 |
| Sakib Rahman | r****s@m****a | 1 |
| Vince Pascuzzi | V****i | 1 |
| hartsw | 1****w | 1 |
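As a sanity check on the statistics above: a Development Distribution Score is commonly computed as one minus the top committer's share of total commits (assuming that definition applies here). With the top committer at 992 of 1,620 all-time commits:

```shell
# DDS = 1 - (top committer's commits / total commits)
awk 'BEGIN { printf "%.3f\n", 1 - 992/1620 }'
# prints 0.388
```

This reproduces the all-time DDS of 0.388 reported above.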

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 174
  • Total pull requests: 1,498
  • Average time to close issues: 6 months
  • Average time to close pull requests: 5 days
  • Total issue authors: 15
  • Total pull request authors: 24
  • Average comments per issue: 1.71
  • Average comments per pull request: 1.79
  • Merged pull requests: 1,283
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 55
  • Pull requests: 690
  • Average time to close issues: 26 days
  • Average time to close pull requests: 4 days
  • Issue authors: 10
  • Pull request authors: 14
  • Average comments per issue: 0.58
  • Average comments per pull request: 1.85
  • Merged pull requests: 557
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • sethrj (138)
  • whokion (7)
  • esseivaju (4)
  • amandalund (4)
  • drbenmorgan (4)
  • stognini (3)
  • mrguilima (3)
  • hhollenb (2)
  • pcanal (2)
  • elliottbiondo (2)
  • huntercih (1)
  • amihashemi (1)
  • smuzaffar (1)
  • tmdelellis (1)
  • OutisLi (1)
Pull Request Authors
  • sethrj (824)
  • amandalund (249)
  • esseivaju (139)
  • pcanal (53)
  • elliottbiondo (49)
  • stognini (40)
  • whokion (36)
  • hhollenb (29)
  • mrguilima (28)
  • drbenmorgan (17)
  • lebuller (8)
  • rahmans1 (4)
  • VHLM2001 (4)
  • tmdelellis (3)
  • paulromano (3)
Top Labels
Issue Labels
enhancement (86) external (41) physics (31) bug (19) core (18) minor (16) orange (14) user (11) performance (10) documentation (7) internal (4) app (4) field (4) geometry (4)
Pull Request Labels
enhancement (526) minor (489) core (339) physics (272) documentation (237) external (216) bug (213) orange (176) app (87) field (53) performance (46) geometry (37) user (35) removal (17) backport (10) internal (7)

Packages

  • Total packages: 1
  • Total downloads: unknown
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 19
  • Total maintainers: 2
spack.io: celeritas

Celeritas is a new Monte Carlo transport code designed for high-performance (GPU-targeted) simulation of high-energy physics detectors.

  • Versions: 19
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 22.4%
Stargazers count: 24.8%
Average: 26.1%
Dependent packages count: 57.3%
Maintainers (2)
Last synced: 6 months ago