HealpixMPI.jl
HealpixMPI.jl: an MPI-parallel implementation of the Healpix tessellation scheme in Julia - Published in JOSS (2024)
Science Score: 98.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 3 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to joss.theoj.org
- ○ Committers with academic emails
- ○ Institutional organization owner
- ✓ JOSS paper metadata: published in the Journal of Open Source Software
Repository
an MPI-parallel implementation of the Healpix tessellation scheme in Julia
Basic Info
Statistics
- Stars: 6
- Watchers: 2
- Forks: 2
- Open Issues: 0
- Releases: 3
Metadata Files
README.md

HealpixMPI.jl: an MPI-parallel implementation of the Healpix tessellation scheme in Julia
Welcome to HealpixMPI.jl, an MPI-parallel implementation of the main functionalities of the HEALPix spherical tessellation scheme, entirely coded in Julia.
This package is a natural extension of Healpix.jl: it provides MPI integration of that package's main functionalities, allowing simultaneous shared-memory (multithreading) and distributed-memory (MPI) parallelization, which leads to high-performance spherical harmonic transforms (SHTs).
Read the full documentation for further details.
Installation
From the Julia REPL, run

```julia
import Pkg
Pkg.add("HealpixMPI")
```
Usage Example
The following example shows the steps necessary to set up and perform an MPI-parallel `alm2map` SHT with HealpixMPI.jl.
Set up
We set up the necessary MPI communication and initialize the Healpix.jl structures:

```julia
using MPI
using Random
using Healpix
using HealpixMPI

# MPI set-up
MPI.Init()
comm = MPI.COMM_WORLD
crank = MPI.Comm_rank(comm)
csize = MPI.Comm_size(comm)
root = 0

# initialize Healpix structures
NSIDE = 64
lmax = 3 * NSIDE - 1
if crank == root
    h_map = HealpixMap{Float64, RingOrder}(NSIDE)                   # empty map
    h_alm = Alm(lmax, lmax, randn(ComplexF64, numberOfAlms(lmax)))  # random alm
else
    h_map = nothing
    h_alm = nothing
end
```
Distribution
The distributed HealpixMPI.jl data types are filled through an overload of `MPI.Scatter!`:

```julia
# initialize empty HealpixMPI structures
d_map = DMap{RR}(comm)
d_alm = DAlm{RR}(comm)

# fill them
MPI.Scatter!(h_map, d_map)
MPI.Scatter!(h_alm, d_alm)
```
SHT
We perform the SHT through an overload of `Healpix.alm2map!` and, if needed, `MPI.Gather!` the result into a `HealpixMap`:

```julia
alm2map!(d_alm, d_map; nthreads = 16)
MPI.Gather!(d_map, h_map)
```
The `nthreads` keyword lets the user adjust the number of threads at run time; it is typically set to the number of cores available on the machine.
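As a sketch of how the transform step might conclude on the root task, the snippet below runs the SHT, gathers the result, and finalizes MPI. It assumes the `d_alm`, `d_map`, `h_map`, `crank`, and `root` objects defined in the set-up above; using `Threads.nthreads()` for the thread count is one reasonable run-time choice, not the package's prescribed default.

```julia
# Sketch (assumes the set-up above): run the SHT, gather on root, finalize.
alm2map!(d_alm, d_map; nthreads = Threads.nthreads())  # distributed SHT
MPI.Gather!(d_map, h_map)                              # full map lands on root only
if crank == root
    println("Full map gathered on the root task")      # h_map is `nothing` elsewhere
end
MPI.Finalize()
```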
Polarization
Since v1.0.0, HealpixMPI.jl supports polarized SHTs.
A `PolarizedHealpixMap` can be distributed with `MPI.Scatter!` in two different ways, passing either one or two `DMap` output objects, as shown in the following example:

```julia
MPI.Scatter!(h_map, out_d_pol_map)            # out_d_pol_map is a DMap containing only the Q and U components of h_map
MPI.Scatter!(h_map, out_d_map, out_d_pol_map) # out_d_map contains the I component, out_d_pol_map Q and U
```
The distribution of a polarized set of alms, represented in Healpix.jl by an `AbstractArray{Alm{T}, 1}`, works in a similar way:

```julia
MPI.Scatter!(h_alms, out_d_pol_alms)            # both h_alms and out_d_pol_alms contain only the E and B components
MPI.Scatter!(h_alms, out_d_alm, out_d_pol_alms) # h_alms contains [T, E, B], split between out_d_alm (T) and out_d_pol_alms (E and B)
```
The SHTs can then be performed directly on the resulting `DMap` and `DAlm` objects, regardless of whether the field is polarized, as long as the number of components in the two objects matches.
The functions `alm2map` and `adjoint_alm2map` automatically use the correct spin value for the given transform:

```julia
alm2map!(d_alm, d_map)         # spin-0 transform
alm2map!(d_pol_alm, d_pol_map) # polarized transform
```
Run
To exploit MPI parallelization, run the code through `mpirun` or `mpiexec`:

```shell
$ mpiexec -n {Ntask} julia {your_script.jl}
```

To run on multiple nodes, specify a machine file `machines.txt`:

```shell
$ mpiexec -machinefile machines.txt julia {your_script.jl}
```
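The machine file simply lists the hosts MPI should launch tasks on; its exact syntax depends on the MPI implementation. For Open MPI, a minimal `machines.txt` might look like the fragment below, where `node1`/`node2` are placeholder hostnames and `slots` caps the number of tasks per node:

```
node1 slots=16
node2 slots=16
```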
How to Cite
If you make use of HealpixMPI.jl for your work, please remember to cite it properly.
To do so, click on the "Cite this repository" menu in the About section of this repository, or use the following BibTeX entry:
@article{Bianchi_HealpixMPI_jl_an_MPI-parallel_2024,
  author    = {Bianchi, Leo A.},
  doi       = {10.21105/joss.06467},
  journal   = {Journal of Open Source Software},
  publisher = {The Open Journal},
  month     = may,
  number    = {97},
  pages     = {6467},
  title     = {{HealpixMPI.jl: an MPI-parallel implementation of the Healpix tessellation scheme in Julia}},
  url       = {https://joss.theoj.org/papers/10.21105/joss.06467},
  volume    = {9},
  year      = {2024}
}
Owner
- Name: Leo Alessandro Bianchi
- Login: LeeoBianchi
- Kind: user
- Location: Oslo, Norway
- Company: UniMi / UiO
- Repositories: 2
- Profile: https://github.com/LeeoBianchi
Computational (Astro-)Physics / Cosmology / CMB student @ University of Milan. Currently on research period @ University of Oslo
JOSS Publication
HealpixMPI.jl: an MPI-parallel implementation of the Healpix tessellation scheme in Julia
Authors
Tags
SHT, Healpix, parallel computing, cosmology
Citation (CITATION.cff)
cff-version: "1.2.0"
authors:
  - family-names: Bianchi
    given-names: Leo A.
    orcid: "https://orcid.org/0009-0002-6351-5426"
doi: 10.5281/zenodo.11192548
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
    - family-names: Bianchi
      given-names: Leo A.
      orcid: "https://orcid.org/0009-0002-6351-5426"
  date-published: 2024-05-20
  doi: 10.21105/joss.06467
  issn: 2475-9066
  issue: 97
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 6467
  title: "HealpixMPI.jl: an MPI-parallel implementation of the Healpix
    tessellation scheme in Julia"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.06467"
  volume: 9
title: "HealpixMPI.jl: an MPI-parallel implementation of the Healpix
  tessellation scheme in Julia"
Committers
Last synced: 7 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| LeeoBianchi | l****8@g****m | 127 |
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 3
- Total pull requests: 7
- Average time to close issues: 17 days
- Average time to close pull requests: 10 days
- Total issue authors: 2
- Total pull request authors: 3
- Average comments per issue: 2.67
- Average comments per pull request: 0.43
- Merged pull requests: 7
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- marcobonici (2)
- JuliaTagBot (1)
Pull Request Authors
- LeeoBianchi (7)
- baxmittens (2)
- danielskatz (2)
Packages
- Total packages: 1
- Total downloads: julia, 1 total
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 2
juliahub.com: HealpixMPI
an MPI-parallel implementation of the Healpix tessellation scheme in Julia
- Documentation: https://docs.juliahub.com/General/HealpixMPI/stable/
- License: GPL-2.0
- Latest release: 1.0.0 (published about 2 years ago)
Dependencies
- JuliaRegistries/TagBot v1 composite
- actions/checkout v3 composite
- codecov/codecov-action v3 composite
- julia-actions/cache v1 composite
- julia-actions/julia-buildpkg v1 composite
- julia-actions/julia-processcoverage v1 composite
- julia-actions/julia-runtest v1 composite
- julia-actions/setup-julia v1 composite
- actions/checkout v3 composite
- julia-actions/setup-julia v1 composite
