h5fortran-mpi

HDF5-MPI parallel Fortran object-oriented interface

https://github.com/geospace-code/h5fortran-mpi

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: zenodo.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.4%) to scientific vocabulary

Keywords

fortran hdf5 mpi mpi-applications object-oriented-fortran
Last synced: 4 months ago

Repository

HDF5-MPI parallel Fortran object-oriented interface

Basic Info
  • Host: GitHub
  • Owner: geospace-code
  • License: bsd-3-clause
  • Language: Fortran
  • Default Branch: main
  • Homepage:
  • Size: 791 KB
Statistics
  • Stars: 20
  • Watchers: 2
  • Forks: 2
  • Open Issues: 4
  • Releases: 6
Topics
fortran hdf5 mpi mpi-applications object-oriented-fortran
Created over 4 years ago · Last pushed over 1 year ago
Metadata Files
Readme License Citation Codemeta

README.md

h5fortran-mpi

Easy-to-use, object-oriented Fortran interface to parallel HDF5-MPI. The h5fortran-mpi API can be used with or without MPI. A very similar NetCDF4 interface is nc4fortran.
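As a rough illustration of the object-oriented style, here is a minimal sketch of a parallel write. The module name, the mpi=.true. open argument, and the method names follow the library's documented pattern, but they are assumptions here; check the project's examples for the exact signatures.

```fortran
program sketch_write
!! minimal sketch of an h5fortran-mpi style collective write
!! (module and argument names are assumptions, not verified signatures)
use mpi
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5
integer :: ierr, rank
integer :: x(4)

call MPI_Init(ierr)
call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)

x = rank  ! example data; every rank passes the same shape

! "mpi=.true." is assumed to request the HDF5-MPI parallel layer
call h5%open('example.h5', action='w', mpi=.true.)
call h5%write('/x', x)
call h5%close()

call MPI_Finalize(ierr)
end program
```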

Many computer systems default to the serial HDF5 API, which lacks the HDF5 parallel MPI layer. The scripts/CMakeLists.txt can build the HDF5-MPI stack if needed. To use HDF5-MPI features, the computer must have a working MPI library installed already (e.g. OpenMPI, MPICH, Intel MPI, MS-MPI).

Some operating systems provide an installable parallel HDF5 package:

  • Ubuntu: apt install libhdf5-mpi-dev
  • CentOS: yum install hdf5-openmpi-devel
  • macOS Homebrew: brew install hdf5-mpi
  • macOS MacPorts: port install hdf5 +fortran +mpich

HDF5 1.10.2 is the oldest working HDF5 version; in general, HDF5 ≥ 1.10.5 is recommended for bug fixes and performance. For the highest performance with parallel compressed writes, consider HDF5 ≥ 1.12.2.

Compressed parallel HDF5

Compression generally saves significant disk space and can speed up writes and reads. HDF5-MPI file compression requires HDF5 ≥ 1.10.2 and MPI-3. As noted above, HDF5 ≥ 1.10.5 is recommended for stability and performance.
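A hedged sketch of how a compressed parallel write might look follows; the comp_lvl argument is borrowed from the serial h5fortran interface and is an assumption for this library, as is mpi=.true.:

```fortran
program sketch_compressed
!! sketch only: deflate-compressed parallel write
!! comp_lvl and mpi argument names are assumptions, not verified signatures
use mpi
use h5fortran, only : hdf5_file
implicit none

type(hdf5_file) :: h5
integer :: ierr
real :: a(256, 256)

call MPI_Init(ierr)
a = 1.0

! comp_lvl=1 assumed to request deflate compression level 1
call h5%open('compressed.h5', action='w', mpi=.true., comp_lvl=1)
call h5%write('/a', a)
call h5%close()

call MPI_Finalize(ierr)
end program
```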

Windows limitations

Microsoft Windows does not currently support native HDF5 parallel file compression; Windows Subsystem for Linux can be used for parallel file compression instead. Native Windows users can read compressed HDF5 files, but not via MPI.

Native Windows MPI options are currently limited to MS-MPI and Intel MPI. Windows MS-MPI is currently MPI-2. A quirk of Intel oneAPI on Windows is that, despite providing MPI-3, collective filtered parallel compression file I/O does not work with HDF5 1.10.x through at least HDF5 1.12.1. We test for this in CMake and set the compile flags appropriately.

Windows users who need file compression may use Windows Subsystem for Linux (e.g. Ubuntu) and install libhdf5-mpi-dev.

Build this project

Build this project like:

```sh
cmake -B build
cmake --build build
```

If you have previously built or installed a parallel HDF5 library, refer to it (saving build time) like:

```sh
cmake -B build -DHDF5_ROOT=~/lib_par
cmake --build build
```

To build without MPI (serial HDF5 file operations only):

```sh
cmake -B build -Dhdf5_parallel=off
```

Cray computers may use a CMake toolchain file to work with the Intel or GCC backend.


Fortran Package Manager (FPM) users build like:

```sh
fpm build --flag -Dh5fortran_HAVE_PARALLEL
# omitting this flag builds the serial API only
fpm test
```

Notes

To build and install the HDF5 parallel library, use the script:

```sh
cmake -B build_hdf5 -S scripts --install-prefix=$HOME/lib_par
cmake --build build_hdf5
```

This builds and installs HDF5 under ~/lib_par (or another directory of your choice).

Owner

  • Name: Geospace code
  • Login: geospace-code
  • Kind: organization
  • Location: 1 au

GNSS and other geospace analysis programs

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: Hirsch
    given-names: Michael
    orcid: https://orcid.org/0000-0002-1637-6526
title: h5fortran-mpi
doi: 10.5281/zenodo.5847354
date-released: 2022-01-14

CodeMeta (codemeta.json)

{
  "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
  "@type": "SoftwareSourceCode",
  "codeRepository": "https://github.com/geospace-code/h5fortran-mpi",
  "contIntegration": "https://github.com/geospace-code/h5fortran-mpi/actions",
  "dateModified": "2022-07-29",
  "downloadUrl": "https://github.com/geospace-code/h5fortran-mpi/releases",
  "issueTracker": "https://github.com/geospace-code/h5fortran-mpi/issues",
  "name": "h5fortran-mpi",
  "identifier": "10.5281/zenodo.5847354",
  "description": "Lightweight object-oriented HDF5-MPI parallel Fortran interface",
  "applicationCategory": "file I/O",
  "developmentStatus": "active",
  "funder": {
    "@type": "Organization",
    "name": "DARPA"
  },
  "keywords": [
    "hdf5",
    "object-oriented",
    "mpi"
  ],
  "programmingLanguage": [
    "Fortran"
  ],
  "author": [
    {
      "@type": "Person",
      "@id": "https://orcid.org/0000-0002-1637-6526",
      "givenName": "Michael",
      "familyName": "Hirsch"
    }
  ]
}

GitHub Events

Total
  • Issues event: 2
  • Watch event: 2
  • Fork event: 1
Last Year
  • Issues event: 2
  • Watch event: 2
  • Fork event: 1

Committers

Last synced: almost 2 years ago

All Time
  • Total Commits: 485
  • Total Committers: 1
  • Avg Commits per committer: 485.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 11
  • Committers: 1
  • Avg Commits per committer: 11.0
  • Development Distribution Score (DDS): 0.0
Top Committers
  • Michael Hirsch (s****n): 485 commits

Issues and Pull Requests

Last synced: 10 months ago

All Time
  • Total issues: 3
  • Total pull requests: 0
  • Average time to close issues: about 1 hour
  • Average time to close pull requests: N/A
  • Total issue authors: 2
  • Total pull request authors: 0
  • Average comments per issue: 0.33
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • MatthAlex (2)
  • scivision (2)
  • Beliavsky (1)
Pull Request Authors
Top Labels
Issue Labels
  • bug (2)
  • enhancement (1)
Pull Request Labels

Dependencies

.github/workflows/ci.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
  • actions/upload-artifact v3 composite
.github/workflows/ci_build.yml actions
  • actions/checkout v3 composite
.github/workflows/ci_fpm.yml actions
  • actions/checkout v3 composite
  • fortran-lang/setup-fpm v4 composite
.github/workflows/intel-oneapi.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
.github/workflows/oneapi-windows.yml actions
  • actions/checkout v3 composite