marex
Marine Extremes detection, identification, and tracking/merging for Exascale Climate data
Science Score: 49.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 4 DOI reference(s) in README
- ✓ Academic publication links: Links to: zenodo.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (11.9%) to scientific vocabulary
Keywords
Repository
Marine Extremes detection, identification, and tracking/merging for Exascale Climate data
Basic Info
Statistics
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 4
Topics
Metadata Files
README.md
Marine Extremes Python Package
Efficient & scalable Marine Extremes detection, identification, & tracking for Exascale Climate Data.
MarEx is a high-performance Python framework for identifying and tracking extreme oceanographic events (such as Marine Heatwaves or Acidity Extremes) in massive climate datasets. Built on advanced statistical methods and distributed computing, it processes decades of daily-resolution global ocean data with unprecedented efficiency and scalability.
Key Capabilities
- **Extreme Performance**: Process 100+ years of high-resolution daily global data in minutes
- **Advanced Analytics**: Multiple statistical methodologies for robust extreme event detection
- **Complex Event Tracking**: Seamlessly handles coherent object splitting, merging, and evolution
- **Universal Grid Support**: Native support for both regular (lat/lon) grids and unstructured ocean models
- **Cloud-Native Scaling**: The identical codebase scales from a laptop to a supercomputer using 1024+ cores
- **Memory Efficient**: Intelligent chunking and lazy evaluation for datasets larger than memory (see the snippet below)
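For example, opening the input lazily with explicit Dask chunks keeps memory use bounded regardless of dataset size; the file name and chunk size below are placeholders, not recommended values.

```python
import xarray as xr

# Lazily open daily SST with explicit Dask chunks (placeholder file name and chunk size);
# nothing is loaded into memory until a computation is actually triggered
sst = xr.open_dataset("sst_data.nc", chunks={"time": 25}).sst
```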
View 20 Years of marEx Tracking: https://github.com/user-attachments/assets/36ee3150-c869-4cba-be68-628dc37e4775
Features
Data Pre-processing Pipeline
MarEx implements a highly-optimised preprocessing pipeline powered by dask for efficient parallel computation and scaling to very large spatio-temporal datasets. Included are two complementary methods for calculating anomalies and detecting extremes:
Anomaly Calculation:
1. Shifting Baseline: A scientifically rigorous definition of anomalies relative to a backwards-looking rolling smoothed climatology.
2. Detrended Baseline: Efficiently removes the trend & seasonal cycle using a 6+ coefficient model (mean, annual & semi-annual harmonics, and arbitrary polynomial trends); a generic sketch of this idea follows below. (Highly efficient, but this approximation may lead to biases in certain statistics.)

Extreme Detection:
1. Hobday Extreme: Implements a methodology similar to Hobday et al. (2016), with local day-of-year-specific thresholds determined from the quantile within a rolling window.
2. Global Extreme: Applies a global-in-time percentile threshold at each point across the entire dataset. Optionally renormalises anomalies using a 30-day rolling standard deviation. (Highly efficient, but may misrepresent seasonal variability and differs from common definitions in the literature.)
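As a rough, generic illustration of the Detrended Baseline idea (not marEx's internal implementation), the sketch below fits a mean, a linear trend, and annual plus semi-annual harmonics to a single daily series by least squares and subtracts the fit; the function name, the linear-only trend, and the synthetic data are assumptions made for this example.

```python
import numpy as np

def detrended_anomalies(series, time_days):
    """Remove mean, linear trend, and annual/semi-annual harmonics by least squares
    (illustrative only; the description above also allows higher-order polynomial trends)."""
    t = np.asarray(time_days, dtype=float)
    omega = 2.0 * np.pi / 365.25                       # annual frequency (rad/day)
    # Design matrix with 6 coefficients: mean, trend, annual & semi-annual harmonics
    X = np.column_stack([
        np.ones_like(t),                               # mean
        t,                                             # linear trend
        np.sin(omega * t), np.cos(omega * t),          # annual cycle
        np.sin(2 * omega * t), np.cos(2 * omega * t),  # semi-annual cycle
    ])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(series, dtype=float), rcond=None)
    return series - X @ coeffs                         # detrended, deseasonalised anomalies

# Synthetic 10-year daily SST series: mean + weak warming trend + seasonal cycle + noise
t = np.arange(3650.0)
sst = 15 + 5e-4 * t + 3 * np.sin(2 * np.pi * t / 365.25) + 0.5 * np.random.randn(t.size)
anomalies = detrended_anomalies(sst, t)
```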
Object Detection & Tracking
Object Detection:
- Implements efficient algorithms for object detection in 2D geographical data.
- Fully-parallelised workflow built on dask for extremely fast & larger-than-memory computation.
- Uses morphological opening & closing to fill small holes and gaps in binary features (illustrated in the sketch after this list).
- Filters out small objects based on area thresholds.
- Identifies and labels connected regions in binary data representing arbitrary events (e.g. SST or SSS extrema, tracer presence, eddies, etc...).
- Performance/Scaling Test: 100 years of daily 0.25° resolution binary data with 64 cores:
  - Takes ~5 wall-clock minutes per century
  - Requires only 1 GB of memory per core (with dask chunks of 25 days)
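For orientation, the snippet below sketches the general recipe described above (morphological closing/opening, an area filter, and connected-component labelling) on a single 2D binary field using scikit-image. It is a generic illustration of the technique, not the dask-parallelised code marEx actually runs; the synthetic field, structuring-element radius, and area threshold are placeholders.

```python
import numpy as np
from skimage import measure, morphology

# Placeholder binary field: one coherent "event" blob plus scattered noise pixels
yy, xx = np.mgrid[0:180, 0:360]
binary = ((yy - 90) ** 2 + (xx - 180) ** 2 < 400) | (np.random.rand(180, 360) > 0.995)

# Morphological closing then opening: fill small holes and remove isolated specks
selem = morphology.disk(2)                       # placeholder structuring-element radius
cleaned = morphology.binary_opening(morphology.binary_closing(binary, selem), selem)

# Discard objects below an area threshold (placeholder: 20 grid cells)
cleaned = morphology.remove_small_objects(cleaned, min_size=20)

# Label connected regions; each labelled region is one candidate event object
labels = measure.label(cleaned, connectivity=2)
print(f"{labels.max()} object(s) detected")
```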
Object Tracking:
- Implements strict event tracking conditions to avoid producing only a few very large objects.
- Permits temporal gaps (of T_fill days) between objects, to allow more continuous event tracking.
- Requires objects to overlap by at least an overlap_threshold fraction of the smaller object's area to be considered the same event and continue tracking with the same ID (see the sketch after this list).
- Accounts for & keeps a history of object splitting & merging events, ensuring objects are more coherent and retain their previous identities & histories.
- Improves upon the splitting & merging logic of Sun et al. (2023):
  - In this new version, the child object is partitioned based on the parent of the nearest-neighbour cell (not the nearest parent centroid).
- Provides much more accessible and usable tracking outputs:
  - Tracked object properties (such as area, centroid, and any other user-defined properties) are mapped into ID-time space.
  - Details & properties of all merging/splitting events are recorded.
- Provides other useful information that may be difficult to extract from the large object ID field, such as:
  - Event presence in time
  - Event start/end times and duration
  - etc.
- Performance/Scaling Test: 100 years of daily 0.25° resolution binary data with 64 cores:
  - Takes ~8 wall-clock minutes per decade (cf. the old method, i.e. without merge-split tracking, time-gap filling, or overlap thresholding, which, updated here to leverage dask, takes 1 wall-clock minute per decade)
  - Requires only ~2 GB of memory per core (with dask chunks of 25 days)
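To make the overlap criterion above concrete, here is a minimal sketch in plain NumPy (not the marEx tracker itself): two object masks at consecutive time steps are treated as the same event if their intersection covers at least an `overlap_threshold` fraction of the smaller object's area. The mask names and toy shapes are hypothetical.

```python
import numpy as np

def same_event(mask_t, mask_tp1, overlap_threshold=0.5):
    """True if two boolean object masks overlap by at least `overlap_threshold`
    of the smaller object's area (illustrative criterion only)."""
    overlap = np.logical_and(mask_t, mask_tp1).sum()
    smaller_area = min(mask_t.sum(), mask_tp1.sum())
    return smaller_area > 0 and overlap / smaller_area >= overlap_threshold

# Toy example: the 4-cell object shares 2 cells with the 6-cell object -> 2/4 = 0.5
a = np.zeros((5, 5), dtype=bool); a[1:3, 1:3] = True   # object at time t   (area 4)
b = np.zeros((5, 5), dtype=bool); b[1:4, 2:4] = True   # object at time t+1 (area 6)
print(same_event(a, b, overlap_threshold=0.5))          # -> True
```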
Visualisation
Plotting:
- Provides a few helper functions to create pretty plots, wrapped subplots, and animations (e.g. below).
cf. Old (Basic) ID Method vs. New Tracking & Merging Algorithm:
https://github.com/user-attachments/assets/5acf48eb-56bf-43e5-bfc4-4ef1a7a90eff
Technical Architecture
Distributed Computing Stack:
- Framework: Dask for distributed computation with asynchronous task scheduling (see the sketch below)
- Parallelism: Multi-level spatio-temporal parallelisation
- Memory Management: Lazy evaluation with automatic spilling and graph optimisation
- I/O Optimisation: Zarr-based intermediate storage with compression
Performance Optimisations:
- JIT Compilation: Numba-accelerated critical paths for numerical kernels
- GPU Acceleration: Optional JAX backend for tensor operations
- Sparse Operations: Custom sparse matrix algorithms for unstructured grids
- Cache-Aware: Memory access patterns optimised for modern CPU architectures
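As a hedged illustration of how such a Dask-based stack is typically driven (not a marEx-specific requirement), one can start a local distributed cluster before calling the processing routines; the worker, thread, and memory numbers below are placeholders to be tuned to the machine at hand.

```python
from dask.distributed import Client

# Placeholder resources: adjust workers, threads, and memory limit to your system
client = Client(n_workers=8, threads_per_worker=2, memory_limit="4GB")
print(client.dashboard_link)   # dashboard for monitoring tasks and memory spilling
```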
Computational Workflow
- Preprocess: Remove trends & seasonal cycles and identify anomalous extremes
- Detect: Filter & label connected regions using morphological operations
- Track: Follow objects through time, handling complex evolution patterns
- Analyse: Extract event statistics, duration, and spatial properties
Quick Start Example
```python
import xarray as xr
import marEx

# Load sea surface temperature data
sst = xr.open_dataset('sst_data.nc', chunks={}).sst

# Pre-process SST data to identify extremes: cf. 01_preprocess_extremes.ipynb
extreme_events_ds = marEx.preprocess_data(
    sst,
    threshold_percentile=95,
    method_anomaly='shifting_baseline',
    method_extreme='hobday_extreme'
)

# Identify & track Marine Heatwaves through time: cf. 02_id_track_events.ipynb
events_ds = marEx.tracker(
    extreme_events_ds.extreme_events,
    extreme_events_ds.mask,
    R_fill=8,
    area_filter_quartile=0.5,
    allow_merging=True
).run()

# Visualise results: cf. 03_visualise_events.ipynb
fig, ax, im = (events_ds.ID_field > 0).mean("time").plotX.single_plot(
    marEx.PlotConfig(var_units="MHW Frequency", cmap="hot_r", cperc=[0, 96])
)
```
Installation & Setup
Full Installation
```bash
# Complete HPC installation with all optional dependencies
pip install marEx[full,hpc]
```
Development Installation
```bash
# Clone and install for development
git clone https://github.com/wienkers/marEx.git
cd marEx
pip install -e .[dev]

# Install pre-commit hooks
pre-commit install
```
Getting Help
If you encounter installation issues:
- Documentation: Check the full documentation for detailed guides and API reference
- Check Dependencies: Run `marEx.print_dependency_status()` to identify missing components
- Search Issues: Check the GitHub Issues for similar problems
- System Information: Include your OS, Python version, and error messages when reporting issues
- Support: Reach out to Aaron Wienkers
Funding
This project has received funding through:
- The EERIE (European Eddy-Rich ESMs) Project
- The European Union's Horizon Europe research and innovation programme under Grant Agreement No. 101081383
- The Swiss State Secretariat for Education, Research and Innovation (SERI) under contract #22.00366
Please contact Aaron Wienkers with any questions, comments, issues, or bugs.
Owner
- Name: Aaron
- Login: wienkers
- Kind: user
- Location: Cambridge, UK
- Company: University of Cambridge, Trinity College
- Website: www.wienkers.com
- Repositories: 1
- Profile: https://github.com/wienkers
GitHub Events
Total
- Create event: 16
- Release event: 3
- Issues event: 1
- Watch event: 3
- Delete event: 7
- Push event: 69
- Fork event: 1
Last Year
- Create event: 16
- Release event: 3
- Issues event: 1
- Watch event: 3
- Delete event: 7
- Push event: 69
- Fork event: 1
Packages
- Total packages: 1
- Total downloads: pypi 80,826 last month
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 7
- Total maintainers: 1
pypi.org: marex
Marine Extremes Detection and Tracking
- Homepage: https://github.com/wienkers/marEx
- Documentation: https://marex.readthedocs.io/
- Latest release: 3.1.1 (published 6 months ago)
Rankings
Maintainers (1)
Dependencies
- dask *
- dask_image *
- flox *
- jax *
- jaxlib *
- numpy *
- pillow *
- scikit-image *
- scipy *
- xarray *
- xhistogram *