extremeweatherbench

Benchmarking of machine learning and numerical weather prediction (MLWP & NWP) models, with a focus on extreme events.

https://github.com/brightbandtech/extremeweatherbench

Science Score: 36.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
    1 of 3 committers (33.3%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.0%) to scientific vocabulary

Keywords

benchmarking, meteorology
Last synced: 6 months ago

Repository

Benchmarking of machine learning and numerical weather prediction (MLWP & NWP) models, with a focus on extreme events.

Basic Info
Statistics
  • Stars: 67
  • Watchers: 6
  • Forks: 3
  • Open Issues: 16
  • Releases: 0
Topics
benchmarking, meteorology
Created over 1 year ago · Last pushed 6 months ago
Metadata Files
Readme · License

README.md

Extreme Weather Bench (EWB)

Read our blog post here

As AI weather models grow in popularity, we need a standardized set of community-driven tests that evaluate the models across a wide variety of high-impact hazards. Extreme Weather Bench (EWB) builds on the successful work of WeatherBench and introduces a set of high-impact weather events spanning multiple spatial and temporal scales and different parts of the weather spectrum. For each phenomenon we provide data for testing, the standard evaluation metrics used by forecasters worldwide, and impact-based metrics. EWB is a community system and will add further phenomena, test cases, and metrics in collaboration with the worldwide weather and forecast verification community.

EWB paper and talks

  • AMS 2025 talk (recording will go live shortly after AMS): https://ams.confex.com/ams/105ANNUAL/meetingapp.cgi/Paper/451220
  • EWB paper is in preparation and will be submitted by early Spring 2025

How do I suggest new data, metrics, or otherwise get involved?

Extreme Weather Bench welcomes your involvement! The success of a benchmark suite rests on community involvement and feedback. There are several ways to get involved:

  • Get involved in community discussion using the discussion board
  • Submit new code requests using the issue tracker
  • Send us email at hello@brightband.com

Installing EWB

Currently, the easiest way to install EWB is with pip:

```shell
$ pip install git+https://github.com/brightbandtech/ExtremeWeatherBench.git
```

It is highly recommended to use uv if possible:

```shell
$ git clone https://github.com/brightbandtech/ExtremeWeatherBench.git
$ cd ExtremeWeatherBench
$ uv sync
```
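
After `uv sync`, one quick way to confirm the environment works is to run the CLI through `uv run`; this is a minimal sketch that assumes the sync installs the `ewb` entry point shown in the next section:

```shell
# Run the EWB CLI with its default (sample-data) configuration inside the uv-managed environment
$ uv run ewb --default
```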

How to Run EWB

Running EWB on sample data (included) is straightforward.

Using command-line initialization:

```shell
$ ewb --default
```

Using a Jupyter notebook or script:

```python
from extremeweatherbench import config, events, evaluate
import pickle

# Select model
model = 'FOURv200GFS'

# Set up path to directory of file - zarr or kerchunk/virtualizarr json/parquet
forecast_dir = f'gs://extremeweatherbench/{model}.parq'

# Choose the event types you want to include
event_list = [events.HeatWave, events.Freeze]

# Use ForecastSchemaConfig to map forecast variable names to CF convention-based names used in EWB;
# the sample forecast kerchunk references to the CIRA MLWP archive are the default configuration
default_forecast_config = config.ForecastSchemaConfig()

# Set up configuration object that includes events and the forecast directory
heatwave_and_freeze_configuration = config.Config(
    event_types=event_list,
    forecast_dir=forecast_dir,
    # This line is not necessary; forecast_schema_config defaults to the default forecast config.
    # It is shown here as an example in case values need to be changed for your use case.
    forecast_schema_config=default_forecast_config,
)

# Run the evaluate script, which outputs a dataframe of case results with associated metrics and variables
cases = evaluate.evaluate(eval_config=heatwave_and_freeze_configuration)

# Save the results to a pickle file
with open(f'ewb_cases_{model}.pkl', 'wb') as f:
    pickle.dump(cases, f)

# Or, save to csv:
cases.to_csv(f'ewb_cases_{model}.csv')
```
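
If your forecast dataset uses variable names that differ from the defaults, the ForecastSchemaConfig used above is the place to remap them. The sketch below only lists the available mappings and shows a hypothetical override; the attribute name `surface_air_temperature` and value `'t2'` are illustrative assumptions rather than confirmed EWB field names, and the snippet assumes ForecastSchemaConfig is a dataclass.

```python
import dataclasses

from extremeweatherbench import config

# Build the default schema config and list its fields (assumes it is a dataclass).
schema = config.ForecastSchemaConfig()
print([f.name for f in dataclasses.fields(schema)])

# Hypothetical override: map EWB's CF-style name to the variable name used in your forecast files.
# Replace 'surface_air_temperature' and 't2' with real names from the list printed above.
# schema.surface_air_temperature = 't2'
```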

EWB case studies and categories

EWB case studies are fully documented here.

Owner

  • Name: Brightband
  • Login: brightbandtech
  • Kind: organization
  • Location: United States of America

Brightband is making weather and climate predictable for all, to help humanity adapt to increasingly extreme weather.

GitHub Events

Total
  • Create event: 81
  • Release event: 1
  • Issues event: 60
  • Watch event: 53
  • Delete event: 61
  • Issue comment event: 74
  • Public event: 1
  • Push event: 690
  • Pull request review event: 99
  • Pull request review comment event: 160
  • Pull request event: 121
  • Fork event: 3
Last Year
  • Create event: 81
  • Release event: 1
  • Issues event: 60
  • Watch event: 53
  • Delete event: 61
  • Issue comment event: 74
  • Public event: 1
  • Push event: 690
  • Pull request review event: 99
  • Pull request review comment event: 160
  • Pull request event: 121
  • Fork event: 3

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 499
  • Total Committers: 3
  • Avg Commits per committer: 166.333
  • Development Distribution Score (DDS): 0.026
Past Year
  • Commits: 499
  • Committers: 3
  • Avg Commits per committer: 166.333
  • Development Distribution Score (DDS): 0.026
Top Committers
Name Email Commits
aaTman m****r@g****m 486
Amy McGovern a****n@o****u 12
Daniel Rothenberg d****l@d****m 1
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 47
  • Total pull requests: 123
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 3 days
  • Total issue authors: 4
  • Total pull request authors: 3
  • Average comments per issue: 0.87
  • Average comments per pull request: 0.6
  • Merged pull requests: 85
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 47
  • Pull requests: 123
  • Average time to close issues: about 1 month
  • Average time to close pull requests: 3 days
  • Issue authors: 4
  • Pull request authors: 3
  • Average comments per issue: 0.87
  • Average comments per pull request: 0.6
  • Merged pull requests: 85
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • aaTman (41)
  • hansmohrmann (2)
  • amymcgovern (2)
  • alxmrs (2)
Pull Request Authors
  • aaTman (116)
  • amymcgovern (6)
  • gideonite (1)
Top Labels
Issue Labels
Improvement (5) documentation (4) enhancement (4) bug (3) Feature (2) good first issue (1) v1 (1)
Pull Request Labels

Packages

  • Total packages: 2
  • Total downloads: unknown
  • Total dependent packages: 0
    (may contain duplicates)
  • Total dependent repositories: 0
    (may contain duplicates)
  • Total versions: 2
proxy.golang.org: github.com/brightbandtech/ExtremeWeatherBench
  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent packages count: 5.4%
Average: 5.6%
Dependent repos count: 5.8%
Last synced: 6 months ago
proxy.golang.org: github.com/brightbandtech/extremeweatherbench
  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent packages count: 5.4%
Average: 5.6%
Dependent repos count: 5.8%
Last synced: 6 months ago

Dependencies

.github/workflows/ci.yaml actions
  • actions/checkout v4 composite
  • actions/checkout v3 composite
  • actions/setup-python v5 composite
  • actions/setup-python v3 composite
  • astral-sh/setup-uv v4 composite
  • pre-commit/action v3.0.1 composite
pyproject.toml pypi
  • cartopy >=0.24.1
  • cftime >=1.6.4.post1
  • dacite >=1.8.1
  • dask [complete]>=2024.12.1
  • fastparquet >=2024.11.0
  • gcsfs >=2024.12.0
  • geopandas >=1.0.1
  • h5py >=3.12.1
  • ipywidgets >=8.1.5
  • kerchunk >=0.2.7
  • numpy >=2.2.0
  • pandas >=2.2.3
  • pyyaml >=6.0.2
  • regionmask >=0.13.0
  • rioxarray >=0.18.1
  • s3fs >=2024.12.0
  • scikit-learn >=1.6.0
  • scores >=2.0.0
  • seaborn >=0.13.2
  • shapely >=2.0.6
  • tqdm >=4.67.1
  • ujson >=5.10.0
  • virtualizarr >=1.2.0
  • xarray >=2024.11.0
  • zarr >=2.18.4
uv.lock pypi
  • 157 dependencies