Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.0%) to scientific vocabulary
Last synced: 6 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: madeleineburns
  • Language: Jupyter Notebook
  • Default Branch: main
  • Size: 8.44 MB
Statistics
  • Stars: 0
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 0
Created about 1 year ago · Last pushed 10 months ago
Metadata Files
  • Readme
  • Citation

README.md

Comparing Snow Water Equivalent Estimations from Long Short-Term Memory Networks and Physics-Based Models in the Western United States

This repository contains the code necessary to train an LSTM model to estimate snow water equivalent (SWE) in the western U.S. It also contains the code to test the model on a representative set of national test sites, evaluate its performance against two physics-based models (ParFlow-CLM and the UA SWE model), and investigate the strengths and limitations of all three models on the testing data. A brief description of the files in this repository follows.
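For readers unfamiliar with the architecture, a single LSTM time step can be sketched in plain NumPy. This is an illustrative sketch only, not the repository's implementation (which lives in _lstm.py); all names, dimensions, and the random readout here are hypothetical.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step for input x, hidden state h, cell state c.

    W: (4*H, D) input weights, U: (4*H, H) recurrent weights, b: (4*H,) biases,
    stacked in [input, forget, cell, output] gate order.
    """
    H = h.shape[0]
    z = W @ x + U @ h + b                # all four gate pre-activations at once
    i = 1 / (1 + np.exp(-z[:H]))         # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))      # forget gate
    g = np.tanh(z[2*H:3*H])              # candidate cell update
    o = 1 / (1 + np.exp(-z[3*H:]))       # output gate
    c_new = f * c + i * g                # updated cell state
    h_new = o * np.tanh(c_new)           # updated hidden state
    return h_new, c_new

# run a short sequence of (hypothetical) daily forcing inputs through the cell
rng = np.random.default_rng(0)
D, H = 5, 8                              # e.g. 5 forcing variables, 8 hidden units
W = rng.normal(size=(4 * H, D))
U = rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(30, D)):       # 30 time steps
    h, c = lstm_step(x, h, c, W, U, b)
swe_estimate = float(h @ rng.normal(size=H))  # linear readout to a scalar SWE value
```

In practice the repository's model is trained (weights fitted to SNOTEL-style observations) rather than randomly initialized as above; the sketch only shows the recurrence that lets an LSTM carry snowpack "memory" across time steps.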

Model scripts

run_lstm.py: Script for training the LSTM on a user-specified set of training data, generating predictions for the testing dataset, and quantifying model performance on that dataset.
run_pfclm.py: Script for running ParFlow-CLM for all of the years in the testing dataset and quantifying model performance.
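Both scripts are described as quantifying model performance. The repository's actual metric functions are not shown on this page, but metrics such as RMSE, bias, and Nash–Sutcliffe efficiency (NSE) are standard for SWE evaluation; a minimal sketch under that assumption:

```python
import numpy as np

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated SWE."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def bias(obs, sim):
    """Mean error (positive = model overestimates SWE on average)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.mean(sim - obs))

def nse(obs, sim):
    """Nash–Sutcliffe efficiency: 1 is perfect; 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

# toy series standing in for one site's seasonal SWE (units arbitrary)
obs = [0.0, 5.0, 12.0, 20.0, 15.0, 3.0]   # observed SWE
sim = [0.5, 4.0, 13.0, 18.0, 16.0, 2.5]   # model estimates
print(rmse(obs, sim), bias(obs, sim), nse(obs, sim))
```

These three scores together separate random error (RMSE), systematic over- or under-prediction (bias), and skill relative to climatology (NSE), which is why they are commonly reported side by side in SWE model comparisons.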

Analysis notebooks

lstm_analysis.ipynb: Jupyter notebook for analysis of LSTM model predictions on the testing dataset.
lstm_regime_model.ipynb: Jupyter notebook for analysis of the predictions of multiple LSTM models trained on varying percentages of data from each snowpack regime.
lstm_state_model.ipynb: Jupyter notebook for training an LSTM model on data from a single year and analyzing the model's transferability.
manage_data.ipynb: Jupyter notebook with assorted functions for managing or analyzing training data, testing data, and performance statistics.
model_comparisons.ipynb: Jupyter notebook for the analysis of estimations from the LSTM model, ParFlow-CLM, and the UA SWE model.
parflow_analysis.ipynb: Jupyter notebook for the analysis of ParFlow-CLM predictions.

Supporting files

_data.py: Function library supporting data collection, cleaning, and analysis.
_lstm.py: Function library supporting the development, training, and testing of the LSTM model.
national_test_sites.txt: Text file listing the sites included in the testing data.
national_test_years.txt: Text file listing the full set of testing data (both sites and years).
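If the site list follows the common one-identifier-per-line convention (an assumption; the file's actual layout is not shown on this page), loading it might look like:

```python
from pathlib import Path
import os
import tempfile

def load_test_sites(path="national_test_sites.txt"):
    """Read test-site identifiers, skipping blank lines and '#' comments.

    Assumes one site ID per line; adjust if the repository uses another layout.
    """
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.lstrip().startswith("#")]

# example with a temporary file standing in for the real site list
tmp = tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False)
tmp.write("# hypothetical SNOTEL site IDs\n301\n462\n\n978\n")
tmp.close()
sites = load_test_sites(tmp.name)
os.unlink(tmp.name)
print(sites)  # ['301', '462', '978']
```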

Owner

  • Login: madeleineburns
  • Kind: user

Citation (CITATION.cff)

# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: 'LSTM Model of Snow Water Equivalent (SWE) '
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Madeleine
    family-names: Burns
    email: mcburns@princeton.edu
    affiliation: Princeton University
    orcid: 'https://orcid.org/0000-0002-8116-639X'
repository-code: 'https://github.com/madeleineburns/swe-lstm'
abstract: >-
  The code in this repository trains and tests an LSTM model
  of snow water equivalent (SWE), evaluates the model's
  sensitivity and transferability, and compares the accuracy
  of its predictions to ParFlow-CLM and the University of
  Arizona SWE model. 
keywords:
  - snow water equivalent
  - SWE
  - LSTM
  - machine learning
  - mountain hydrology
  - snow
  - ParFlow-CLM
date-released: '2024-12-24'

GitHub Events

Total
  • Release event: 1
  • Push event: 8
  • Create event: 2
Last Year
  • Release event: 1
  • Push event: 8
  • Create event: 2