ncdssm

PyTorch implementation of the NCDSSM models presented in the ICML '23 paper "Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series".

https://github.com/clear-nus/ncdssm

Science Score: 28.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
  • .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.6%) to scientific vocabulary

Keywords

continuous-time forecasting icml-2023 imputation kalman-filter state-space-model time-series
Last synced: 6 months ago

Repository

PyTorch implementation of the NCDSSM models presented in the ICML '23 paper "Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series".

Basic Info
  • Host: GitHub
  • Owner: clear-nus
  • License: mit
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 296 KB
Statistics
  • Stars: 24
  • Watchers: 2
  • Forks: 3
  • Open Issues: 2
  • Releases: 0
Topics
continuous-time forecasting icml-2023 imputation kalman-filter state-space-model time-series
Created almost 3 years ago · Last pushed over 2 years ago
Metadata Files
Readme · License · Citation

README.md

# Neural Continuous-Discrete State Space Models (NCDSSM) [![preprint](https://img.shields.io/static/v1?label=arXiv&message=2301.11308&color=B31B1B)](https://arxiv.org/abs/2301.11308) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![Venue:ICML 2023](https://img.shields.io/badge/Venue-ICML%202023-007CFF)](https://icml.cc/)


Fig 1. (Top) Generative model of Neural Continuous-Discrete State Space Model. (Bottom) Amortized inference for auxiliary variables and continuous-discrete Bayesian inference for states.
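For orientation only (a generic sketch in assumed notation, not equations lifted from this repo), a continuous-discrete state space model of this kind pairs continuous-time latent dynamics with observations that arrive at irregular discrete times, with Bayesian filtering/smoothing of the states carried out between observation times:

```latex
% Generic continuous-discrete SSM sketch; the symbols f_\theta, G, p_\phi are assumed notation.
\begin{aligned}
\mathrm{d}\mathbf{z}_t &= f_\theta(\mathbf{z}_t)\,\mathrm{d}t + \mathbf{G}\,\mathrm{d}\boldsymbol{\beta}_t
  &&\text{(continuous-time latent dynamics driven by Brownian motion)}\\
\mathbf{y}_{t_k} &\sim p_\phi(\mathbf{y}_{t_k}\mid \mathbf{z}_{t_k})
  &&\text{(observations at irregular times } t_1 < t_2 < \dots)
\end{aligned}
```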


This repository contains code for reproducing the experiments presented in the ICML 2023 paper Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series.

Installation

NCDSSM requires Python 3.8 or higher.

  • Create a conda environment (optional, but recommended).
    ```sh
    conda create --name ncdssm python=3.9
    conda activate ncdssm
    ```
  • Clone this repo and install the package.
    ```sh
    # Install package with all optional dependencies for running experiments
    pip install --editable ".[exp]"
    ```
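A quick post-install smoke test (a minimal sketch; the import name `ncdssm` is assumed from the repository name and is not stated above):

```sh
# Hypothetical smoke test: check that the editable install is importable.
# The module name `ncdssm` is an assumption based on the repo name.
python -c "import ncdssm; print('ncdssm imported')"
```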

Usage

Generate or download datasets

  • For the bouncing ball, damped pendulum, climate, and Pymunk datasets, generate/download them by running the scripts in data/.
    ```sh
    # Bouncing Ball dataset
    python data/bouncing_ball.py

    # Damped Pendulum dataset
    python data/damped_pendulum.py

    # USHCN Climate dataset
    python data/climate.py

    # Pymunk datasets
    python data/box.py
    python data/pong.py
    ```
  • For the CMU MoCap dataset, download the preprocessed dataset (the file named `mocap35.mat`) available at [cagatayyildiz/ODE2VAE](https://github.com/cagatayyildiz/ODE2VAE) and place it in `data/mocap/mocap35.mat`.

Training

Train models using `train_pymunk.py --config /path/to/config` for the Pymunk datasets and `train_ts.py --config /path/to/config` for all other datasets. The YAML config files can be found in configs/. The keys in the config files can also be passed as command line options to the training script as `--key value` (see the illustrative override example after the commands below).

```sh
# Train NCDSSM-NL on Damped Pendulum with 30% missing
python train_ts.py --config configs/pendulum_ncdssm_nl.yaml --train_missing_p 0.3

# Train NCDSSM-LL on Bouncing Ball with 80% missing
python train_ts.py --config configs/ball_ncdssm_ll.yaml --train_missing_p 0.8

# Train NCDSSM-NL on CMU MoCap (setup 1)
python train_ts.py --config configs/mocap_ncdssm_nl.yaml

# Train NCDSSM-LL on CMU MoCap (setup 2)
python train_ts.py --config configs/mocap2_ncdssm_ll.yaml

# Train NCDSSM-NL on USHCN Climate (first fold)
# --sporadic means that data is missing both in time and feature dimensions
python train_ts.py --sporadic --config configs/climate_ncdssm_nl.yaml --data_fold 0

# Train NCDSSM-LL on Box
python train_pymunk.py --config configs/box_ncdssm_ll.yaml
```
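To illustrate the `--key value` override mechanism mentioned above (a sketch only: `batch_size` is a hypothetical key name and may not exist in the actual configs):

```sh
# Hypothetical override example: any key from the YAML config can be passed on the
# command line; `batch_size` here is an illustrative name, not a confirmed config key.
python train_ts.py --config configs/pendulum_ncdssm_nl.yaml --batch_size 32
```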

Evaluation

Once trained, models can be evaluated on the test set using the eval_pymunk.py and eval_ts.py scripts. Alternatively, you may download our pretrained checkpoints from this repository's Releases or from 🤗 Hugging Face and place them in ./checkpoints/.

```sh
# Download pretrained checkpoint from Releases
wget https://github.com/clear-nus/NCDSSM/releases/download/v0.0.1/checkpoints.tar.gz \
  && tar -xzf checkpoints.tar.gz \
  && rm checkpoints.tar.gz
```

```sh
# Evaluate NCDSSM-NL on Damped Pendulum with 30% missing
python eval_ts.py --ckpt checkpoints/damped_pendulum_0.3_ncdssm_nl.pt --smooth

# Evaluate NCDSSM-LL on Bouncing Ball with 80% missing
python eval_ts.py --ckpt checkpoints/bouncing_ball_0.8_ncdssm_ll.pt --smooth

# Evaluate NCDSSM-NL on CMU MoCap (Setup 1)
python eval_ts.py --ckpt checkpoints/mocap_ncdssm_nl.pt --no_state_sampling --seed 0

# Evaluate NCDSSM-LL on CMU MoCap (Setup 2)
python eval_ts.py --ckpt checkpoints/mocap2_ncdssm_ll.pt --no_state_sampling --seed 0

# Evaluate NCDSSM-NL on USHCN Climate (first fold)
python eval_ts.py --sporadic --ckpt checkpoints/climate_ncdssm_nl.pt

# Evaluate NCDSSM-LL on Box
python eval_pymunk.py --ckpt checkpoints/box_ncdssm_ll.pt --wass --smooth --no_state_sampling --seed 0
```

Questions

For any questions regarding the code or the paper, please email Fatir.

BibTeX

If you find this repository or the ideas presented in our paper useful, please consider citing.

@inproceedings{
ansari2023neural,
    title={Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series},
    author={Ansari, Abdul Fatir and Heng, Alvin and Lim, Andre and Soh, Harold},
    booktitle={International Conference on Machine Learning},
    year={2023}
}

Acknowledgement

This repo contains code adapted from the following repos.

| Repo | Copyright (c) | License |
| ------------- | ---------- | ------------- |
| edebrouwer/gruodebayes | Edward De Brouwer | MIT License |
| cagatayyildiz/ODE2VAE | Çağatay Yıldız | NA |
| mannyray/KalmanFilter | @mannyray | MIT License |
| simonkamronn/kvae | Simon Kamronn | MIT License |

Owner

  • Name: CLeAR
  • Login: clear-nus
  • Kind: organization
  • Email: harold@comp.nus.edu.sg

Collaborative, Learning, and Adaptive Robots (CLeAR) Lab at NUS

Citation (CITATION.bib)

@inproceedings{
ansari2023neural,
    title={Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series},
    author={Ansari, Abdul Fatir and Heng, Alvin and Lim, Andre and Soh, Harold},
    booktitle={International Conference on Machine Learning},
    year={2023}
}

GitHub Events

Total
  • Watch event: 1
  • Issue comment event: 1
  • Fork event: 1
Last Year
  • Watch event: 1
  • Issue comment event: 1
  • Fork event: 1

Committers

Last synced: about 2 years ago

All Time
  • Total Commits: 8
  • Total Committers: 2
  • Avg Commits per committer: 4.0
  • Development Distribution Score (DDS): 0.125
Past Year
  • Commits: 8
  • Committers: 2
  • Avg Commits per committer: 4.0
  • Development Distribution Score (DDS): 0.125
Top Committers
| Name | Email | Commits |
| ---- | ----- | ------- |
| Abdul Fatir | A****s@g****m | 7 |
| Abdul Fatir Ansari | a****d@a****m | 1 |
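The Development Distribution Score above is consistent with one common definition, one minus the top committer's share of commits (an assumed formula, shown here only to make the 0.125 concrete):

```latex
% Assumed DDS formula; commit counts (7 of 8 by the top committer) are taken from the table above.
\mathrm{DDS} = 1 - \frac{\text{commits by top committer}}{\text{total commits}} = 1 - \frac{7}{8} = 0.125
```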

Issues and Pull Requests

Last synced: about 2 years ago

All Time
  • Total issues: 1
  • Total pull requests: 0
  • Average time to close issues: 24 days
  • Average time to close pull requests: N/A
  • Total issue authors: 1
  • Total pull request authors: 0
  • Average comments per issue: 7.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 1
  • Pull requests: 0
  • Average time to close issues: 24 days
  • Average time to close pull requests: N/A
  • Issue authors: 1
  • Pull request authors: 0
  • Average comments per issue: 7.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • RichardShea (1)
  • 666-will (1)
Pull Request Authors
  • RichardShea (1)