https://github.com/dc-research/tempo

The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.

Science Score: 36.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (13.6%) to scientific vocabulary

Keywords

forecasting forecasting-models forecasting-time-series foundation-models gpt pretrained-language-model pretrained-models time-series time-series-analysis transformer transformers transformers-models
Last synced: 6 months ago

Repository

The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.

Basic Info
  • Host: GitHub
  • Owner: DC-research
  • License: mit
  • Language: Python
  • Default Branch: main
  • Homepage:
  • Size: 1.82 MB
Statistics
  • Stars: 118
  • Watchers: 3
  • Forks: 17
  • Open Issues: 5
  • Releases: 1
Topics
forecasting forecasting-models forecasting-time-series foundation-models gpt pretrained-language-model pretrained-models time-series time-series-analysis transformer transformers transformers-models
Created almost 2 years ago · Last pushed 12 months ago
Metadata Files
Readme License

README.md

Time Series Foundation Model - TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting

[preprint] · [huggingface] · [License: MIT]

The official code for ["TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)"].

TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.

⏳ Upcoming Features

  • [✅] Parallel pre-training pipeline
  • [ ] Probabilistic forecasting
  • [ ] Multimodal dataset
  • [ ] Multimodal pre-training script

🚀 News

  • Nov 2024: 🚀 We've published TimeAGI on PyPI! Now you can simply pip install timeagi to get started, then load TEMPO with from tempo.models.TEMPO import TEMPO. Check out our demo for more details: TimeAGI!

  • Oct 2024: 🚀 We've streamlined our code structure, enabling users to download the pre-trained model and perform zero-shot inference with a single line of code! Check out our demo for more details. Our model's download count on HuggingFace is now trackable!

  • Jun 2024: 🚀 We added demos for reproducing the zero-shot experiments in Colab. We also added a demo of building a custom dataset and running inference directly with our pre-trained foundation model: Colab

  • May 2024: 🚀 TEMPO has launched a GUI-based online demo, allowing users to directly interact with our foundation model!

  • May 2024: 🚀 TEMPO published the 80M-parameter pretrained foundation model on HuggingFace!

  • May 2024: 🧪 We added the code for pretraining TEMPO models and running inference with them. You can find a pre-training script demo in this folder. We also added a script for the inference demo.

  • Mar 2024: 📈 Released the TETS dataset, built from S&P 500 data, used in the multimodal experiments in TEMPO.

  • Mar 2024: 🧪 TEMPO published the project code and the pre-trained checkpoint online!

  • Jan 2024: 🚀 The TEMPO paper was accepted at ICLR!

  • Oct 2023: 🚀 The TEMPO paper was released on arXiv!

Build the environment

conda create -n tempo python=3.8
conda activate tempo
pip install timeagi

Script Demo

A streamlined example showing how to perform forecasting with TEMPO:

```python
# Third-party library imports
import numpy as np
import torch
from numpy.random import choice

# Local imports
from tempo.models.TEMPO import TEMPO

model = TEMPO.load_pretrained_model(
    device=torch.device('cuda:0' if torch.cuda.is_available() else 'cpu'),
    repo_id="Melady/TEMPO",
    filename="TEMPO-80M_v1.pth",
    cache_dir="./checkpoints/TEMPO_checkpoints",
)

input_data = np.random.rand(336)  # Random input series of length 336
with torch.no_grad():
    predicted_values = model.predict(input_data, pred_length=96)
print("Predicted values:")
print(predicted_values)
```
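A note on the demo's shapes: the 336-step input with a 96-step prediction horizon mirrors the common long-horizon forecasting benchmark setup (e.g., on the ETT datasets). Whether predict() accepts other context lengths or multivariate inputs is not stated here, so treat those as details to verify against the repository.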

Demos

1. Reproducing zero-shot experiments on ETTh2:

Please try to reproduce the zero-shot experiments on ETTh2 [here on Colab].

2. Zero-shot experiments on a custom dataset:

We use the following Colab page to demonstrate building a custom dataset and running inference directly with our pre-trained foundation model: [Colab]

3. Online demo:

Please try our foundation model demo [here].

Practice on your end

We also updated our models on HuggingFace: [Melady/TEMPO].

Get Data

Download the data from [Google Drive] or [Baidu Drive], and place the downloaded data in the folder ./dataset. You can also download the STL results from [Google Drive], and place the downloaded data in the folder ./stl.
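As a quick sanity check before launching the scripts, the following minimal Python sketch (illustrative, not part of the repo) verifies that the two folders above are in place:

```python
# Illustrative sketch: confirm the ./dataset and ./stl folders described above
# exist, and list what they contain. Subfolder names depend on which datasets
# you downloaded, so nothing here assumes specific file names.
from pathlib import Path

for folder in (Path("./dataset"), Path("./stl")):
    if not folder.is_dir():
        raise FileNotFoundError(f"Expected folder {folder} next to the run scripts")
    print(folder, "->", sorted(p.name for p in folder.iterdir()))
```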

Run TEMPO

Pre-Training Stage

bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather].sh

Test / Inference Stage

After training, we can test the TEMPO model under the zero-shot setting:

bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather]_test.sh
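Alongside the shell scripts, a zero-shot check can also be run from Python with the predict() API from the Script Demo above. The toy series and the MSE computation below are an illustrative sketch, not the repo's evaluation code:

```python
# Hedged sketch: zero-shot forecast on a toy series, then a simple MSE check.
# Assumes predict() returns an array-like of length pred_length, as suggested
# by the Script Demo; verify against the repository if in doubt.
import numpy as np
import torch
from tempo.models.TEMPO import TEMPO

model = TEMPO.load_pretrained_model(
    device=torch.device('cuda:0' if torch.cuda.is_available() else 'cpu'),
    repo_id="Melady/TEMPO",
    filename="TEMPO-80M_v1.pth",
    cache_dir="./checkpoints/TEMPO_checkpoints",
)

series = np.sin(np.linspace(0, 50, 432))          # 336 context steps + 96 horizon
context, target = series[:336], series[336:]
with torch.no_grad():
    forecast = np.asarray(model.predict(context, pred_length=96))
mse = float(np.mean((forecast - target) ** 2))
print(f"Zero-shot MSE on the toy series: {mse:.4f}")
```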

Pre-trained Models

You can download the pre-trained model from [Google Drive] and then run the test script for fun.

TETS dataset

Here are the prompts used to generate the corresponding textual information for the time series via the [OPENAI ChatGPT-3.5 API].

The time series data come from [S&P 500]. Here is the EBITDA case for one company from the dataset:

Example of generated contextual information for the Company marked above:

You can download the processed data with text embedding from GPT2 from: [TETS].
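Since the processed TETS release ships with GPT-2 text embeddings, here is a minimal, hedged sketch of how such embeddings can be produced with the Hugging Face transformers library. It illustrates the general technique only; it is not the exact preprocessing used for TETS, and the example text is invented:

```python
# Illustrative sketch: mean-pooled GPT-2 embedding for a piece of contextual
# text, roughly the kind of representation the TETS release provides.
import torch
from transformers import GPT2Model, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
model.eval()

text = "Quarterly EBITDA commentary for a hypothetical S&P 500 company."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # shape: (1, seq_len, 768)
embedding = hidden.mean(dim=1)                   # mean-pool to a (1, 768) vector
print(embedding.shape)
```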

Contact

Feel free to contact DefuCao@USC.EDU / YanLiu.CS@USC.EDU if you're interested in applying TEMPO to your real-world applications.

Cite our work

@inproceedings{cao2024tempo,
  title={{TEMPO}: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting},
  author={Defu Cao and Furong Jia and Sercan O Arik and Tomas Pfister and Yixiang Zheng and Wen Ye and Yan Liu},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=YH5w12OUuU}
}

@article{Jia_Wang_Zheng_Cao_Liu_2024,
  title={GPT4MTS: Prompt-based Large Language Model for Multimodal Time-series Forecasting},
  volume={38},
  url={https://ojs.aaai.org/index.php/AAAI/article/view/30383},
  DOI={10.1609/aaai.v38i21.30383},
  number={21},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  author={Jia, Furong and Wang, Kevin and Zheng, Yixiang and Cao, Defu and Liu, Yan},
  year={2024},
  month={Mar.},
  pages={23343--23351}
}

Owner

  • Name: DC-research
  • Login: DC-research
  • Kind: organization

GitHub Events

Total
  • Create event: 2
  • Release event: 3
  • Issues event: 20
  • Watch event: 53
  • Member event: 1
  • Issue comment event: 13
  • Push event: 17
  • Pull request event: 2
  • Fork event: 7
Last Year
  • Create event: 2
  • Release event: 3
  • Issues event: 20
  • Watch event: 53
  • Member event: 1
  • Issue comment event: 13
  • Push event: 17
  • Pull request event: 2
  • Fork event: 7

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 16
  • Total pull requests: 1
  • Average time to close issues: 16 days
  • Average time to close pull requests: 7 minutes
  • Total issue authors: 15
  • Total pull request authors: 1
  • Average comments per issue: 1.25
  • Average comments per pull request: 0.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 13
  • Pull requests: 1
  • Average time to close issues: 16 days
  • Average time to close pull requests: 7 minutes
  • Issue authors: 12
  • Pull request authors: 1
  • Average comments per issue: 1.15
  • Average comments per pull request: 0.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • Lanxin1011 (2)
  • YangYu-NUAA (1)
  • yang-den (1)
  • Qqqqxin (1)
  • Curiosity007 (1)
  • ztb-35 (1)
  • luossa (1)
  • AdityaZanjurne133 (1)
  • Lining160 (1)
  • Thomzoy (1)
  • JasonStraka (1)
  • holydick99 (1)
  • cccccrj (1)
  • avtosubaru25 (1)
  • oops343 (1)
Pull Request Authors
  • yongchand (1)
Top Labels
Issue Labels
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 31 last month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 2
  • Total maintainers: 1
pypi.org: timeagi

Time Series Foundation Model - TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting

  • Versions: 2
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 31 last month
Rankings
Dependent packages count: 10.0%
Average: 33.0%
Dependent repos count: 56.0%
Maintainers (1)
Last synced: 6 months ago