https://github.com/earthai-tech/fusionlab-learn

fusionlab-learn: Igniting Next-Gen Temporal Fusion Architectures

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (15.9%) to scientific vocabulary

Keywords

attention-mechanisms deep-learning forecasting-models machine-intelligence time-series transformers
Last synced: 5 months ago

Repository

fusionlab-learn: Igniting Next-Gen Temporal Fusion Architectures

Basic Info
Statistics
  • Stars: 1
  • Watchers: 1
  • Forks: 0
  • Open Issues: 0
  • Releases: 6
Topics
attention-mechanisms deep-learning forecasting-models machine-intelligence time-series transformers
Created 10 months ago · Last pushed 7 months ago
Metadata Files
Readme License Code of conduct

README.md


fusionlab-learn

A Research-Oriented Library for Advanced Time Series Forecasting with Hybrid, Transformer, and Physics-Informed Models


fusionlab-learn is a flexible and extensible Python package for building and experimenting with state-of-the-art time series models. It provides robust, research-grade implementations of advanced architectures, from data-driven forecasters to novel Physics-Informed Neural Networks (PINNs).

Whether you are a researcher exploring new architectures or a practitioner building production-grade forecasting systems, fusionlab-learn offers tools built on TensorFlow/Keras to accelerate your work.


✨ Key Features

🏛️ A Spectrum of Advanced Architectures

The library provides implementations across three major families of forecasting models.

  • Hybrid Models: Architectures like HALNet and XTFT that fuse the sequential processing power of LSTMs with the long-range context modeling of attention mechanisms.
  • Pure Transformers: Implementations of the standard "Attention Is All You Need" encoder-decoder architecture, adapted for time series forecasting.
  • Physics-Informed Models (PINNs): State-of-the-art hybrid models like TransFlowSubsNet that integrate physical laws (PDEs) directly into the training process to produce physically consistent and robust forecasts.
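To make the hybrid idea concrete, here is a minimal illustrative sketch (not HALNet or XTFT themselves) of how an LSTM's sequential outputs can be fed through self-attention for long-range context in Keras:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Minimal hybrid sketch: an LSTM captures local temporal patterns,
# then MultiHeadAttention mixes in long-range context.
# Illustrative only -- the library's models are far more elaborate.
def build_tiny_hybrid(timesteps=10, features=3, horizon=5):
    inputs = keras.Input(shape=(timesteps, features))
    seq = layers.LSTM(16, return_sequences=True)(inputs)            # sequential processing
    attn = layers.MultiHeadAttention(num_heads=2, key_dim=8)(seq, seq)  # global context
    pooled = layers.GlobalAveragePooling1D()(attn)
    out = layers.Dense(horizon)(pooled)                             # one value per horizon step
    return keras.Model(inputs, out)

model = build_tiny_hybrid()
x = np.random.rand(4, 10, 3).astype("float32")
print(model(x).shape)  # (4, 5) -> (Batch, Horizon)
```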

🧩 Modular & Reusable Components

Build custom models with a rich set of well-tested neural network blocks, including:

  • Gated Residual Networks (GRNs) & Variable Selection Networks (VSNs)
  • Specialized Attention Layers: CrossAttention, HierarchicalAttention, and MemoryAugmentedAttention
  • Multi-Scale LSTMs for capturing temporal patterns at various resolutions
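As an illustration of what such a block looks like, here is a simplified GRN sketch in the style of the Temporal Fusion Transformer paper (the library ships its own tested implementation; this is not fusionlab's code):

```python
import numpy as np
from tensorflow.keras import layers

# Simplified Gated Residual Network: nonlinear transform, sigmoid gate,
# residual connection, and layer normalization (TFT-style, illustrative).
class TinyGRN(layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.dense1 = layers.Dense(units, activation="elu")
        self.dense2 = layers.Dense(units)
        self.gate = layers.Dense(units, activation="sigmoid")  # GLU-style gate
        self.norm = layers.LayerNormalization()
        self.skip = layers.Dense(units)  # project residual if dims differ

    def call(self, x):
        h = self.dense2(self.dense1(x))
        gated = self.gate(h) * h                # gate suppresses unneeded units
        return self.norm(self.skip(x) + gated)  # residual + layer norm

grn = TinyGRN(8)
y = grn(np.random.rand(4, 3).astype("float32"))
print(y.shape)  # (4, 8)
```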

⚛️ PINN Capabilities

  • Solve coupled-physics problems with models like TransFlowSubsNet.
  • Perform inverse modeling by configuring physical coefficients (K, Ss, C) as learnable parameters.
  • Utilize specialized PINN data utilities for the unique sequence and coordinate preparation required by these models.
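The inverse-modeling idea can be sketched in a few lines: treat a physical coefficient as a trainable variable and penalize the network through a PDE residual. The toy 1-D diffusion equation u_t = K·u_xx below is an assumption for illustration, not fusionlab's actual loss or API:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy PINN sketch: K is learned jointly with the network weights by
# minimizing the residual of u_t = K * u_xx (illustrative only).
net = keras.Sequential([
    keras.Input(shape=(2,)),                 # inputs: (t, x)
    keras.layers.Dense(16, activation="tanh"),
    keras.layers.Dense(1),
])
log_K = tf.Variable(0.0)  # learn K in log-space so it stays positive

def physics_residual(t, x):
    with tf.GradientTape(persistent=True) as g2:
        g2.watch([t, x])
        with tf.GradientTape(persistent=True) as g1:
            g1.watch([t, x])
            u = net(tf.concat([t, x], axis=1))
        u_t = g1.gradient(u, t)   # first derivatives
        u_x = g1.gradient(u, x)
    u_xx = g2.gradient(u_x, x)    # second derivative
    return u_t - tf.exp(log_K) * u_xx  # zero when the PDE is satisfied

t = tf.random.uniform((8, 1)); x = tf.random.uniform((8, 1))
loss = tf.reduce_mean(tf.square(physics_residual(t, x)))
print(loss.shape)  # scalar: ()
```

Minimizing this residual alongside a data-fit term recovers K from observations, which is the essence of configuring coefficients like K, Ss, or C as learnable parameters.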

🛠️ Unified Hyperparameter Tuning

  • Leverage the HydroTuner to automatically find optimal hyperparameters for all hydrogeological PINN models.
  • Use dedicated tuners for data-driven models like HALNet and XTFT.
  • The tuner's .create() factory method automatically infers data dimensions, making setup fast and easy.

🚀 Getting Started

Installation

  1. Prerequisites:

  2. Install from PyPI (Recommended):

     ```bash
     pip install fusionlab-learn
     ```

  3. Install from Source (for Development):

     ```bash
     git clone https://github.com/earthai-tech/fusionlab-learn.git
     cd fusionlab-learn
     pip install -e .
     ```

Quick Example

```python
import numpy as np
import tensorflow as tf
from fusionlab.nn.models import HALNet  # Or any other model

# --- 1. Prepare Dummy Data ---
# (Replace with your actual preprocessed & sequenced data)
B, T, D_dyn = 16, 10, 3  # Batch, TimeSteps, DynamicFeatures
D_stat = 2               # StaticFeatures
D_fut = 1                # FutureFeatures
H = 5                    # Forecast Horizon

# Model expects list: [Static, Dynamic, Future]
dummy_static = np.random.rand(B, D_stat).astype(np.float32)
dummy_dynamic = np.random.rand(B, T, D_dyn).astype(np.float32)
# For 'tft_like' mode, future input spans past + horizon
dummy_future = np.random.rand(B, T + H, D_fut).astype(np.float32)
dummy_target = np.random.rand(B, H, 1).astype(np.float32)

model_inputs = [dummy_static, dummy_dynamic, dummy_future]

# --- 2. Instantiate Model ---
model = HALNet(
    static_input_dim=D_stat,
    dynamic_input_dim=D_dyn,
    future_input_dim=D_fut,
    forecast_horizon=H,
    max_window_size=T,
    output_dim=1,
    hidden_units=16,  # Smaller units for quick example
    num_heads=2,
)

# --- 3. Compile & Train ---
model.compile(optimizer='adam', loss='mse')
print("Training simple model...")
model.fit(model_inputs, dummy_target, epochs=2, batch_size=4, verbose=0)
print("Training finished.")

# --- 4. Predict ---
print("Making predictions...")
predictions = model.predict(model_inputs)
print("Prediction shape:", predictions.shape)
# Expected: (16, 5, 1) -> (Batch, Horizon, NumOutputs)
```

(See the Quickstart Guide for a more detailed walkthrough.)


📚 Documentation

For detailed usage, tutorials, API reference, and explanations of the underlying concepts, please see the full documentation:

Read the Documentation


📄 License

This project is licensed under the BSD-3-Clause License. See the LICENSE file for details.


🤝 Contributing

We welcome contributions! Whether it's adding new features, fixing bugs, or improving documentation, your help is appreciated. Please see our Contribution Guidelines for more details on how to get started.


📞 Contact & Support

Owner

  • Name: daniel03
  • Login: earthai-tech
  • Kind: user
  • Location: China
  • Company: Central South University

Geophysicist | Lecturer @ AI in Geophysics - Computational Geophysics

GitHub Events

Total
  • Release event: 2
  • Push event: 138
  • Pull request event: 46
  • Create event: 2
Last Year
  • Release event: 2
  • Push event: 138
  • Pull request event: 46
  • Create event: 2

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 30 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 0
  • Total versions: 5
  • Total maintainers: 1
pypi.org: fusionlab-learn

Next-Gen Temporal Fusion Architectures for Time-Series Forecasting

  • Versions: 5
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 30 Last month
Rankings
Dependent packages count: 9.2%
Average: 30.5%
Dependent repos count: 51.8%
Maintainers (1)
Last synced: 6 months ago