Recent Releases of auto-sklong

auto-sklong - Auto-Sklong Official Paper Is Out!

Hi folks!

We are pleased to announce that Auto-Sklong's official paper has been published, and the accompanying release is available under the tag 0.0.9 🎉

❝ In a nutshell, what's Auto-Sklong?

📽️ Auto-Sklong (short for Auto-Scikit-Longitudinal) is built on @_gama (by @PGijsbers & @amore-labs), @scikit-longitudinal (our own Scikit-learn-style library of primitives for longitudinal classification tasks), and @_smac3 for Bayesian optimisation (by the @automl team). It also draws inspiration from longitudinal research by top-notch researchers such as Dr. Caio Ribeiro (@caioedurib), Dr. Tossapol Pomsuwan (@mastervii), Dr. Sergey Ovchinnik (@SergeyOvchinnik), and Dr. Fernando Otero (@febo). But really, what does it do? 👇👇👇

💡 Auto-Sklong is an open-source Python system, among the first to bridge automated machine learning (AutoML) and the classification of longitudinal health-related data. It solves the Combined Algorithm Selection and Hyperparameter optimisation (CASH) problem for longitudinal datasets, where features are repeatedly measured over time (the repeated measurements are often called waves), as is common in health and biomedical fields. Standard AutoML systems require flattening longitudinal data, which loses the temporal insight (longitudinal data differs from time-series data, though the two are similar in some ways). Auto-Sklong instead supports both data-transformation and algorithm-adaptation approaches, evaluating pipelines with longitudinal-aware components that aim to improve both predictive accuracy and explainability!
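To make the "waves" idea concrete, here is a minimal plain-Python sketch of a common longitudinal data layout. Everything here (column names, values, and the `feature_groups` index encoding) is an illustrative assumption for exposition, not Auto-Sklong's or Sklong's actual API:

```python
# Illustrative layout of longitudinal data (not Auto-Sklong's API):
# two features ("bmi", "smoker") measured at three waves each,
# flattened into one row per patient.
columns = [
    "bmi_w1", "bmi_w2", "bmi_w3",
    "smoker_w1", "smoker_w2", "smoker_w3",
]

# One hypothetical patient record: BMI rising over time, smoker status constant.
patient = [24.1, 25.3, 26.8, 1, 1, 1]

# Longitudinal-aware methods need to know which columns belong to the
# same feature across waves; one common encoding is a list of index groups.
feature_groups = [
    [0, 1, 2],  # bmi across waves 1-3
    [3, 4, 5],  # smoker across waves 1-3
]

# A standard (non-longitudinal) model would see six unrelated columns;
# the grouping is what preserves the temporal structure per feature.
for group in feature_groups:
    trajectory = [patient[i] for i in group]
    print(trajectory)
```

The grouping, not the raw values, is what a longitudinal-aware estimator exploits: each group is a per-feature trajectory over time.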

With Auto-Sklong, we address the challenge of manually selecting the best pipeline for temporal data by offering automated search across:

  • Search Methods: 4 strategies including Bayesian Optimization (via SMAC3), Asynchronous Successive Halving, Evolutionary Algorithms, and Random Search (via GAMA).
  • Data Preparation: Utilities like MerWavTimePlus to preserve temporal structure for longitudinal methods.
  • Data Transformation: 10 flattening methods (e.g., aggregation with mean/median, MerWavTimeMinus, SepWav with voting/stacking) for standard ML compatibility.
  • Preprocessing: Longitudinal-aware feature selection like ExhaustiveCFSPerGroup, or standard CFS.
  • Estimators: 11 classifiers, including 5 specialized longitudinal ones (e.g., LexicoRandomForestClassifier, NestedTreesClassifier, LexicoDeepForestClassifier) and 6 traditional (e.g., Random Forest, Gradient Boosting) from Scikit-learn.
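As a rough illustration of the simplest flattening strategy listed above (aggregation with the mean across waves), here is a plain-Python sketch. The data, the `feature_groups` encoding, and the `aggregate_mean` helper are illustrative assumptions, not Auto-Sklong's actual implementation:

```python
from statistics import mean

# One row per subject; columns are feature-by-wave, grouped per feature.
# (Illustrative data layout, not Auto-Sklong's internal representation.)
rows = [
    [24.1, 25.3, 26.8, 1, 1, 1],   # subject A: bmi waves 1-3, smoker waves 1-3
    [21.0, 21.4, 21.2, 0, 0, 1],   # subject B
]
feature_groups = [[0, 1, 2], [3, 4, 5]]

def aggregate_mean(row, groups):
    """Collapse each wave-group to its mean: temporal order is discarded,
    but the result can feed any standard scikit-learn classifier."""
    return [mean(row[i] for i in g) for g in groups]

flat = [aggregate_mean(r, feature_groups) for r in rows]
print(flat)  # each subject reduced to one aggregated value per feature
```

This is the trade-off the transformation methods navigate: aggregation buys standard-ML compatibility at the cost of the within-feature temporal signal that the longitudinal-aware estimators keep.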

Maybe a look at the search space won't hurt!

[Figure: the Auto-Sklong search space (AutoSklongSearchSpace)]


Not Enough? ❞

🗞️ The scientific paper (published by IEEE) is available at: https://doi.org/10.1109/BIBM62325.2024.10821737

On top of that, more is coming: explore our GitHub issues, read through our README, and check our documentation! We've also added tutorial enhancements in the docs for a quick overview (PS: there's even a generated podcast about the paper!).


Open-Source Contribution, More Than Welcome! ❞

We hope to motivate you to contribute your own search methods, preprocessors, data-transformation techniques, and more! If we could have 1% of the impact @scikit-learn has had on the global machine learning community since it started in France 🇫🇷 over a decade ago, it'd be just insane!

So please share your suggestions! Without external input, how can we ensure we're advancing longitudinal AutoML workflows? 👀 New primitives from external contributors are more than welcome: simply open an issue to discuss.

❝ I guess it's now time for the tech-ish changelog!

🫵 https://pypi.org/project/auto-sklong/

[v0.0.9] - 2025-08-02 - BIBM Paper Publication and Early-Beta Transition

Added

  • BIBM paper integration: added links, badges, and references to the published paper (DOI: 10.1109/BIBM62325.2024.10821737) – the core change of this release.
  • Tutorial improvements in the documentation.
  • GAMA inclusion for enhanced search methods.
  • Document-dates plugin for the docs.
  • Blurry-tabs styling for the docs.

Enhanced

  • Documentation overhaul: revamped home page, improved wording, import examples, and docstrings.
  • Updated BIBM paper links and temporal-dependency references.
  • Improved authorship details, PyPI README, and links.
  • Updated development requirements (e.g., package versions) and uv.lock.
  • Enhanced tutorial and README.

Resolved

  • Fixed PyPI links and setup.py.
  • Addressed post-acceptance tweaks across minor releases (since 0.0.4). Note: this changelog covers advancements since 0.0.4; for prior details, see the earlier release entries below.

Published by simonprovost 7 months ago

auto-sklong - 🎉 Minor Update with Package Updates & Stability Enhancements

[v0.0.4] - 2025-01-15 – Stability Enhancements & Package Updates

We’re excited to introduce Auto-Sklong v0.0.4! While this release is minor, it ensures that our dependencies remain up to date, particularly with the migration of Scikit-Longitudinal to version 0.0.7. This upgrade brings compatibility improvements and prepares the foundation for better performance in longitudinal machine learning tasks.

Highlights

  1. Package Updates

    • Updated Scikit-Longitudinal to v0.0.7 to incorporate compatibility adjustments for the recent migration to uv.
  2. Stability Improvements

    • Minor bug fixes and adjustments to ensure smoother workflows, especially with updated dependencies.

As always, thank you for your support and contributions! Let’s keep pushing forward together! 🚀


Published by simonprovost about 1 year ago

auto-sklong - 🎄 Migration to UV & Documentation Improvements

[v0.0.3] - 2024-12-31 – Migration to uv & Documentation Improvements

We are glad to present Auto-Sklong v0.0.3. This release focuses on streamlining the development workflow by migrating from PDM to uv, which speeds up installation and reduces complexity. Our documentation has also been updated, with clarifications to the Quick Start and new sections for Apple Silicon Mac users. Most excitingly, our paper has been accepted to the IEEE BIBM 2024 conference. Stay tuned for the BibTeX citation and additional publication details when the proceedings are posted!

Highlights

  1. Migration from PDM to uv

    • Far simpler commands and fewer setup configurations for the community.
    • Substantial speed improvements, as shown in this benchmark comparison, pitting uv against poetry, PDM, and pip-sync.
  2. Documentation Enhancements

    • Quick Start Fixes: Thanks to @anderdnavarro (in #4, #5, #6) for correcting parameter names in the Quick Start feature list examples.
    • Paper Acceptance: Our paper on Auto-Sklong has been accepted to the 2024 IEEE BIBM Conference. We will add the BibTeX reference once the proceedings are finalised.
    • Apple Silicon Installation Guide: The Quick Start now includes a dedicated section for installing Auto-Sklong on Apple Silicon-based Macs, making setup for M1/M2 systems more transparent and accessible. Thanks once more to @anderdnavarro for pointing that out!

Future Work

  • BibTeX Citation: We will add a citation reference for our BIBM 2024 paper as soon as the proceedings are publicly available.
  • Documentation: The experiments paper section will be simplified, and the redundant "Release History" tab in the documentation will be removed.
  • Examples: We aim to launch a comprehensive Jupyter notebook tutorial to demonstrate how to use Auto-Sklong.

As always, thank you for your continued support. Let’s keep exploring the boundaries of longitudinal machine learning!

Merry XMas! 🎄


Published by simonprovost about 1 year ago

auto-sklong - 🎉 First Public Release Github & PyPi

We are pleased to announce that Auto-Sklong is now available in its first public release under the tag 0.0.2, despite numerous PyPI misadventures (lesson learned, PyPI-Tests). 🎉

📽️ Auto-Sklong is built on @PGijsbers' General Automated Machine Learning (AutoML) Assistant (GAMA) framework, a flexible AutoML framework for experimenting with different search strategies and a customisable search space, among other cool features. We began using and improving GAMA locally for our own goal of tackling longitudinal machine learning tasks via AutoML, which led to Auto-Sklong. While Auto-Sklong is an AutoML system, its goal differs from GAMA's; the improvements we made to GAMA along the way were generalised back to GAMA's own goals, and we submitted three pull requests (see our README for details).

💡 Auto-Sklong introduces a completely new, sequential search space by leveraging ConfigSpace, and a new search method, Bayesian optimisation, via SMAC3. It also includes all of GAMA's built-in features, such as its other search methods (see the Auto-Sklong and GAMA documentation). The end goal: Auto-Sklong can now solve both (1) the longitudinal machine learning task, by understanding the temporal dependencies in the dataset via Sklong, and (2) the Combined Algorithm Selection and Hyperparameter optimisation (CASH) problem.
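The "sequential search space" idea can be sketched in plain Python. This is only an illustration of the concept (Auto-Sklong actually builds its space with ConfigSpace; the names and structure below are hypothetical): hyperparameters become active conditionally on earlier choices, so a sampled configuration is a coherent pipeline rather than an arbitrary mix of incompatible components.

```python
import random

random.seed(0)  # reproducible sampling for this sketch

# Illustrative conditional (sequential) search space, in the spirit of
# ConfigSpace but not its API: later choices depend on earlier ones.
SPACE = {
    "transformation": ["aggregate_mean", "sep_wav", "none"],
    "classifier": {
        # longitudinal-aware estimators only make sense when the
        # temporal structure was preserved ("none" = no flattening)
        "none": ["lexico_random_forest", "nested_trees"],
        # flattened data can feed any standard classifier
        "aggregate_mean": ["random_forest", "gradient_boosting"],
        "sep_wav": ["random_forest", "gradient_boosting"],
    },
}

def sample_pipeline(space):
    """Sample one valid (transformation, classifier) pipeline:
    the classifier choice is conditioned on the transformation choice."""
    transformation = random.choice(space["transformation"])
    classifier = random.choice(space["classifier"][transformation])
    return {"transformation": transformation, "classifier": classifier}

config = sample_pipeline(SPACE)
print(config)
```

A Bayesian optimiser such as SMAC3 then replaces the uniform `random.choice` calls with a model-guided policy, but the conditional structure of the space is the same.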

The paper has been submitted to a conference; we will update this page if it is accepted.

🫵 https://pypi.org/project/Auto-Sklong/0.0.2

[v0.0.2] - 2024-07-12 - First Public Release

Added

  • New Search Space: ConfigSpace supported search space via GAMA. Pull request ongoing on the original repository.
  • New Search Method: Bayesian Optimization via SMAC3 is now feasible. Pull request ongoing on the GAMA original repository.
  • Documentation: Comprehensive new documentation with Material for MkDocs, including a detailed tutorial on understanding vectors of waves in longitudinal datasets, a contribution guide, an FAQ section, and complete API references that draw heavily on the Sklong and GAMA documentation to guide users.
  • PyPI Availability: Auto-Sklong is now available on PyPI.
  • Continuous Integration: Integrated unit testing, documentation, and PyPI publishing within the CI pipeline.

To-Do

  • Finalize PRs on GAMA: Ongoing pull requests on GAMA will facilitate alignment between Auto-Sklong and GAMA's latest version. They need to be completed and merged so that we can make compatibility adjustments between both libraries, serving Auto-Sklong's long-term goal of benefiting from future GAMA features.
  • Future Enhancements: Ongoing improvements and new features as they are identified.
  • Documentation Examples: Add examples to the documentation to help users understand how to use the library with Jupyter notebooks.

Note: no tag 0.0.1 will ever be available.

- Python
Published by simonprovost over 1 year ago