https://github.com/sintel-dev/ml-stars

Primitives and Pipelines for Time Series Data

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file (found)
  • .zenodo.json file (found)
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity (low similarity of 19.4% to scientific vocabulary)

Keywords

pipelines primitives time-series
Last synced: 5 months ago

Repository

Primitives and Pipelines for Time Series Data

Basic Info
Statistics
  • Stars: 4
  • Watchers: 2
  • Forks: 1
  • Open Issues: 2
  • Releases: 8
Topics
pipelines primitives time-series
Created over 4 years ago · Last pushed about 1 year ago
Metadata Files
Readme, Changelog, Contributing, License, Authors

README.md

An open source project from the Data to AI Lab (DAI-Lab) at MIT.

ml-stars

Primitives for machine learning and time series.

  • Github: https://github.com/sintel-dev/ml-stars
  • License: MIT
  • Development Status: Pre-Alpha

Overview

This repository contains primitive annotations to be used by the MLBlocks library, as well as the necessary Python code to make some of them fully compatible with the MLBlocks API requirements.

There is also a collection of custom primitives contributed directly to this library, which either combine third party tools or implement new functionalities from scratch.
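
For example, MLBlocks can assemble these primitives into a pipeline by referencing them by name. The following is a minimal sketch, assuming mlblocks is installed; sklearn.preprocessing.MinMaxScaler is the primitive used in the Quickstart below, while sklearn.ensemble.RandomForestRegressor is shown only as a hypothetical second step.

```python3
from mlblocks import MLPipeline

# Assemble a pipeline from primitive annotations referenced by name.
# MinMaxScaler is the primitive used in the Quickstart; the regressor
# is a hypothetical second step added purely for illustration.
pipeline = MLPipeline([
    'sklearn.preprocessing.MinMaxScaler',
    'sklearn.ensemble.RandomForestRegressor',
])

# With training data available, the whole pipeline is fit and used at once:
# pipeline.fit(X_train, y_train)
# predictions = pipeline.predict(X_test)
```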

Installation

Requirements

ml-stars has been developed and tested on Python 3.8, 3.9, 3.10, 3.11, and 3.12.

Although it is not strictly required, using a virtualenv is highly recommended to avoid interfering with other software installed on the system where ml-stars is run.
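
For instance, an isolated environment can be created with Python's built-in venv module (the environment name below is arbitrary):

```bash
python3 -m venv mlstars-env       # create an environment (any name works)
source mlstars-env/bin/activate   # activate it; on Windows: mlstars-env\Scripts\activate
```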

Install with pip

The easiest and recommended way to install ml-stars is using pip:

```bash
pip install ml-stars
```

This will pull and install the latest stable release from PyPI.

If you want to install from source or contribute to the project, please read the Contributing Guide.
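
A typical from-source installation looks like the sketch below; the Contributing Guide remains the authoritative reference for the full development setup:

```bash
git clone https://github.com/sintel-dev/ml-stars.git
cd ml-stars
pip install -e .    # editable install for local development
```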

Quickstart

This section is a short series of tutorials to help you get started with ml-stars.

We will be executing a single primitive for data transformation.

1. Load a Primitive

The first step to run a primitive is to load it.

This will be done using the mlstars.load_primitive function, which will load the indicated primitive as an MLBlock object from MLBlocks.

In this case, we will load the sklearn.preprocessing.MinMaxScaler primitive.

```python3
from mlstars import load_primitive

primitive = load_primitive('sklearn.preprocessing.MinMaxScaler')
```

2. Load some data

The MinMaxScaler is a transformation primitive that scales your data into a given range.

To use this primitive, we generate some synthetic data with numeric values.

```python3
import numpy as np

data = np.array([10, 1, 3, -1, 5, 6, 0, 4, 13, 4]).reshape(-1, 1)
```

The data is an array of integers whose values lie in the range [-1, 13].
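
For reference, min-max scaling maps each value x to (x - min) / (max - min). A quick check of that arithmetic with plain NumPy (not part of the library, just an illustration) looks like this:

```python3
import numpy as np

# The same data as above, scaled by hand with the min-max formula.
# For example, 10 maps to (10 - (-1)) / (13 - (-1)) = 11 / 14 ≈ 0.7857.
data = np.array([10, 1, 3, -1, 5, 6, 0, 4, 13, 4]).reshape(-1, 1)
scaled = (data - data.min()) / (data.max() - data.min())
```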

3. Fit the primitive

In order to run our primitive, we first need to fit it.

This is the process where the primitive analyzes the data to detect its original range.

This is done by calling its fit method and passing the data as X.

```python3
primitive.fit(X=data)
```

4. Produce results

Once the primitive is fit, we can process the data by calling the produce method of the primitive instance, again passing the data as X.

```python3
transformed = primitive.produce(X=data)
transformed
```

After this is done, we can see how the transformed data contains the transformed values:

```
array([[0.78571429],
       [0.14285714],
       [0.28571429],
       [0.        ],
       [0.42857143],
       [0.5       ],
       [0.07142857],
       [0.35714286],
       [1.        ],
       [0.35714286]])
```

The data is now in the [0, 1] range.
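
As a recap, the whole Quickstart fits in a few lines using only the calls shown above:

```python3
import numpy as np
from mlstars import load_primitive

# Load, fit, and apply the MinMaxScaler primitive end to end.
data = np.array([10, 1, 3, -1, 5, 6, 0, 4, 13, 4]).reshape(-1, 1)

primitive = load_primitive('sklearn.preprocessing.MinMaxScaler')
primitive.fit(X=data)
transformed = primitive.produce(X=data)

print(transformed)  # values now scaled into the [0, 1] range
```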

What's Next?

Documentation

Owner

  • Name: The Signal Intelligence Project
  • Login: sintel-dev
  • Kind: organization
  • Email: dai-lab@mit.edu

Systems and tools to design, develop and deploy AI applications on top of signals.

GitHub Events

Total
  • Create event: 5
  • Release event: 3
  • Issues event: 4
  • Delete event: 2
  • Issue comment event: 1
  • Push event: 13
  • Pull request event: 4
Last Year
  • Create event: 5
  • Release event: 3
  • Issues event: 4
  • Delete event: 2
  • Issue comment event: 1
  • Push event: 13
  • Pull request event: 4

Committers

Last synced: about 2 years ago

All Time
  • Total Commits: 38
  • Total Committers: 1
  • Avg Commits per committer: 38.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 35
  • Committers: 1
  • Avg Commits per committer: 35.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Sarah Alnegheimish s****h@g****m 38

Issues and Pull Requests

Last synced: 7 months ago

All Time
  • Total issues: 4
  • Total pull requests: 13
  • Average time to close issues: 2 days
  • Average time to close pull requests: 4 days
  • Total issue authors: 2
  • Total pull request authors: 2
  • Average comments per issue: 0.25
  • Average comments per pull request: 0.0
  • Merged pull requests: 12
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 2
  • Pull requests: 2
  • Average time to close issues: 3 days
  • Average time to close pull requests: about 12 hours
  • Issue authors: 1
  • Pull request authors: 1
  • Average comments per issue: 0.5
  • Average comments per pull request: 0.0
  • Merged pull requests: 2
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • sarahmish (3)
  • graceyesong (1)
Pull Request Authors
  • sarahmish (14)
  • boom90lb (1)
Top Labels
Issue Labels
enhancement (1)
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi: 1,751 last month
  • Total dependent packages: 2
  • Total dependent repositories: 1
  • Total versions: 19
  • Total maintainers: 3
pypi.org: ml-stars

Primitives and Pipelines for Time Series Data.

  • Versions: 19
  • Dependent Packages: 2
  • Dependent Repositories: 1
  • Downloads: 1,751 Last month
Rankings
Dependent packages count: 3.2%
Downloads: 5.8%
Average: 17.1%
Dependent repos count: 21.6%
Stargazers count: 25.1%
Forks count: 29.8%
Maintainers (3)
Last synced: 6 months ago

Dependencies

.github/workflows/docs.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v1 composite
  • peaceiris/actions-gh-pages v3 composite
.github/workflows/tests.yml actions
  • actions/checkout v1 composite
  • actions/setup-python v2 composite
setup.py pypi
  • Keras >=2.4,<2.5
  • fix *
  • mlblocks >=0.4,<0.6
  • numpy <1.21.0,>=1.16.0
  • pandas >=1,<2
  • protobuf <4
  • scikit-learn >=0.21
  • scipy >=1.1.0,<2
  • statsmodels >=0.9.0,<0.13
  • tensorflow >=2,<2.5
  • xgboost >=0.72.1,<1