c-gb

Condensed-Gradient Boosting

https://github.com/gaa-uam/c-gb

Science Score: 67.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 3 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.2%) to scientific vocabulary

Keywords

gradient-boosting multi-class-classification multi-output-regression newton-raphson
Last synced: 6 months ago

Repository

Condensed-Gradient Boosting

Basic Info
Statistics
  • Stars: 3
  • Watchers: 5
  • Forks: 0
  • Open Issues: 0
  • Releases: 3
Topics
gradient-boosting multi-class-classification multi-output-regression newton-raphson
Created about 3 years ago · Last pushed over 1 year ago
Metadata Files
Readme · License · Citation

README.md

Condensed-Gradient Boosting (C-GB)

Introduction

Gradient Boosting is a machine learning method for classification and regression problems. In the following, we present Condensed-Gradient Boosting (C-GB), a model that handles multi-class classification and multi-output regression with high accuracy and speed.

Usage

To train the C-GB model for either multi-class classification or multi-output regression, first install the package with pip from a local clone of the repository:

pip install .

After importing the class, define the model with your own hyperparameters or use the default values. The model runs on both Windows and Linux.
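
As a quick illustration, the sketch below trains a C-GB classifier on a synthetic multi-class dataset in a scikit-learn style. The import path (cgb), the class name (cgb_clf), and the hyperparameter names are assumptions made for illustration only; the wiki documents the actual API.

# Minimal sketch: training C-GB on a synthetic multi-class dataset.
# NOTE: the import path (`cgb`), the class name (`cgb_clf`), and the
# hyperparameter names below are assumptions for illustration;
# consult the project wiki for the actual API.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

from cgb import cgb_clf  # assumed import; adjust to the installed package

# Synthetic 3-class problem
X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Define the model with explicit hyperparameters (or rely on the defaults)
model = cgb_clf(max_depth=5, learning_rate=0.1, n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("Test accuracy:", model.score(X_test, y_test))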

For more examples, plots, and related code, please refer to C_GB-EX.

The wiki page describes the implementation of the algorithm for the two problems (classification and regression).
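
For multi-output regression the workflow is analogous. The sketch below assumes a scikit-learn-compatible regressor class; the name cgb_reg and its parameters are assumptions, so check the wiki for the real signature.

# Minimal sketch: multi-output regression with C-GB.
# NOTE: the class name (`cgb_reg`) and its parameters are assumptions for
# illustration; the wiki documents the actual regressor API.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

from cgb import cgb_reg  # assumed import; adjust to the installed package

# Synthetic problem with 3 targets predicted jointly
X, y = make_regression(n_samples=500, n_features=10, n_targets=3,
                       noise=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = cgb_reg(max_depth=5, learning_rate=0.1, n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("Test R^2:", model.score(X_test, y_test))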

Citation

Cite this package as follows:

@article{Emami2024,
  author  = {Seyedsaman Emami and Gonzalo Martínez-Muñoz},
  title   = {Condensed-gradient boosting},
  journal = {International Journal of Machine Learning and Cybernetics},
  year    = {2024},
  volume  = {},
  number  = {},
  pages   = {},
  doi     = {10.1007/s13042-024-02279-0},
  url     = {https://doi.org/10.1007/s13042-024-02279-0},
  issn    = {1868-808X}
}

Key members of C-GB

  • Version: 0.0.5
  • Updated: 09.Jul.2023
  • Date-released: 01.Oct.2021

Related links

  • wiki: an introduction to the model along with complete examples, the API, and the hyperparameters.
  • C_GB-EX: code to reproduce the results and additional experiments.
  • For the condensed model and its analysis, refer to our paper.
  • For usage instructions, please refer to the documentation.

Owner

  • Name: Grupo de Aprendizaje Automático - Universidad Autónoma de Madrid
  • Login: GAA-UAM
  • Kind: organization
  • Location: Madrid, Spain

Machine Learning Group at Universidad Autónoma de Madrid

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this package, please cite it as below."
authors:
- family-names: "Emami"
  given-names: "Seyedsaman"
  orcid: "https://orcid.org/0000-0002-6306-1180"
- family-names: "Martínez-Muñoz"
  given-names: "Gonzalo"
  orcid: "https://orcid.org/0000-0002-6125-6056"
title: "Condensed-Gradient Boosting"
date-released: 2024-07-23
url: "https://link.springer.com/article/10.1007/s13042-024-02279-0"
