gb-dnnr

Deep Learning for Multi-Output Regression using Gradient Boosting

https://github.com/gaa-uam/gb-dnnr

Science Score: 57.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.2%) to scientific vocabulary

Keywords

deep-neural-networks gradient-boosting multi-output-regression
Last synced: 6 months ago

Repository

Deep Learning for Multi-Output Regression using Gradient Boosting

Basic Info
Statistics
  • Stars: 4
  • Watchers: 5
  • Forks: 0
  • Open Issues: 0
  • Releases: 1
Topics
deep-neural-networks gradient-boosting multi-output-regression
Created about 2 years ago · Last pushed about 2 years ago
Metadata Files
  • Readme
  • License
  • Citation

README.md

GB-DNNR

Gradient Boosted Deep Neural Network Regression.

GB-DNNR is a Python library for Gradient Boosted Deep Neural Network Regression: a gradient boosting method in which the base learners are deep neural networks, designed for multi-output regression.

Citing

If you use GB-DNNR in your research, please cite the following reference:

@article{10415168,
  author  = {Emami, Seyedsaman and Martínez-Muñoz, Gonzalo},
  journal = {IEEE Access},
  title   = {Deep Learning for Multi-Output Regression Using Gradient Boosting},
  year    = {2024},
  volume  = {12},
  pages   = {17760-17772},
  doi     = {10.1109/ACCESS.2024.3359115}
}

Alternatively, use the bib file provided in the repository.

Key members

Requirements

This package depends on the following libraries (Python 3.x):

  • NumPy
  • TensorFlow 2.13.0
  • Keras
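The core idea behind the package, as the title suggests, is gradient boosting with deep neural networks as base learners for multi-output regression: each stage fits a learner to the current residuals of all outputs jointly, and predictions accumulate as a shrunken sum of stage outputs. The sketch below illustrates that idea only; it is NOT the GB-DNNR API, and it substitutes linear least-squares base learners for deep networks to stay self-contained. All function names here are hypothetical.

```python
import numpy as np

def fit_gb_multioutput(X, Y, n_stages=50, lr=0.1):
    """Boost least-squares base learners on multi-output residuals
    (squared loss, so the pseudo-residuals are just Y - F)."""
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])          # add a bias column
    init = Y.mean(axis=0)                         # initial constant model
    F = np.tile(init, (n, 1))                     # current ensemble prediction
    stages = []
    for _ in range(n_stages):
        R = Y - F                                 # residuals for ALL outputs
        W, *_ = np.linalg.lstsq(Xb, R, rcond=None)  # one joint base learner
        stages.append(W)
        F = F + lr * Xb @ W                       # shrunken additive update
    return init, stages

def predict_gb(X, init, stages, lr=0.1):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    F = np.tile(init, (X.shape[0], 1))
    for W in stages:
        F = F + lr * Xb @ W
    return F

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# Two correlated regression targets, handled jointly at every stage.
Y = np.column_stack([X @ [1.0, -2.0, 0.5], X @ [0.5, 1.0, -1.0]])
init, stages = fit_gb_multioutput(X, Y)
mse = np.mean((predict_gb(X, init, stages) - Y) ** 2)
print(mse)
```

In GB-DNNR the per-stage learner would be a Keras network rather than a linear map, but the residual-fitting loop and the shrinkage (learning-rate) update follow the same pattern.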

Keywords

Gradient Boosting, Deep Neural Network, Multi-output regression

License

The package is licensed under the GNU Lesser General Public License v2.1.

Documentation

For more information and examples, please refer to the Wiki.

Development

Our latest algorithm is available on the main branch of the repository.

Contributions

Contributions to the GB-DNNR project are welcome. Feel free to fork the repository, make changes, and submit pull requests.

Date-released

21.12.2023

Date-updated

21.12.2023

Version

0.0.1

Owner

  • Name: Grupo de Aprendizaje Automático - Universidad Autónoma de Madrid
  • Login: GAA-UAM
  • Kind: organization
  • Location: Madrid, Spain

Machine Learning Group at Universidad Autónoma de Madrid

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this package, please cite it as below."
authors:
- family-names: "Emami"
  given-names: "Seyedsaman"
  orcid: "https://orcid.org/0000-0002-6306-1180"
- family-names: "Martínez-Muñoz"
  given-names: "Gonzalo"
  orcid: "https://orcid.org/0000-0002-6125-6056"
title: "Gradient Boosted - Deep Neural Network Regression (GBDNNR)."
version: 0.0.1
date-released: 2023-12-21
Repository: "https://github.com/GAA-UAM/GB-DNNR"
Paper: "Deep Learning for Multi-Output Regression using Gradient Boosting"
doi: "10.1109/ACCESS.2024.3359115"

GitHub Events

Total
  • Watch event: 3
  • Fork event: 1
Last Year
  • Watch event: 3
  • Fork event: 1