https://github.com/amilworks/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Flink and DataFlow
Science Score: 10.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 16.1%, to scientific vocabulary)
Last synced: 6 months ago
Repository
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Flink and DataFlow
Basic Info
- Host: GitHub
- Owner: amilworks
- License: apache-2.0
- Default Branch: master
- Homepage: https://xgboost.ai/
- Size: 14.1 MB
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Fork of dmlc/xgboost
Created almost 6 years ago
Last pushed almost 6 years ago
https://github.com/amilworks/xgboost/blob/master/
Owner
- Name: Amil Khan
- Login: amilworks
- Kind: user
- Location: UCSB
- Company: UCSB Electrical & Computer Engineering
- Website: amilworks.github.io
- Repositories: 2
- Profile: https://github.com/amilworks
PhD student in Electrical & Computer Engineering @ucsb, Lead Engineer @ BisQue
eXtreme Gradient Boosting
===========
[](https://xgboost-ci.net/blue/organizations/jenkins/xgboost/activity)
[](https://travis-ci.org/dmlc/xgboost)
[](https://ci.appveyor.com/project/tqchen/xgboost)
[](https://xgboost.readthedocs.org)
[](./LICENSE)
[](http://cran.r-project.org/web/packages/xgboost)
[](https://pypi.python.org/pypi/xgboost/)
[](https://optuna.org)
[Community](https://xgboost.ai/community) |
[Documentation](https://xgboost.readthedocs.org) |
[Resources](demo/README.md) |
[Contributors](CONTRIBUTORS.md) |
[Release Notes](NEWS.md)
XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way.
The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems with billions of examples and beyond.
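A rough, minimal sketch of what training with the Python package can look like; the synthetic data, parameter values, and round count below are illustrative assumptions, not taken from this repository:

```python
import numpy as np
import xgboost as xgb

# Synthetic binary-classification data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# DMatrix is XGBoost's internal data container.
dtrain = xgb.DMatrix(X, label=y)

# Train a small gradient-boosted tree ensemble.
params = {"objective": "binary:logistic", "max_depth": 3, "eta": 0.1}
booster = xgb.train(params, dtrain, num_boost_round=50)

# Predicted probabilities for the first few training rows.
print(booster.predict(dtrain)[:5])
```

Models saved with `booster.save_model(...)` can generally be loaded from the other language bindings, which is part of what the portability claim above refers to.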
License
-------
© Contributors, 2019. Licensed under an [Apache-2](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.
Contribute to XGBoost
---------------------
XGBoost has been developed and used by a group of active community members. Your help is very valuable in making the package better for everyone.
Check out the [Community Page](https://xgboost.ai/community).
Reference
---------
- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](http://arxiv.org/abs/1603.02754). In 22nd SIGKDD Conference on Knowledge Discovery and Data Mining, 2016. (The paper's regularized objective is sketched just below.)
- XGBoost originates from a research project at the University of Washington.
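For orientation, the regularized learning objective introduced in the paper cited above can be sketched as follows (notation paraphrased from Chen & Guestrin, 2016, not reproduced from this repository):

```latex
\mathcal{L}(\phi) = \sum_{i} l(\hat{y}_i, y_i) + \sum_{k} \Omega(f_k),
\qquad \Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^{2}
```

where $l$ is a differentiable convex loss, each $f_k$ is a regression tree, $T$ is its number of leaves, $w$ its vector of leaf weights, and $\gamma$, $\lambda$ are regularization parameters.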
Sponsors
--------
Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).
## Open Source Collective sponsors
[](#backers) [](#sponsors)
### Sponsors
[[Become a sponsor](https://opencollective.com/xgboost#sponsor)]
### Backers
[[Become a backer](https://opencollective.com/xgboost#backer)]
## Other sponsors
The sponsors in this list are donating cloud hours in lieu of cash donation.