https://github.com/amirabbasasadi/babai
C++ Optimization Library
Science Score: 26.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file: not found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references: not found
- ○ Academic publication links: not found
- ○ Committers with academic emails: not found
- ○ Institutional organization owner: not found
- ○ JOSS paper metadata: not found
- ○ Scientific vocabulary similarity: low similarity (12.6%) to scientific vocabulary
Repository
C++ Optimization Library
Basic Info
- Host: GitHub
- Owner: amirabbasasadi
- Language: C++
- Default Branch: master
- Size: 2.16 MB
Statistics
- Stars: 7
- Watchers: 0
- Forks: 0
- Open Issues: 1
- Releases: 0
Topics
Metadata Files
README.md
Babai
The library has not been released yet and is under development. Contributions are welcome.
C++ Optimization Library (Version 0.1)
Babai (meaning "sheep") is a C++ optimization library based on Eigen and GSL.
Features
- Solving Unconstrained Optimization Problems
- [to do] Multi-Objective Optimization Problems
- [to do] Constrained Optimization Problems
- [to do] Stochastic Optimization Problems
How to use Babai
Babai requirements:
- A C++ compiler that supports C++17 (GCC, ...)
- CMake
- GNU Scientific Library
To use Babai, you only need to include include/Babai/babai.hpp; it pulls in all problem types and optimizers. When building your program, don't forget to link gsl and gslcblas. You can also use the -O3 -march=native flags to improve performance. This is an example using CMake:
```cmake
cmake_minimum_required(VERSION 3.0)
project(BabaiTest)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-O3 -march=native")
include_directories(include)
add_executable(app source.cpp)
target_link_libraries(app gsl gslcblas)
```
Gradient-Free Solvers
Adaptive Particle Swarm Optimization
The APSO solver was implemented based on this paper:
- Zhan, Z. H., Zhang, J., Li, Y., & Chung, H. S. H. (2009). Adaptive particle swarm optimization. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 39(6), 1362-1381.
Parameter Adaptation and Elitist Learning are implemented.
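To illustrate the parameter adaptation the paper describes, here is a minimal self-contained sketch of APSO's adaptive inertia-weight mapping. The function name is illustrative only, not part of Babai's API; the sigmoid constants (1.5 and 2.6) are the ones given by Zhan et al. (2009).

```cpp
#include <cassert>
#include <cmath>

// Adaptive inertia weight from Zhan et al. (2009): maps the evolutionary
// factor f in [0, 1] to an inertia weight in roughly [0.4, 0.9] via a
// sigmoid, so the swarm explores more (large w) when particles are spread
// out (large f) and exploits more (small w) once they have converged.
double adaptive_inertia_weight(double f) {
    return 1.0 / (1.0 + 1.5 * std::exp(-2.6 * f));
}
```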
Examples
Minimizing the Euclidean norm of a 100-dimensional vector:
```cpp
#include <iostream>
// include main Babai header
#include "Babai/babai.hpp"

int main() {
    // create an optimization problem
    auto p = new babai::problem();
    // set the number of objective function variables
    p->dimension(100);
    // set lower and upper bounds for the variables;
    // if the bounds are not the same for all variables you can pass a vector,
    // for example p->lowerbound(v) where v is an Eigen::RowVectorXd
    p->lowerbound(-10.0)->upper_bound(10.0);
    // define the objective using either the minimize or maximize method;
    // the objective can be a lambda whose input is a reference to an
    // Eigen::RowVectorXd. You can use all Eigen vector operations, or simply
    // access the elements and define your own operation
    p->minimize([](Eigen::RowVectorXd &v) { return v.norm(); });

    // create an instance of the Adaptive Particle Swarm optimizer
    auto pso = new babai::PSO();
    // set the number of particles and attach the problem
    pso->npop(20)->problem(p);
    // trace and control iterations: this callback runs every iteration,
    // and its input has the same type as the pso variable
    // (i.e. a pointer to the optimizer)
    pso->iterate([](babai::PSO *trace) {
        // through the trace you can access all parameters of the solver
        std::cout << "step : " << trace->step() << " | "
                  << "loss : " << trace->best() << " | "
                  << "objective evaluations : " << trace->nfe() << std::endl;
        // continue until convergence
        if (trace->best() < 0.01)
            trace->stop(); // stop iterations
    });

    // print the best found position
    std::cout << "best found position : " << std::endl;
    std::cout << pso->best_position() << std::endl;
    return 0;
}
```
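To make clear what the optimizer above is doing under the hood, here is a self-contained sketch of a textbook global-best particle swarm loop minimizing the same norm objective. This is a generic PSO, not Babai's implementation; all names and the fixed coefficients (w = 0.7, c1 = c2 = 1.5) are illustrative.

```cpp
#include <cmath>
#include <random>
#include <vector>

// Generic global-best PSO minimizing f(x) = ||x|| over [-10, 10]^dim.
// Each particle keeps a velocity, a position, and its personal best;
// velocities are pulled toward the personal best and the global best.
double pso_minimize_norm(int dim, int npop, int steps, unsigned seed = 42) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> pos(-10.0, 10.0), u(0.0, 1.0);
    const double w = 0.7, c1 = 1.5, c2 = 1.5; // inertia, cognitive, social

    auto norm = [](const std::vector<double> &x) {
        double s = 0.0;
        for (double xi : x) s += xi * xi;
        return std::sqrt(s);
    };

    std::vector<std::vector<double>> x(npop, std::vector<double>(dim));
    std::vector<std::vector<double>> v(npop, std::vector<double>(dim, 0.0));
    for (auto &xi : x)
        for (double &c : xi) c = pos(rng);
    auto pbest = x;                       // personal best positions
    std::vector<double> pbest_val(npop);  // personal best objective values
    int g = 0;                            // index of the global best particle
    for (int i = 0; i < npop; ++i) {
        pbest_val[i] = norm(x[i]);
        if (pbest_val[i] < pbest_val[g]) g = i;
    }
    for (int t = 0; t < steps; ++t) {
        for (int i = 0; i < npop; ++i) {
            for (int d = 0; d < dim; ++d) {
                v[i][d] = w * v[i][d]
                        + c1 * u(rng) * (pbest[i][d] - x[i][d])
                        + c2 * u(rng) * (pbest[g][d] - x[i][d]);
                x[i][d] += v[i][d];
            }
            double f = norm(x[i]);
            if (f < pbest_val[i]) { pbest[i] = x[i]; pbest_val[i] = f; }
            if (pbest_val[i] < pbest_val[g]) g = i;
        }
    }
    return pbest_val[g]; // best objective value found
}
```

APSO extends this loop by adapting w, c1, and c2 each iteration based on the swarm's spread, which is what the `trace` parameters in the next section expose.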
Parameters
The parameters and methods accessible inside the trace callback are as follows:
- stop() stops optimizer iterations
- step() returns number of performed iterations
- best() returns best objective function value
- nfe() returns number of the objective function evaluations
- npop() returns number of particles
- best_position() returns best found position as a vector
- best_local_positions() returns best local position for all particles as a matrix
- positions() returns the positions of all particles as a matrix
- velocity() returns the velocity of all particles as a matrix
For an explanation of the parameters listed below, refer to the APSO paper:
- inertia_weight() returns the inertia weight
- self_cognition() returns the self-cognition coefficient
- social_influence() returns the social-influence coefficient
- evolutionary_factor() returns the evolutionary factor
- elitist_learning_rate() returns the elitist learning rate
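The evolutionary factor returned by the trace can be sketched from the particle positions. The function below follows the definition in Zhan et al. (2009): the function name and signature are illustrative, not Babai's API.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Evolutionary factor from Zhan et al. (2009): compute each particle's mean
// Euclidean distance to all other particles; with d_g the mean distance of
// the globally best particle, f = (d_g - d_min) / (d_max - d_min) in [0, 1].
// A small f means the swarm has converged around the best particle.
double evolutionary_factor(const std::vector<std::vector<double>> &pos, int best) {
    const int n = static_cast<int>(pos.size());
    std::vector<double> d(n, 0.0);
    for (int i = 0; i < n; ++i) {
        for (int j = 0; j < n; ++j) {
            if (i == j) continue;
            double s = 0.0;
            for (size_t k = 0; k < pos[i].size(); ++k) {
                double diff = pos[i][k] - pos[j][k];
                s += diff * diff;
            }
            d[i] += std::sqrt(s);
        }
        d[i] /= (n - 1);
    }
    double dmin = d[0], dmax = d[0];
    for (double di : d) { dmin = std::min(dmin, di); dmax = std::max(dmax, di); }
    if (dmax == dmin) return 0.0; // all particles coincide
    return (d[best] - dmin) / (dmax - dmin);
}
```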
Gradient-Based Solvers
[to do]
Developers
- Amirabbas Asadi (amir137825@gmail.com)
References
- Zhan, Z. H., Zhang, J., Li, Y., & Chung, H. S. H. (2009). Adaptive particle swarm optimization. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 39(6), 1362-1381.
Owner
- Name: Amirabbas Asadi
- Login: amirabbasasadi
- Kind: user
- Location: Iran
- Website: https://www.linkedin.com/in/amirabbas-asadi/
- Repositories: 5
- Profile: https://github.com/amirabbasasadi
Independent AI Researcher
GitHub Events
Total
Last Year
Committers
Last synced: 8 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Amirabbas Asadi | a****5@g****m | 6 |
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 1
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 1
- Total pull request authors: 0
- Average comments per issue: 0.0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- GenieTim (1)