Recent Releases of torch_activation
torch_activation - TAC Release 0.3.0
Preface: First release after two years, phew! v0.2.1 was left idle when my final exams kicked in, and the project stayed dormant until recently. The infrastructure was also quite outdated, so no releases went out despite some active commits last year. This update significantly bumps the original 18 activation functions to 308 (some untested), largely thanks to the survey "Three Decades of Activations: A Comprehensive Survey of 400 Activation Functions for Neural Networks". Having a single survey to reference makes things much easier and reduces the need for scattered definitions. However, there are still around 100 functions and some "goofy" stuff left to implement. An ambitious benchmarking project is also in the works, so stay tuned!
Key Changes:
- Major Restructuring: The library has been completely restructured for better organization and maintainability. Activation functions are now grouped into files by family, residing in either the `classical` or `adaptive` directory.
- Better CI: Testing has been completely remade, versioning is set up properly, and the GitHub workflows now run automated tests and handle publication.
Function Families:
Adaptive:
`abu.py`, `faaf.py`, `laaf.py`, `melu.py`, `msrf.py`, `relu.py`, `s_shaped.py`, `sigmoid_weighted.py`, `sigmoid.py`
Classical:
`caf.py`, `glu.py`, `layer.py`, `maxsig.py`, `relu.py`, `sigmoid_family.py`, `sigmoid_weighted.py`, `softmax.py`, `squared.py`, `other.py`. The `other.py` file contains functions mentioned in the survey that have a naming convention of `x.x` instead of the more common `x.x.x`.
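To give a flavor of what lives in these family files, here is a minimal plain-PyTorch sketch of Squared ReLU (the kind of function that would belong to the `squared.py` family). This is illustrative only and not the library's actual implementation; the class name and module layout here are assumptions.

```python
import torch
import torch.nn as nn


class SquaredReLU(nn.Module):
    """Squared ReLU: f(x) = ReLU(x)^2. A simple illustrative sketch,
    not torch_activation's actual class."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Square only the positive part; negatives map to zero.
        return torch.relu(x).square()


x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])
print(SquaredReLU()(x))  # tensor([0., 0., 0., 1., 9.])
```

For the real class names, signatures, and defaults, refer to the library documentation.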
Important Notes:
- Extensive Update: Due to the sheer size of this update, please refer to the documentation for a complete and up-to-date list of all implemented activation functions.
- Testing: Some functions are currently untested. We are actively working on comprehensive testing and benchmarking.
- Contribution: Contributions are welcome! If you find any issues or have suggestions, please open an issue or submit a pull request.
Documentation:
Thank you for your support and patience!
- Python
Published by hdmquan about 1 year ago
torch_activation - Torch Activations v0.2.1 Release
Add Minsin, SineReLU, and VLU activations
- Python
Published by hdmquan over 1 year ago
torch_activation - Torch Activations v0.2.0 Release
Restructure and add 3 more activation functions:
- SlReLU
- S-RReLU
- SoftsignRReLU
- Python
Published by hdmquan over 1 year ago
torch_activation - v0.2.0 Release
Fix:
- Corrected an inheritance issue in NormLinComb.
- Renamed `activation_list` to `activations` for LinComb and NormLinComb.
- Fixed the `n` parameter of ReLUN.
Add:
- Added unit tests to ensure the reliability of the code. Previous versions did not include a `/test` folder.
- Included a new logo for the project. Special thanks to Alissa Nguyen for creating the logo.
Change:
- Improved documentation by fixing various formatting errors. The documentation is now more accurate and easier to understand.
This is also the first stable version for torch_activation.
- Python
Published by alan191006 almost 3 years ago
torch_activation - Torch-activation v0.1.1 Release
Add StarReLU activation
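StarReLU comes from the "MetaFormer Baselines" paper (Yu et al., 2022), which defines it as f(x) = s · ReLU(x)² + b with learnable scale and bias. Below is a minimal plain-PyTorch sketch under that definition; the default constants follow the paper, and the class signature here is an assumption, not torch_activation's actual API.

```python
import torch
import torch.nn as nn


class StarReLU(nn.Module):
    """StarReLU: f(x) = s * ReLU(x)^2 + b (Yu et al., "MetaFormer Baselines").
    Illustrative sketch; the library's own defaults and flags may differ."""

    def __init__(self, scale: float = 0.8944, bias: float = -0.4472,
                 learnable: bool = True):
        super().__init__()
        # Scale and bias are (optionally learnable) scalar parameters.
        self.scale = nn.Parameter(torch.tensor(scale), requires_grad=learnable)
        self.bias = nn.Parameter(torch.tensor(bias), requires_grad=learnable)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * torch.relu(x).square() + self.bias


x = torch.tensor([-1.0, 0.0, 2.0])
print(StarReLU(scale=1.0, bias=0.0, learnable=False)(x))  # tensor([0., 0., 4.])
```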
- Python
Published by alan191006 almost 3 years ago
torch_activation - Torch-activation v0.0.1 Release
Added new functions:
- ReGLU
- GeGLU
- SeGLU
- SwiGLU
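These four are the GLU variants from Shazeer's "GLU Variants Improve Transformer" (2020): each gates one linear projection of the input with an activation applied to another. As an illustration, here is a minimal plain-PyTorch sketch of SwiGLU under that definition; the class name and constructor arguments are assumptions, not the library's actual signatures.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwiGLU(nn.Module):
    """SwiGLU(x) = SiLU(x W) * (x V)  (Shazeer, 2020).
    Illustrative sketch, not torch_activation's actual class."""

    def __init__(self, dim_in: int, dim_out: int):
        super().__init__()
        # Two independent projections: one gated by SiLU, one passed through.
        self.w = nn.Linear(dim_in, dim_out)
        self.v = nn.Linear(dim_in, dim_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.silu(self.w(x)) * self.v(x)


x = torch.randn(4, 16)
print(SwiGLU(16, 32)(x).shape)  # torch.Size([4, 32])
```

ReGLU and GeGLU follow the same pattern with `F.relu` or `F.gelu` in place of `F.silu`.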
- Python
Published by alan191006 almost 3 years ago
torch_activation - Torch-activation v0.0.0 Release
This version marks the initial release of torch-activation.
Added functions:
- ShiLU
- DELU
- CReLU
- GCU
- CosLU
- CoLU
- ReLUN
- SquaredReLU
- ScaledSoftSign
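One well-known function from this initial set is CReLU (Shang et al., 2016), which concatenates the ReLU of the input and of its negation, doubling the feature dimension. A minimal plain-PyTorch sketch under that definition (the class signature is an assumption, not the library's actual API):

```python
import torch
import torch.nn as nn


class CReLU(nn.Module):
    """Concatenated ReLU: concat(ReLU(x), ReLU(-x)) along a feature dim
    (Shang et al., 2016). Illustrative sketch; doubles that dimension."""

    def __init__(self, dim: int = -1):
        super().__init__()
        self.dim = dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive and negative parts are kept as separate channels.
        return torch.cat([torch.relu(x), torch.relu(-x)], dim=self.dim)


x = torch.tensor([[-1.0, 2.0]])
print(CReLU()(x))  # tensor([[0., 2., 1., 0.]])
```

Note the output layer that follows a CReLU must expect twice the input width.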
- Python
Published by alan191006 almost 3 years ago