https://github.com/dmetivie/Lux.jl
Elegant & Performant Scientific Machine Learning in Julia
Science Score: 13.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ✓ DOI references (found 3 DOI references in README)
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 10.9%, to scientific vocabulary)
Last synced: 5 months ago
Repository
Elegant & Performant Scientific Machine Learning in Julia
Basic Info
- Host: GitHub
- Owner: dmetivie
- License: MIT
- Language: Julia
- Default Branch: main
- Homepage: https://lux.csail.mit.edu/
- Size: 17.3 MB
Statistics
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Releases: 0
Fork of LuxDL/Lux.jl
Created over 1 year ago · Last pushed over 1 year ago
https://github.com/dmetivie/Lux.jl/blob/main/
## Installation

```julia
import Pkg
Pkg.add("Lux")
```

## Quickstart

```julia
using Lux, Random, Optimisers, Zygote
# using LuxCUDA, AMDGPU, Metal, oneAPI # Optional packages for GPU support

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Construct the layer
model = Chain(BatchNorm(128), Dense(128, 256, tanh), BatchNorm(256),
    Chain(Dense(256, 1, tanh), Dense(1, 10)))

# Get the device determined by Lux
device = gpu_device()

# Parameter and State Variables
ps, st = Lux.setup(rng, model) .|> device

# Dummy Input
x = rand(rng, Float32, 128, 2) |> device

# Run the model
y, st = Lux.apply(model, x, ps, st)

# Gradients
gs = only(gradient(p -> sum(first(Lux.apply(model, x, p, st))), ps))

# Optimization
st_opt = Optimisers.setup(Optimisers.Adam(0.0001), ps)
st_opt, ps = Optimisers.update(st_opt, ps, gs)
```

## Examples

Look in the [examples](/examples/) directory for self-contained usage examples. The [documentation](https://lux.csail.mit.edu) has examples sorted into proper categories.

## Testing

The full test suite of `Lux.jl` takes a long time to run; here is how to test only a portion of the code.
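The quickstart's `Lux.setup` / `Lux.apply` pattern keeps the model definition immutable while trainable parameters `ps` and non-trainable state `st` are passed around explicitly. Below is a dependency-free toy sketch of this functional style in plain Julia; the `ToyDense`, `setup`, and `apply` names here are illustrative stand-ins, not Lux's actual API:

```julia
# Toy illustration of the explicit-parameter style (no Lux dependency).
# The layer struct only describes structure; `setup` creates parameters
# and state; `apply` is a pure function returning (output, new_state).
struct ToyDense
    in::Int
    out::Int
end

# Returns (ps, st): parameters and (here, empty) state
function setup(l::ToyDense)
    ps = (weight = fill(0.5f0, l.out, l.in), bias = zeros(Float32, l.out))
    st = NamedTuple()
    return ps, st
end

# Forward pass: no hidden mutable fields, everything is an argument
function apply(l::ToyDense, x, ps, st)
    return ps.weight * x .+ ps.bias, st
end

layer = ToyDense(3, 2)
ps, st = setup(layer)
y, st = apply(layer, ones(Float32, 3), ps, st)
# y == Float32[1.5, 1.5]
```

Because `ps` is a plain `NamedTuple`, a gradient with respect to it can be taken and fed to an optimizer without touching the layer object itself, which is the pattern the quickstart's `gradient` and `Optimisers.update` calls rely on.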
For each `@testitem`, there are corresponding `tags`. For example, consider the tests for `SkipConnection`:

```julia
@testitem "SkipConnection" setup=[SharedTestSetup] tags=[:core_layers] begin
    ...
end
```

We can run the group to which `SkipConnection` belongs by testing `core_layers`. To do so, set the `LUX_TEST_GROUP` environment variable:

```shell
export LUX_TEST_GROUP="core_layers"
```

Or directly modify the default test tag in `runtests.jl`:

```julia
# const LUX_TEST_GROUP = lowercase(get(ENV, "LUX_TEST_GROUP", "all"))
const LUX_TEST_GROUP = lowercase(get(ENV, "LUX_TEST_GROUP", "core_layers"))
```

But be sure to restore the default value `"all"` before submitting the code.

Furthermore, if you want to run a specific test based on the name of the testset, you can use [TestEnv.jl](https://github.com/JuliaTesting/TestEnv.jl) as follows. Start by activating the Lux environment and then run:

```julia
using TestEnv; TestEnv.activate(); using ReTestItems;

# Assuming you are in the main directory of Lux
ReTestItems.runtests("tests/"; name = "NAME OF THE TEST")
```

For the `SkipConnection` tests that would be:

```julia
ReTestItems.runtests("tests/"; name = "SkipConnection")
```

## Getting Help

For usage-related questions, please use [GitHub Discussions](https://github.com/LuxDL/Lux.jl/discussions) or the [JuliaLang Discourse (machine learning domain)](https://discourse.julialang.org/c/domain/ml/), which allow questions and answers to be indexed. To report bugs, use [GitHub issues](https://github.com/LuxDL/Lux.jl/issues) or, even better, send in a [pull request](https://github.com/LuxDL/Lux.jl/pulls).
## Citation

If you found this library to be useful in academic work, then please cite:

```bibtex
@software{pal2023lux,
  author    = {Pal, Avik},
  title     = {{Lux: Explicit Parameterization of Deep Neural Networks in Julia}},
  month     = apr,
  year      = 2023,
  note      = {If you use this software, please cite it as below.},
  publisher = {Zenodo},
  version   = {v0.5.0},
  doi       = {10.5281/zenodo.7808904},
  url       = {https://doi.org/10.5281/zenodo.7808904}
}

@thesis{pal2023efficient,
  title  = {{On Efficient Training \& Inference of Neural Differential Equations}},
  author = {Pal, Avik},
  year   = {2023},
  school = {Massachusetts Institute of Technology}
}
```

Also consider starring [our github repo](https://github.com/LuxDL/Lux.jl/).
A Pure Julia Deep Learning Framework designed for Scientific Machine Learning
Owner
- Name: David Métivier
- Login: dmetivie
- Kind: user
- Location: Montpellier, France
- Company: INRAe, MISTEA
- Website: http://www.cmap.polytechnique.fr/~david.metivier/
- Repositories: 5
- Profile: https://github.com/dmetivie
I am a research scientist with a physics background. Now I do statistics to tackle environmental and climate-change problems. Julia enthusiast!