Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 14 DOI reference(s) in README
- ✓ Academic publication links: arxiv.org, sciencedirect.com
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (13.1%) to scientific vocabulary
Repository
A lite implementation of DeepXDE
Basic Info
- Host: GitHub
- Owner: zongzi13545329
- License: apache-2.0
- Language: Python
- Default Branch: main
- Size: 1.98 MB
Statistics
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
A PyTorch Implementation of DeepXDE
Introduction
DeepXDE is a library for scientific machine learning. You can see more details here.
This is a simplified PyTorch implementation that builds on the DeepXDE library. We removed the TensorFlow-specific code to keep the codebase lightweight, and adapted some DeepXDE functionality to our own needs.
This project still involves a substantial amount of work; the remaining tasks are listed below. If you have other suggestions, please feel free to leave a message to contact me:
To do list
- [x] Test and verify that the DeepXDE components we rely on work correctly.
- [x] Add an implementation of the Helmholtz equation (see here).
- [ ] Support more optimization methods to obtain better predictions. Currently planned: Linear-search.
- [x] Add visualization to make it easier to observe differences between models and methods.
- [ ] Implement a wrapper for scipy.optimize so it can be used as a PyTorch Optimizer. This will ease the later migration from TensorFlow to PyTorch.
- [ ] To be continued...
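The planned scipy.optimize wrapper could look roughly like the sketch below. This is an assumption about the design, not the project's actual code: a hypothetical `ScipyOptimizer` class flattens the PyTorch parameters into one vector, lets `scipy.optimize.minimize` drive the search, and supplies gradients from autograd through a `step(closure)` interface similar to `torch.optim.LBFGS`.

```python
import numpy as np
import torch
from scipy.optimize import minimize


class ScipyOptimizer:
    """Hypothetical sketch: drive torch parameters with scipy.optimize.minimize."""

    def __init__(self, params, method="L-BFGS-B"):
        self.params = list(params)
        self.method = method

    def _get_flat(self):
        # Current parameter values as one flat float64 vector.
        return torch.cat(
            [p.detach().reshape(-1) for p in self.params]
        ).numpy().astype(np.float64)

    def _set_flat(self, x):
        # Write a flat vector back into the parameter tensors.
        i = 0
        for p in self.params:
            n = p.numel()
            p.data = torch.from_numpy(x[i:i + n].copy()).reshape(p.shape).to(p.dtype)
            i += n

    def step(self, closure):
        # closure() recomputes and returns the (not yet backwarded) loss tensor.
        def fun(x):
            self._set_flat(x)
            for p in self.params:
                p.grad = None
            loss = closure()
            loss.backward()
            grad = torch.cat(
                [p.grad.reshape(-1) for p in self.params]
            ).numpy().astype(np.float64)
            return float(loss), grad

        result = minimize(fun, self._get_flat(), jac=True, method=self.method)
        self._set_flat(result.x)
        return result.fun


# Usage: fit w to a fixed target by minimizing a quadratic loss.
w = torch.nn.Parameter(torch.zeros(2, dtype=torch.float64))
target = torch.tensor([1.0, -2.0], dtype=torch.float64)
opt = ScipyOptimizer([w])
final_loss = opt.step(lambda: ((w - target) ** 2).sum())
```

The closure-based `step` mirrors how `torch.optim.LBFGS` is called, which would keep such a wrapper drop-in compatible with existing training loops.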
Papers on algorithms
- Solving PDEs and IDEs via PINN [SIAM Rev.], gradient-enhanced PINN (gPINN) [arXiv]
- Solving fPDEs via fPINN [SIAM J. Sci. Comput.]
- Solving stochastic PDEs via NN-arbitrary polynomial chaos (NN-aPC) [J. Comput. Phys.]
- Solving inverse design/topology optimization via PINN with hard constraints (hPINN) [SIAM J. Sci. Comput.]
- Learning nonlinear operators via DeepONet [Nat. Mach. Intell., arXiv], DeepM&Mnet [J. Comput. Phys., J. Comput. Phys.]
- Learning from multi-fidelity data via MFNN [J. Comput. Phys., PNAS]
Features
DeepXDE has implemented many algorithms as shown above and supports many features:
- complex domain geometries without the tyranny of mesh generation. The primitive geometries are interval, triangle, rectangle, polygon, disk, cuboid, and sphere. Other geometries can be constructed as constructive solid geometry (CSG) using three boolean operations: union, difference, and intersection.
- multi-physics, i.e., (time-dependent) coupled PDEs.
- 5 types of boundary conditions (BCs): Dirichlet, Neumann, Robin, periodic, and a general BC, which can be defined on an arbitrary domain or on a point set.
- different neural networks, such as (stacked/unstacked) fully connected neural networks, residual neural networks, and (spatio-temporal) multi-scale Fourier feature networks.
- 6 sampling methods: uniform, pseudorandom, Latin hypercube sampling, Halton sequence, Hammersley sequence, and Sobol sequence. The training points can remain fixed during training or be resampled every given number of iterations.
- conveniently save the model during training, and load a trained model.
- uncertainty quantification using dropout.
- many different (weighted) losses, optimizers, learning rate schedules, metrics, etc.
- callbacks to monitor the internal states and statistics of the model during training, such as early stopping.
- enables the user code to be compact, resembling closely the mathematical formulation.
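The CSG composition of primitive geometries described above can be illustrated with a minimal NumPy sketch. The `Geometry` class and helper constructors here are illustrative stand-ins, not DeepXDE's actual classes: each geometry is reduced to a membership test, and the three boolean operations combine those tests.

```python
import numpy as np

# Minimal sketch of constructive solid geometry (CSG) via membership tests.
# A geometry is just an inside(points) predicate; union, difference, and
# intersection combine predicates with boolean logic.

class Geometry:
    def __init__(self, inside):
        self.inside = inside  # inside(points) -> boolean array

    def __or__(self, other):   # union
        return Geometry(lambda x: self.inside(x) | other.inside(x))

    def __sub__(self, other):  # difference
        return Geometry(lambda x: self.inside(x) & ~other.inside(x))

    def __and__(self, other):  # intersection
        return Geometry(lambda x: self.inside(x) & other.inside(x))


def rectangle(xmin, xmax):
    lo, hi = np.asarray(xmin), np.asarray(xmax)
    return Geometry(lambda x: np.all((x >= lo) & (x <= hi), axis=-1))


def disk(center, radius):
    c = np.asarray(center)
    return Geometry(lambda x: np.sum((x - c) ** 2, axis=-1) <= radius ** 2)


# A rectangle with a circular hole cut out of its center.
domain = rectangle([0, 0], [2, 1]) - disk([1, 0.5], 0.25)
pts = np.array([[1.0, 0.5], [0.1, 0.1], [3.0, 0.0]])
print(domain.inside(pts))  # -> [False  True False]
```

A real implementation additionally needs boundary sampling and distance functions, but the boolean composition above is the core idea that makes meshless geometries composable.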
All the components of DeepXDE are loosely coupled, and thus DeepXDE is well-structured and highly configurable. It is easy to customize DeepXDE to meet new demands.
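The dropout-based uncertainty quantification mentioned above can be sketched in plain PyTorch. The network shape and dropout rate below are illustrative choices, not the project's defaults: Monte Carlo dropout keeps the dropout layers active at prediction time and aggregates repeated stochastic forward passes into a mean and a spread.

```python
import torch

torch.manual_seed(0)

# Illustrative network with a dropout layer; layer sizes are arbitrary.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Dropout(p=0.1),
    torch.nn.Linear(32, 1),
)

net.train()  # keep dropout ON at prediction time (Monte Carlo dropout)
x = torch.linspace(0.0, 1.0, steps=5).reshape(-1, 1)
with torch.no_grad():
    # 200 stochastic forward passes, stacked to shape (200, 5, 1).
    samples = torch.stack([net(x) for _ in range(200)])

mean = samples.mean(dim=0)  # predictive mean
std = samples.std(dim=0)    # spread, a rough proxy for model uncertainty
```

The per-point standard deviation gives a cheap uncertainty estimate without changing the training procedure, which is why dropout-based UQ fits naturally into an existing PINN workflow.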
Installation
This fork requires the PyTorch backend:
- PyTorch (torch)
For development, clone the repository to your local machine and place it alongside your project scripts:
$ git clone [git http url]
Other dependencies
Cite DeepXDE
If you use DeepXDE for academic research, you are encouraged to cite the following paper:
@article{lu2021deepxde,
  author  = {Lu, Lu and Meng, Xuhui and Mao, Zhiping and Karniadakis, George Em},
  title   = {{DeepXDE}: A deep learning library for solving differential equations},
  journal = {SIAM Review},
  volume  = {63},
  number  = {1},
  pages   = {208-228},
  year    = {2021},
  doi     = {10.1137/19M1274067}
}
Also, if you would like your paper to appear here, open an issue in the GitHub "Issues" section.
License
Owner
- Login: zongzi13545329
- Kind: user
- Repositories: 1
- Profile: https://github.com/zongzi13545329
Citation (CITATION.cff)
# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!
cff-version: 1.2.0
title: >-
  DeepXDE: A deep learning library for solving
  differential equations
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Lu
    family-names: Lu
    orcid: 'https://orcid.org/0000-0002-5476-5768'
    affiliation: Massachusetts Institute of Technology
  - given-names: Xuhui
    family-names: Meng
    affiliation: Brown University
  - given-names: Zhiping
    family-names: Mao
    affiliation: Xiamen University
  - given-names: George Em
    family-names: Karniadakis
    orcid: 'https://orcid.org/0000-0002-9713-7120'
    affiliation: Brown University
identifiers:
  - type: doi
    value: 10.1137/19M1274067
repository-code: 'https://github.com/lululxvi/deepxde'
url: 'https://deepxde.readthedocs.io'
license: Apache-2.0
GitHub Events
Total
- Watch event: 1
Last Year
- Watch event: 1
Dependencies
- matplotlib *
- numpy *
- scikit-learn *
- scikit-optimize *
- scipy *
- tensorflow >=2.2.0
- tensorflow-probability *
- torch *