https://github.com/deepskies/deeplenssbi
Strong Lensing parameter inference with SBI
Science Score: 49.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ✓ DOI references: found 4 DOI reference(s) in README
- ✓ Academic publication links: links to zenodo.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (15.0%) to scientific vocabulary
Repository
Strong Lensing parameter inference with SBI
Basic Info
- Host: GitHub
- Owner: deepskies
- License: MIT
- Language: Jupyter Notebook
- Default Branch: main
- Size: 19.8 MB
Statistics
- Stars: 2
- Watchers: 13
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Deep inference of simulated strong lenses in ground-based surveys
Current ground-based cosmological surveys, such as the Dark Energy Survey (DES), are predicted to discover thousands of galaxy-scale strong lenses, while future surveys, such as the Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST), will increase that number by 1-2 orders of magnitude. The large number of strong lenses discoverable in future surveys will make strong lensing a highly competitive and complementary cosmic probe.
To leverage the increased statistical power of the lenses that will be discovered through upcoming surveys, automated lens analysis techniques are necessary. We present two Simulation-Based Inference (SBI) approaches for lens parameter estimation of galaxy-galaxy lenses. We demonstrate the successful application of Neural Posterior Estimation (NPE) to automate the inference of a 12-parameter lens mass model for DES-like ground-based imaging data. We compare our NPE constraints to those of a Bayesian Neural Network (BNN) and find that NPE outperforms the BNN, producing posterior distributions that are, for the most part, both more accurate and more precise; in particular, several source-light model parameters are systematically biased in the BNN implementation.
Here we provide implementations of both the NPE and BNN approaches presented in our paper.
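As a stdlib-only toy illustration of the simulation-based inference idea, the sketch below infers the mean of a Gaussian from a single observation by forward-simulating data and keeping only the parameter draws whose simulations land near the observation (rejection ABC). Note this is a simpler relative of the NPE method used in the paper, not the repository's actual algorithm, and all values here are made up for illustration:

```python
import random
import statistics

# Toy simulation-based inference: infer the mean `theta` of a Gaussian
# from one observation x_obs, without ever writing down a likelihood.
random.seed(0)
x_obs = 1.5
accepted = []
for _ in range(20000):
    theta = random.uniform(-5, 5)       # draw a parameter from a flat prior
    x_sim = random.gauss(theta, 1.0)    # forward-simulate data given theta
    if abs(x_sim - x_obs) < 0.1:        # keep theta if the simulation is close
        accepted.append(theta)

# The accepted draws approximate the posterior; their mean should sit near 1.5.
print(len(accepted), round(statistics.mean(accepted), 2))
```

NPE replaces the accept/reject step with a neural density estimator (a masked autoregressive flow in this repository) trained on the simulated (parameter, image) pairs, which scales far better to 12-dimensional lens models.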
Installation
- Set up an environment. This can be done using conda:
  conda create --name deeplensSBI python=3.9
  conda activate deeplensSBI
- Install the dependencies needed for training the model, provided in requirements.txt:
  pip install --user -r requirements.txt
- Create the training set data using deeplenstronomy, or download the datasets used in this work from Zenodo.
- To train an NPE model, run:
  python src/sbi_runner.py --num_params --hidden_features --num_transforms --out_features --seed
  By default, the script will look for training data in the folder "SBI_dataset". Please modify the file path as necessary.
The arguments of the model are:
| Argument | Description |
| ----------- | ----------- |
| num_params | Which model you are training. Options are '1', '5' and '12'. |
| hidden_features | Sets the number of hidden units used in the MAF model |
| num_transforms | Sets the number of flow transformations used in the MAF model |
| out_features | Sets the number of output features the embedding network outputs |
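The command-line interface above can be mirrored with a short argparse sketch. The flag names follow the table, but the defaults, choices, and parser structure here are illustrative assumptions, not values read from sbi_runner.py:

```python
import argparse

def build_parser():
    # Hypothetical mirror of the sbi_runner.py CLI; defaults are
    # illustrative assumptions, not taken from the actual script.
    p = argparse.ArgumentParser(description="Train an NPE lens model")
    p.add_argument("--num_params", type=int, choices=[1, 5, 12], default=12,
                   help="Which model you are training")
    p.add_argument("--hidden_features", type=int, default=50,
                   help="Number of hidden units in the MAF model")
    p.add_argument("--num_transforms", type=int, default=5,
                   help="Number of flow transformations in the MAF model")
    p.add_argument("--out_features", type=int, default=32,
                   help="Number of output features of the embedding network")
    p.add_argument("--seed", type=int, default=0, help="Random seed")
    return p

args = build_parser().parse_args(["--num_params", "5", "--seed", "42"])
print(args.num_params, args.seed)  # prints "5 42"
```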
The script will output a pickle file that contains the trained neural posterior estimator.
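Loading the resulting pickle and drawing posterior samples follows the usual pickle pattern. The sketch below substitutes a stand-in object for the real sbi posterior (unpickling the real one requires the sbi package installed); the file name in the comment and the sample-shape convention are assumptions:

```python
import io
import pickle
import random

class StandInPosterior:
    """Stand-in for the trained neural posterior estimator; the real
    object produced by the training script comes from the sbi package."""
    def sample(self, shape, x=None):
        # Return shape[0] draws of a 12-parameter lens model,
        # here just Gaussian noise for illustration.
        return [[random.gauss(0.0, 1.0) for _ in range(12)]
                for _ in range(shape[0])]

# In practice, something like:
#   with open("trained_npe.pkl", "rb") as f:  # hypothetical file name
#       posterior = pickle.load(f)
buf = io.BytesIO()
pickle.dump(StandInPosterior(), buf)
buf.seek(0)
posterior = pickle.load(buf)

# Condition on an observed image and draw posterior samples.
x_obs = None  # placeholder for a real observation
samples = posterior.sample((1000,), x=x_obs)
print(len(samples), len(samples[0]))  # prints "1000 12"
```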
- To train a BNN model, see the script in src/12paramBNN.py as an example of how to train a 12-parameter BNN. This script can be adapted for training a 1- or 5-parameter model.
Analysis
Examples of how to use the trained neural posterior estimator can be found in the analysis notebooks in the "Analysis" folder. These notebooks document how to produce the plots that are published in the paper. The files required to run the notebooks can be found in the Zenodo repository for this project: https://zenodo.org/records/13961234.
Owner
- Name: Deep Skies Lab
- Login: deepskies
- Kind: organization
- Email: deepskieslab@gmail.com
- Website: www.deepskieslab.com
- Twitter: deepskieslab
- Repositories: 5
- Profile: https://github.com/deepskies
Building community and making discoveries since 2017
GitHub Events
Total
- Push event: 4
Last Year
- Push event: 4
Dependencies
- Pillow ==10.2.0
- deeplenstronomy ==0.0.2.3
- keras ==2.9.0
- matplotlib ==3.5.1
- numpy ==1.24.4
- pandas ==1.5.3
- sbi ==0.18.0
- scikit-learn ==1.3.2
- scipy ==1.8.0
- seaborn ==0.13.2
- tensorflow ==2.9.1
- tensorflow-probability ==0.17.0
- torch ==1.11.0