https://github.com/charmoniumq/astrophysics-project
Science Score: 10.0%
This score indicates how likely this project is to be science-related, based on various indicators:
- ○ CITATION.cff file
- ○ codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ✓ Academic publication links (links to: arxiv.org)
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 7.8%, to scientific vocabulary)
Repository
Basic Info
- Host: GitHub
- Owner: charmoniumQ
- Language: TeX
- Default Branch: main
- Size: 13.7 MB
Statistics
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Releases: 0
- Created: almost 4 years ago
- Last pushed: over 3 years ago
https://github.com/charmoniumQ/astrophysics-project/blob/main/
# Neural Network Superresolving for Cosmological Simulations

In this repository, I attempt to reproduce the analysis of [Schaurecker et al. 2021][1] on Enzo data (they use Illustris).

[1]: https://arxiv.org/pdf/2111.06393.pdf

# To reproduce

The code `main.py` is intended to be run locally. It sends commands to the remote. You will need to modify it with your site-specific parameters; it should be the only file you need to modify.

To set up the remote machine (it should be capable of running Slurm):

```sh
remote$ # Install Spack on the remote.
remote$ # See my notes in reports/spack_on_cc.md for details on the UIUC Campus Cluster.
remote$ git clone -c feature.manyFiles=true https://github.com/spack/spack.git
remote$ # Copy spack.lock to the remote.
remote$ spack/bin/spack env create main4 spack.lock
remote$ spack/bin/spack env activate main4
remote$ spack/bin/spack concretize
remote$ spack/bin/spack install
remote$ # Copy environment.yaml to the remote.
remote$ spack/bin/spack env activate main4
remote$ conda env create --name main3 --file environment.yaml
remote$ # Ensure that Slurm works.
remote$ sbatch --help
```

To set up the local machine:

```sh
local$ # Install conda.
local$ # Create the conda environment.
local$ conda env create --name main3 --file environment.yaml
```

You will need to configure SSH keys to the remote. Then you should be able to run `main.py`.

`main.py` runs the entire workflow. It is smart about not re-running a step if its output data already exists. It also hashes the input parameters into the filename of the data, so it is unlikely to return stale data. The end result will end up in `output`.
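The parameter-hashing scheme described above can be sketched as follows. This is a minimal illustration of the idea, not the actual code in `main.py`: the function name `cached_output`, the SHA-256/JSON encoding, and the 16-character digest length are all assumptions for the example.

```python
import hashlib
import json
from pathlib import Path


def cached_output(step_name, params, compute, output_dir=Path("output")):
    """Run `compute(params)` only if no output for this exact parameter
    set exists yet. The parameter hash is embedded in the filename, so
    changed parameters produce a new file rather than reusing stale data.
    """
    # Canonicalize the parameters (sorted keys) so logically equal
    # dicts always hash to the same digest.
    digest = hashlib.sha256(
        json.dumps(params, sort_keys=True).encode()
    ).hexdigest()[:16]
    out_path = output_dir / f"{step_name}-{digest}.json"
    if out_path.exists():
        # A result for these parameters already exists; skip the step.
        return json.loads(out_path.read_text())
    result = compute(params)
    output_dir.mkdir(parents=True, exist_ok=True)
    out_path.write_text(json.dumps(result))
    return result
```

Calling `cached_output("train", {"epochs": 10}, run_training)` twice with the same parameters would run the step once and read the cached file the second time, while any change to the parameters yields a different filename and triggers a fresh run.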
Owner
- Name: Sam Grayson
- Login: charmoniumQ
- Kind: user
- Location: /home/bedroom/bed
- Company: University of Illinois at Urbana-Champaign
- Website: https://samgrayson.me
- Twitter: charmoniumQ
- Repositories: 41
- Profile: https://github.com/charmoniumQ
PhD student, on a crusade to improve scientific software