dbpm_southern_ocean
Workflow for third chapter of PhD thesis
Science Score: 57.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 5 DOI reference(s) in README
- ○ Academic publication links
- ○ Committers with academic emails
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (9.7%) to scientific vocabulary
Repository
Workflow for third chapter of PhD thesis
Basic Info
- Host: GitHub
- Owner: lidefi87
- License: apache-2.0
- Language: Jupyter Notebook
- Default Branch: main
- Size: 18.9 MB
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
Dynamic Benthic Pelagic Model (DBPM) calibration - ISIMIP3A protocol
This repository contains all code necessary to process inputs used by DBPM. It has been redesigned to use both Python and R as part of the model workflow. Following protocol ISIMIP3A, this simulation uses inputs from GFDL-MOM6-COBALT2 at two horizontal resolutions: $0.25^{\circ}$ (original) and $1^{\circ}$ (coarsened).
Step 1. Processing DBPM climate inputs at a global scale
- Script `01_processing_dbpm_global_inputs.ipynb` processes the environmental data needed to force the DBPM model at a global scale. GFDL-MOM6-COBALT2 output files are transformed from netCDF to analysis-ready zarr files. Files for the spinup period are also created here.
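The spinup files mentioned above can be thought of as a pre-experiment forcing block built from an early-period climatology. A minimal numpy sketch of that idea (array shapes and the 10-year repeat are illustrative assumptions, not the notebook's actual settings):

```python
import numpy as np

# Monthly forcing for the experiment period: 24 months on a 4x8 grid.
forcing = np.random.rand(24, 4, 8)

# Climatology of the first simulated year (months 0-11), used to
# fabricate a stable pre-experiment "spinup" forcing.
climatology = forcing[:12]

# Repeat that year 10 times to create a 120-month spinup block.
spinup = np.tile(climatology, (10, 1, 1))

# Full forcing = spinup followed by the experiment period.
full = np.concatenate([spinup, forcing], axis=0)
print(full.shape)  # (144, 4, 8)
```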
Step 2. Processing DBPM climate inputs at a regional scale
- Script `02_processing_dbpm_regional_inputs.ipynb` uses the zarr files produced in the previous step to extract data for an area of interest. In this notebook, we concentrate on the Southern Ocean, which was subdivided using three FAO Major Fishing Areas:
  - FAO Major Fishing Area 48 (Atlantic, Antarctic), referred to here as Weddell
  - FAO Major Fishing Area 58 (Indian Ocean, Antarctic and Southern), referred to here as East Antarctica
  - FAO Major Fishing Area 88 (Pacific, Antarctic), referred to here as West Antarctica
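Extracting a region from a global gridded field amounts to masking on latitude and longitude bounds. A minimal numpy sketch (the Weddell bounds below are rough illustrative values, not the official FAO Area 48 polygon):

```python
import numpy as np

# Global monthly field (time, lat, lon) with 1-degree coordinate vectors.
lat = np.linspace(-89.5, 89.5, 180)
lon = np.linspace(-179.5, 179.5, 360)
field = np.random.rand(12, lat.size, lon.size)

# FAO Area 48 (Weddell) is roughly the Atlantic sector south of 50S;
# these bounds are illustrative only.
lat_mask = lat <= -50.0
lon_mask = (lon >= -70.0) & (lon <= 30.0)

# Apply the masks one axis at a time to keep the box rectangular.
weddell = field[:, lat_mask][:, :, lon_mask]
print(weddell.shape)  # (12, 40, 100)
```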
Step 3. Processing DBPM fishing inputs at a regional scale
- Script `03_processing_effort_fishing_inputs.R` processes fishing catch and effort data for the area of interest. It also creates a single file combining fishing and climate data, which has all variables needed to run DBPM within the boundaries of the area of interest.
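Combining fishing and climate data into one table is essentially a join on a shared time key. A minimal pandas sketch (the column names and values are hypothetical, not the script's actual variables):

```python
import pandas as pd

# Hypothetical monthly climate table and yearly fishing-effort table.
climate = pd.DataFrame({
    "year": [1961, 1961, 1962],
    "month": [1, 2, 1],
    "sst": [0.4, 0.1, 0.5],
})
effort = pd.DataFrame({"year": [1961, 1962], "effort": [120.0, 150.0]})

# Joining on year attaches the annual effort to every month of that
# year, yielding one table with all forcing variables together.
merged = climate.merge(effort, on="year", how="left")
print(merged["effort"].tolist())  # [120.0, 120.0, 150.0]
```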
Step 4. Calculating fishing mortality parameters
- Script `04_calculating_dbpm_fishing_params.R` does the following:
  - Estimates fishing mortality parameters (catchability and selectivities for each functional group)
  - Checks and adjusts the `search volume` parameter
  - Creates and saves calibration plots in PDF format

  Plots created in this script can be used to visually inspect the fit of predicted catches against observed (reconstructed) catch data.
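Catchability and selectivity combine into a fishing mortality at size; a common formulation scales catchability by a logistic size-selectivity curve. A minimal sketch of that general idea (function and parameter names are assumptions, not DBPM's actual variables):

```python
import numpy as np

# Log10 body-size grid (grams) standing in for the model's size spectrum.
log_size = np.linspace(-3, 6, 10)

def fishing_mortality(log_size, catchability, l50, slope):
    """Illustrative F-at-size: catchability scaled by a logistic
    selectivity curve centred on l50. The real DBPM parameterisation
    may differ; this only shows the catchability x selectivity shape."""
    selectivity = 1.0 / (1.0 + np.exp(-slope * (log_size - l50)))
    return catchability * selectivity

F = fishing_mortality(log_size, catchability=0.3, l50=1.0, slope=2.0)
print(F[0] < F[-1])  # True: larger sizes are more selected
```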
Step 5. Setting up gridded inputs for spatial DBPM
- Script `05_setup_gridded_DBPM.ipynb` processes all inputs necessary to run the spatial DBPM for the area and time period of interest.
- Script `05a_setup_gridded_params_weddell.ipynb` applies a correction to inputs used to run DBPM in the Weddell Sea at $0.25^{\circ}$. This correction is needed for this region and resolution to deal with numerical instabilities.
- Script `05b_setup_new_effort.ipnyb` prepares data needed to run the gridded DBPM using CCAMLR effort data.
Step 6. Running DBPM spatial model
- Script `06_running_gridded_DBPM.ipynb` uses the inputs prepared in step 5 and runs the spatial DBPM. Model outputs are stored for each timestep included in the input data.
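The run-and-store pattern described above can be sketched as a time-stepping loop that keeps every monthly state. The update rule below is a placeholder dynamic, not DBPM's actual size-spectrum equations:

```python
import numpy as np

def step(biomass, growth=0.05, mortality=0.03):
    """One illustrative monthly update. The real DBPM solves coupled
    benthic-pelagic size-spectrum equations; this stand-in only shows
    the loop structure."""
    return biomass * (1.0 + growth - mortality)

biomass = np.full((4, 8), 10.0)       # initial biomass on a 4x8 grid
outputs = []
for t in range(12):                   # one step per input timestep
    biomass = step(biomass)
    outputs.append(biomass.copy())    # store every timestep, as in step 6

stacked = np.stack(outputs)
print(stacked.shape)  # (12, 4, 8)
```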
Step 7. Calculating catches from gridded DBPM outputs
- Script `07_calculating_catches_gridded_DBPM.py` calculates catches for benthic detritivores and pelagic predators from the gridded DBPM outputs produced in step 6. Catch data are summarised per decade, and maps are created for the last decade of the spinup and of the modelled period (1950 and 2010). Mean yearly catches are calculated for the area of interest from monthly catch estimates to create a time series.
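Reducing monthly catch estimates to mean yearly values is a standard temporal aggregation; a minimal pandas sketch (the catch values are synthetic):

```python
import numpy as np
import pandas as pd

# Hypothetical monthly catch totals for the area of interest, two years.
idx = pd.date_range("1961-01", periods=24, freq="MS")
monthly = pd.Series(np.arange(24, dtype=float), index=idx)

# Mean yearly catch from monthly estimates, as used for the time series.
yearly = monthly.groupby(monthly.index.year).mean()
print(yearly.tolist())  # [5.5, 17.5]
```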
Step 8. Calculating biomass for predators and detritivores
- Script `08_calculating_biomass.ipynb` calculates biomass for predators and detritivores from the DBPM outputs produced in step 6.
Step 9. Plotting outputs
- Script `09_plotting_gridded_DBPM_outputs.ipynb` creates plots and maps of biomass and catches calculated from the DBPM outputs produced in step 6.
Step 10. DBPM output evaluation
- Script `10_evaluating_DBPM_outputs.R` applies the observation range adjusted method to evaluate model performance, as described in Evans & Imran (2024), prior to calculating a number of skill assessment metrics described in Rynne et al. (2025).
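Skill assessment generally means comparing modelled and observed series with summary statistics. The sketch below computes two common metrics, RMSE and Pearson correlation; the specific metric set in Rynne et al. (2025) and the observation range adjustment itself may differ, so this is illustrative only:

```python
import numpy as np

def skill_metrics(obs, mod):
    """Two common skill metrics for model evaluation: root-mean-square
    error and Pearson correlation between observed and modelled values."""
    rmse = float(np.sqrt(np.mean((mod - obs) ** 2)))
    corr = float(np.corrcoef(obs, mod)[0, 1])
    return rmse, corr

# Synthetic observed vs modelled yearly catches.
obs = np.array([1.0, 2.0, 3.0, 4.0])
mod = np.array([1.1, 1.9, 3.2, 3.8])
rmse, corr = skill_metrics(obs, mod)
print(round(rmse, 3))  # 0.158
```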
Step 11. Comparing effort datasets
- Script `11_effort_plots` compares effort data as prescribed by the FishMIP Protocol 3a against CCAMLR data.
Step 12. Plots for publication
- Script `12_supporting_plots` produces plots supporting the manuscript describing the exploration of DBPM outputs within the Southern Ocean.
Running this repository
The scripts in this repository were developed on NCI's Gadi, so the easiest way to run them is to clone this repository to Gadi. However, before you can do this, you will need an NCI account, which is only available to researchers with an email address from an Australian institution. Further down in this document, we explain how to create an NCI account if you do not already have one.
You can also run these scripts on your own computer or a different server, but you will need access to the forcing data (i.e., GFDL-MOM6-COBALT2 outputs and fishing data) to run them. We include information below about how to access these data.
Getting an NCI account
- Create an NCI user account
  - Use your Australian institution's email account when signing up
  - When asked for a project to join, contact the NCI scheme manager at your institution, if possible, to find out which NCI project you should use to sign up for your NCI account. This project will provide you with computing power.
- Join relevant NCI projects
  - Request to join the following NCI projects:
    - vf71 - for access to GFDL-MOM6-COBALT2 outputs in analysis-ready data format
    - xp65 - for the Python conda environment
  - Note that it can take a few business days to get approved as a project member
- Verify that you can log into NCI's Gadi
  - Note that it usually takes more than 30 minutes for your account to be created
  - You are also welcome to follow the instructions to set up SSH keys, but this is optional
Accessing forcing data
Ocean outputs from GFDL-MOM6-COBALT2
The environmental data comes from GFDL-MOM6-COBALT2, which is available at two horizontal resolutions: $0.25^{\circ}$ (original model outputs) and $1^{\circ}$ (coarsened from the original outputs). The original GFDL-MOM6-COBALT2 outputs can be downloaded from the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) Data Portal as netCDF files. However, you can also access GFDL-MOM6-COBALT2 outputs as zarr files from project vf71 at the National Computational Infrastructure (NCI).
Fishing catch data
The fishing catch data came from three sources:
1. 'ISIMIP3a reconstructed fishing activity data (v1.0)' (Novaglio et al. 2024)
2. 'CCAMLR Statistical Bulletin, Vol. 36' (CCAMLR 2024)
3. 'Sea Around Us catch reconstructions' (Pauly et al. 2020)
CCAMLR effort data
The regional effort data used in comparisons came from the CCAMLR Statistical Bulletin, Vol. 36. This data is publicly available and can be downloaded from their website.
Copies of these datasets are also available under project vf71 at the National Computational Infrastructure (NCI).
Owner
- Name: Lidefi87
- Login: lidefi87
- Kind: user
- Location: Hobart, Australia
- Company: IMAS-UTAS
- Twitter: lidefi87
- Repositories: 2
- Profile: https://github.com/lidefi87
PhD candidate at The University of Tasmania's Institute of Marine and Antarctic Studies (IMAS)
Citation (CITATION.cff)
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Fierro-Arcos"
    given-names: "Denisse"
    orcid: "https://orcid.org/0000-0002-5039-6272"
  - family-names: "Novaglio"
    given-names: "Camilla"
    orcid: "https://orcid.org/0000-0003-3681-1377"
  - family-names: "Blanchard"
    given-names: "Julia"
    orcid: "https://orcid.org/0000-0003-0532-4824"
  - family-names: "Heneghan"
    given-names: "Ryan"
    orcid: "https://orcid.org/0000-0001-7626-1248"
  - family-names: "Forestier"
    given-names: "Romain"
  - name: "Fisheries and Marine Ecosystem Model Intercomparison Project (Fish-MIP)"
title: "Dynamic Benthic Pelagic Model (DBPM) calibration - ISIMIP3A protocol"
version: 2.0.0
doi: TBA
date-released: TBA
url: "https://github.com/Benthic-Pelagic-Size-Spectrum-Model/lme_scale_calibration_ISMIP3a"
license: Apache-2.0
GitHub Events
Total
- Push event: 15
- Create event: 3
Last Year
- Push event: 15
- Create event: 3
Issues and Pull Requests
Last synced: 8 months ago
All Time
- Total issues: 0
- Total pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Total issue authors: 0
- Total pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0