Science Score: 75.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 4 DOI reference(s) in README
- ✓ Academic publication links: links to zenodo.org
- ○ Academic email domains
- ✓ Institutional organization owner: organization pnlbwh has institutional domain (pnl.bwh.harvard.edu)
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.3%) to scientific vocabulary
Repository
PNL pipeline for NIFTI images
Basic Info
Statistics
- Stars: 10
- Watchers: 3
- Forks: 15
- Open Issues: 20
- Releases: 7
Metadata Files
docs/README.md

Developed by Tashrif Billah, Sylvain Bouix, and Yogesh Rathi, Brigham and Women's Hospital (Harvard Medical School).
This pipeline is also available as Docker and Singularity containers. See pnlpipe-containers for details.
Table of Contents
- Citation
- Introduction
- Dependencies
- Installation
- Multiprocessing
- Pipeline scripts overview
- Tutorial
- Support
Citation
If this pipeline is useful in your research, please cite as below:
Billah, Tashrif; Bouix, Sylvain; Rathi, Yogesh; NIFTI MRI processing pipeline, https://github.com/pnlbwh/pnlNipype, 2019, DOI: 10.5281/zenodo.3258854
Introduction
pnlNipype is a Python-based framework for processing anatomical (T1, T2) and diffusion weighted images. It comprises some of the PNL's neuroimaging pipelines. A pipeline is a directed acyclic graph (DAG) of dependencies. The following diagram depicts the functionality of the NIFTI pipeline, where each node represents an output and the arrows represent dependencies:

Detailed DAGs are available here.
Dependencies
- dcm2niix
- ANTs >= 2.3.0
- UKFTractography >= 1.0
- FreeSurfer >= 5.3.0
- FSL >= 5.0.11
- Python >= 3.6
Installation
1. Install prerequisites
Installing pnlNipype requires you to install each of the dependencies independently, which gives you more control over the requisite software modules. This independent installation is intended for users with intermediate programming knowledge.
Install the following software (ignore the one(s) you have already):
- Python >= 3.6
- FreeSurfer >= 5.3.0
- FSL >= 5.0.11
Check system architecture
uname -a # check if 32 or 64 bit
Python 3
Download Miniconda Python 3.6 bash installer (32/64-bit based on your environment):
sh Miniconda3-latest-Linux-x86_64.sh -b # -b runs the installer in batch mode, accepting the license agreement
Activate the conda environment:
source ~/miniconda3/bin/activate # should introduce '(base)' in front of each line
FSL
Follow the instructions to download and install FSL.
FreeSurfer
Follow the instructions to download and install FreeSurfer >= 5.3.0.
After installation, you can check the FreeSurfer version by typing freesurfer in the terminal.
Conda environment
git clone --recurse-submodules https://github.com/pnlbwh/pnlNipype.git && cd pnlNipype
# install the python packages required to run pnlNipype
conda create -y -n pnlpipe9 -c conda-forge --override-channels python
conda activate pnlpipe9
pip install -r requirements.txt
T1 and T2 training masks
(Optional) In the past, we used the MABS (Multi-Atlas Brain Segmentation) method to generate T1 and T2 masks, but we now use HD-BET instead. Hence, this step is optional. If you want, you can still 'install' our training masks:
# define PYTHONPATH so that pnlNipype/cmd/install.py can find pnlpipe_software/* installation scripts
export PYTHONPATH=`pwd`
# define PNLPIPE_SOFT as the destination of training masks
export PNLPIPE_SOFT=/path/to/wherever/you/want/
# training data for MABS mask
cmd/install.py trainingDataT1AHCC
cmd/install.py trainingDataT2Masks
unset PYTHONPATH
However, other external software should be built from their respective sources:
- ANTs
You can build ANTs from source. Additionally, you should define ANTSPATH.
Alternatively, you can install ANTs through conda. We have developed a conda package for ANTs==2.3.0, which you can get by:
conda install -c pnlbwh ants
- dcm2niix
The dcm2niix executable creates NIFTI files from DICOM input; the pipeline uses it as a reliable converter. Building dcm2niix is straightforward; follow these instructions to build it.
- UKFTractography
Follow these instructions to download and install UKFTractography.
- tract_querier & whitematteranalysis
We found that the tract_querier and whitematteranalysis dependencies do not quite agree with the simpler pnlNipype dependencies. Hence, you may want to install them in a separate Python environment outside of pnlpipe9. Follow the individual repository instructions to install them. It will be something along the lines of:
git clone http://github.com/demianw/tract_querier.git
cd tract_querier
pip install .
Special care has to be taken so that the tract_querier and whitematteranalysis executables are in your PATH environment variable.
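One way to do that is to prepend the bin/ directory of their dedicated environment to PATH, e.g. in your bashrc. This is a minimal sketch; the path below is a placeholder, not a real install location:

```shell
# Placeholder path: point this at the bin/ of the Python environment
# where tract_querier and whitematteranalysis were pip-installed.
export PATH=/path/to/tq_env/bin:$PATH

# Confirm the entry landed at the front of PATH:
echo "$PATH" | grep -q "^/path/to/tq_env/bin:" && echo "PATH updated"
```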
2. Configure your environment
It is hard to suggest definitively how to configure your environment, but it should be something akin to:
source ~/miniconda3/bin/activate # should introduce '(base)' in front of each line
export FSLDIR=/path/to/fsl/ # setup fsl environment
source $FSLDIR/etc/fslconf/fsl.sh
export PATH=$PATH:$FSLDIR/bin
export FREESURFER_HOME=/path/to/freesurfer # you may specify another directory where FreeSurfer is installed
source $FREESURFER_HOME/SetUpFreeSurfer.sh
# source external software
ANTSDIR=/path/to/ANTs
ANTSPATH=$ANTSDIR/build/ANTS-build/Examples
PATH=$ANTSPATH:$ANTSDIR/Scripts:$PATH # define ANTSPATH and export ANTs scripts in your path
PATH=/path/to/dcm2niix/build/bin:$PATH
PATH=/path/to/ukftractography/build/UKFTractography-build/UKFTractography/bin:$PATH
PATH=/path/to/pnlNipype/exec:$PATH
export ANTSPATH PATH
(If you would like, you may include the above in your bashrc so the environment is set up automatically every time you open a new terminal.)
3. Temporary directory
Both pnlpipe and pnlNipype have centralized control over the various temporary directories created throughout the pipeline. These temporary directories can be large and may clog the default /tmp/ directory. You can define a custom temporary directory with the environment variable PNLPIPE_TMPDIR:
mkdir /path/to/tmp/
export PNLPIPE_TMPDIR=/path/to/tmp/
4. Tests
Upon successful installation, you should be able to see the help message of each script in the pipeline:
cd lib
scripts/atlas.py --help
scripts/fs2dwi.py --help
...
Multiprocessing
Multi-processing is an advanced feature of pnlNipype. The following scripts can utilize Python's multiprocessing capability:
- atlas.py
- pnl_eddy.py
- pnl_epi.py
- wmql.py
- antsApplyTransformsDWI.py
You may set the N_PROC parameter in scripts/util.py to define the default number of processes used across scripts in the pipeline:
N_PROC = '4'
On a Linux machine, you can find the number of processors with the command lscpu:
On-line CPU(s) list: 0-55
You can specify any number not greater than the number of on-line CPU(s). One caveat: other applications on your computer may become sluggish, or you may run into memory errors, due to the heavier computation in the background. If that happens, reduce N_PROC (--nproc) to less than 4.
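For instance, to see how many processors are available before picking a value (nproc is a standard coreutils command; lscpu comes from util-linux):

```shell
# Print the number of online processors; N_PROC should not exceed this.
nproc

# lscpu reports the same information in more detail, if installed:
if command -v lscpu >/dev/null 2>&1; then
    lscpu | grep '^CPU(s):'
fi
```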
Pipeline scripts overview
scripts is a directory of PNL-specific scripts that implement various pipeline steps. These scripts are the successors to the ones in pnlpipe used for NRRD format data. Besides being more robust and up to date with respect to software such as ANTs, they are implemented in Python using the shell scripting library plumbum. Being written in Python means they are easier to understand and modify, and plumbum allows them to be almost as concise as a regular shell script.
You can call any of these scripts directly, e.g.
scripts/align.py -h
Note that the scripts usually call other binaries, such as those in ANTs, FreeSurfer, and FSL. So, make sure you source each of their environments so that individual scripts are able to find them.
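As a quick sanity check before running a script, you can loop over the external binaries it needs and report any that are not on your PATH. The tool names below are illustrative examples; adjust the list to the script you plan to run:

```shell
# Report any required external binaries that are not discoverable.
# The list is illustrative: antsRegistration (ANTs), bet (FSL),
# recon-all (FreeSurfer), and dcm2niix.
for tool in antsRegistration bet recon-all dcm2niix; do
    command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```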
This table summarizes the scripts in pnlNipype/scripts/:
| Category | Script | Function |
|---|---|---|
| General | align.py | axis-aligns and centers an image |
| General | bet_mask.py | masks a 3D/4D MRI using FSL bet |
| General | masking.py | skullstrips by applying a labelmap mask |
| General | maskfilter.py | performs morphological operations on a brain mask |
| General | resample.py | resamples a 3D/4D image |
| DWI | unring.py | Gibbs unringing |
| DWI | antsApplyTransformsDWI.py | applies a transform to a DWI |
| DWI | bse.py | extracts a baseline b0 image |
| DWI | pnl_epi.py | corrects EPI distortion via registration |
| DWI | fsl_topup_epi_eddy.py | corrects EPI distortion using FSL topup and eddy_openmp |
| DWI | pnl_eddy.py | corrects eddy distortion via registration |
| DWI | fsl_eddy.py | corrects eddy distortion using FSL eddy_openmp |
| DWI | ukf.py | convenient script for running UKFTractography |
| Structural | atlas.py | computes a brain mask from training data |
| Structural | makeAlignedMask.py | transforms a labelmap to align with another structural image |
| Structural | fs.py | convenient script for running FreeSurfer |
| Freesurfer to DWI | fs2dwi.py | registers a FreeSurfer segmentation to a DWI |
| Tractography | wmql.py | simple wrapper for tract_querier |
| Tractography | wmql_qc.py | makes an HTML page of rendered wmql tracts |
The above executables are available as soft links in pnlNipype/exec directory as well:
| Soft link | Target script |
|---|---|
| fsl_eddy | ../scripts/fsl_eddy.py |
| fsl_topup_epi_eddy | ../scripts/fsl_topup_epi_eddy.py |
| masking | ../scripts/masking.py |
| nifti_align | ../scripts/align.py |
| unring | ../scripts/unring.py |
| maskfilter | ../scripts/maskfilter.py |
| resample | ../scripts/resample.py |
| nifti_atlas | ../scripts/atlas.py |
| nifti_bet_mask | ../scripts/bet_mask.py |
| nifti_bse | ../scripts/bse.py |
| nifti_fs | ../scripts/fs.py |
| nifti_fs2dwi | ../scripts/fs2dwi.py |
| nifti_makeAlignedMask | ../scripts/makeAlignedMask.py |
| nifti_wmql | ../scripts/wmql.py |
| pnl_eddy | ../scripts/pnl_eddy.py |
| pnl_epi | ../scripts/pnl_epi.py |
| ukf | ../scripts/ukf.py |
For example, to execute the axis alignment script, you can do either of the following:
pnlNipype/exec/nifti_align -h
pnlNipype/scripts/align.py -h
Tutorial
See the TUTORIAL for workflow and function of each script.
Support
Create an issue at https://github.com/pnlbwh/pnlNipype/issues. We shall get back to you as soon as possible.
Owner
- Name: Psychiatry Neuroimaging Lab @ BWH/HMS
- Login: pnlbwh
- Kind: organization
- Location: Boston, MA
- Website: http://pnl.bwh.harvard.edu/
- Repositories: 52
- Profile: https://github.com/pnlbwh
Psychiatry Neuroimaging Lab at Brigham and Women's Hospital and Harvard Medical School
Citation (CITATION.CFF)
cff-version: 1.2.0
message: "If this software is useful in your research, please cite as below."
authors:
  - family-names: "Billah"
    given-names: "Tashrif"
  - family-names: "Bouix"
    given-names: "Sylvain"
  - family-names: "Rathi"
    given-names: "Yogesh"
title: "NIFTI MRI processing pipeline"
version: 1.1
doi: 10.5281/zenodo.3258854
date-released: 2020-10-22
url: "https://github.com/pnlbwh/pnlNipype"
GitHub Events
Total
- Create event: 3
- Release event: 1
- Issues event: 1
- Delete event: 1
- Issue comment event: 2
- Push event: 25
- Gollum event: 8
- Pull request review event: 3
- Pull request review comment event: 4
- Pull request event: 11
Last Year
- Create event: 3
- Release event: 1
- Issues event: 1
- Delete event: 1
- Issue comment event: 2
- Push event: 25
- Gollum event: 8
- Pull request review event: 3
- Pull request review comment event: 4
- Pull request event: 11
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 0
- Total pull requests: 6
- Average time to close issues: N/A
- Average time to close pull requests: 1 day
- Total issue authors: 0
- Total pull request authors: 1
- Average comments per issue: 0
- Average comments per pull request: 0.0
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 6
- Average time to close issues: N/A
- Average time to close pull requests: 1 day
- Issue authors: 0
- Pull request authors: 1
- Average comments per issue: 0
- Average comments per pull request: 0.0
- Merged pull requests: 3
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- tashrifbillah (2)
- kavandeloo97 (1)
- AlbertoImg (1)
Pull Request Authors
- tashrifbillah (6)
- RyanZurrin (5)