https://github.com/asfhyp3/hyp3-autorift
A HyP3 plugin for feature tracking processing with AutoRIFT-ISCE
Science Score: 75.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 3 DOI reference(s) in README
- ✓ Academic publication links: links to zenodo.org
- ○ Academic email domains
- ✓ Institutional organization owner: organization asfhyp3 has institutional domain (hyp3-docs.asf.alaska.edu)
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.9%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 9
- Watchers: 10
- Forks: 3
- Open Issues: 8
- Releases: 60
Metadata Files
README.md
HyP3 autoRIFT Plugin
The HyP3 autoRIFT plugin provides a set of workflows for feature tracking processing with the autonomous Repeat Image Feature Tracking (autoRIFT) software package. This plugin is part of the Alaska Satellite Facility's larger HyP3 (Hybrid Pluggable Processing Pipeline) system, a batch processing pipeline designed for on-demand processing of remote sensing data. For more information on HyP3, see the Background section.
Installation
1. Ensure that pixi is installed on your system: https://pixi.sh/latest/installation/
2. Clone the hyp3-autorift repository and navigate to the root directory of this project:
   ```bash
   git clone https://github.com/ASFHyP3/hyp3-autorift.git
   cd hyp3-autorift
   ```
3. Set up the development environment:
   ```bash
   pixi run install-editable
   ```
4. (Optional) Use traditional conda-like activation of the pixi environment:
   ```bash
   eval "$(pixi shell-hook)"
   ```

[!TIP] If you've done step 4, you don't need to prefix commands with pixi run.
Usage
The HyP3 autoRIFT plugin provides workflows (accessible directly in Python or via a CLI) that can be used to process SAR or optical data using autoRIFT. HyP3 autoRIFT can process data from these satellite missions:
- SAR:
  - Sentinel-1
- Optical:
  - Sentinel-2
  - Landsat 4, 5, 7, 8, 9
To see all available workflows, run:
pixi run python -m hyp3_autorift ++help
hyp3_autorift workflow
The hyp3_autorift workflow is used to get dense feature tracking between two images using autoRIFT. You can run this workflow by selecting the hyp3_autorift process:
pixi run python -m hyp3_autorift ++process hyp3_autorift [WORKFLOW_ARGS]
or by using the hyp3_autorift console script:
pixi run hyp3_autorift [WORKFLOW_ARGS]
For example:
pixi run hyp3_autorift \
--reference LC08_L1TP_009011_20200703_20200913_02_T1 \
--secondary LC08_L1TP_009011_20200820_20200905_02_T1
will run autoRIFT for a Landsat 8 pair over Jakobshavn, Greenland.
[!IMPORTANT] Credentials are necessary to access Landsat and Sentinel-1 data. See the Credentials section for more information.
Similarly, sets of Sentinel-1 bursts can be processed like:
pixi run hyp3_autorift \
--reference \
S1_105608_IW1_20240618T025544_VV_99C1-BURST \
S1_105607_IW1_20240618T025542_VV_C6D8-BURST \
S1_105606_IW1_20240618T025539_VV_C6D8-BURST \
S1_105605_IW1_20240618T025536_VV_C6D8-BURST \
S1_105604_IW1_20240618T025533_VV_C6D8-BURST \
--secondary \
S1_105608_IW1_20240630T025544_VV_3D6E-BURST \
S1_105607_IW1_20240630T025541_VV_2539-BURST \
S1_105606_IW1_20240630T025538_VV_2539-BURST \
S1_105605_IW1_20240630T025535_VV_2539-BURST \
S1_105604_IW1_20240630T025533_VV_2539-BURST
[!IMPORTANT] We recommend processing at least 2 Sentinel-1 bursts along track, with 3-5 bursts seeing improvements in data quality.
For all options available to this workflow, see the help documentation:
pixi run hyp3_autorift --help
Credentials
Depending on the mission being processed, some workflows will need you to provide credentials. Generally, credentials are provided via environment variables, but some may be provided by command-line arguments or via a .netrc file.
AWS Credentials
To process Landsat images, you must provide AWS credentials because the data is hosted by USGS in a "requester pays" bucket. To provide AWS credentials, you can either use an AWS profile specified in your ~/.aws/credentials by exporting:
export AWS_PROFILE=your-profile
or by exporting credential environment variables:
export AWS_ACCESS_KEY_ID=your-id
export AWS_SECRET_ACCESS_KEY=your-key
export AWS_SESSION_TOKEN=your-token # optional; for when using temporary credentials
For more information, please see: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
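Before launching a Landsat job, you can sanity-check that one of these credential sources is actually visible in your shell. The helper function below is a sketch and is not part of hyp3-autorift; it only inspects the standard AWS environment variables described above:

```bash
# Sketch (not part of hyp3-autorift): report which AWS credential source,
# if any, is visible in the current shell.
check_aws_credentials() {
    if [ -n "${AWS_PROFILE:-}" ]; then
        echo "profile"
    elif [ -n "${AWS_ACCESS_KEY_ID:-}" ] && [ -n "${AWS_SECRET_ACCESS_KEY:-}" ]; then
        echo "env-vars"
    else
        echo "none"
        return 1
    fi
}
```

For example, run `check_aws_credentials || echo 'set AWS credentials first'` before invoking a Landsat workflow.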
NASA Earthdata Login
To process Sentinel-1 images, you must provide Earthdata Login credentials in order to download input data. * If you do not already have an Earthdata account, you can sign up here.
For Earthdata login, you can provide credentials by exporting environment variables:
export EARTHDATA_USERNAME=your-edl-username
export EARTHDATA_PASSWORD=your-edl-password
or via your ~/.netrc file which should contain lines like these two:
machine urs.earthdata.nasa.gov login your-edl-username password your-edl-password
[!TIP] Your ~/.netrc file should only be readable by your user; otherwise, you'll receive a "net access too permissive" error. To fix, run:
```bash
chmod 0600 ~/.netrc
```
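The entry and the permissions can be set up together. This is a generic sketch: the username and password are placeholders, and only the urs.earthdata.nasa.gov machine line comes from the example above:

```bash
# Sketch: add the Earthdata entry to ~/.netrc with user-only permissions.
# The credential values are illustrative placeholders.
NETRC="${NETRC:-${HOME}/.netrc}"
touch "$NETRC"
chmod 0600 "$NETRC"   # restrict access before writing secrets
if ! grep -q 'urs.earthdata.nasa.gov' "$NETRC"; then
    echo 'machine urs.earthdata.nasa.gov login your-edl-username password your-edl-password' >> "$NETRC"
fi
```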
Docker Container
The ultimate goal of this project is to create a docker container that can run autoRIFT workflows within a HyP3 deployment. To run the current version of the project's container, use this command:
docker run -it --rm \
-e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
-e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
-e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
-e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
ghcr.io/asfhyp3/hyp3-autorift:latest \
++process hyp3_autorift \
[WORKFLOW_ARGS]
[!TIP] You can use docker run --env-file to capture all the necessary environment variables in a single file.
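For instance, the -e flags shown above could be collected into a file like this; the file name and placeholder values are illustrative:

```bash
# Sketch: collect the credential variables in one env file for docker run.
cat > autorift.env <<'EOF'
AWS_ACCESS_KEY_ID=your-id
AWS_SECRET_ACCESS_KEY=your-key
EARTHDATA_USERNAME=your-edl-username
EARTHDATA_PASSWORD=your-edl-password
EOF
chmod 0600 autorift.env   # the file holds secrets; restrict access

# Then replace the individual -e flags with --env-file:
# docker run -it --rm --env-file autorift.env \
#     ghcr.io/asfhyp3/hyp3-autorift:latest \
#     ++process hyp3_autorift [WORKFLOW_ARGS]
```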
Docker Outputs
When running hyp3-autorift via docker, there are two recommended approaches to retain the intermediate and output product files:
- Use a volume mount
Add the -w /tmp -v ${PWD}:/tmp flags after docker run; -w changes the working directory inside the container to /tmp and -v will mount your current working directory to the /tmp location inside the container such that hyp3_autorift outputs are preserved locally. You can replace ${PWD} with any valid path.
- Copy outputs to a remote AWS S3 Bucket
Append the --bucket and --bucket-prefix options to [WORKFLOW_ARGS] so that the final output files are uploaded to AWS S3 (the AWS_SESSION_TOKEN variable in the example is optional, for temporary credentials). This also requires that AWS credentials with write access to the bucket are available to the running container. For example, to write outputs to a hypothetical bucket s3://hypothetical-bucket/test-run/:
docker run -it --rm \
-e AWS_ACCESS_KEY_ID=[YOUR_KEY] \
-e AWS_SECRET_ACCESS_KEY=[YOUR_SECRET] \
-e AWS_SESSION_TOKEN=[YOUR_TOKEN] \
-e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
-e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
ghcr.io/asfhyp3/hyp3-autorift:latest \
++process hyp3_autorift \
[WORKFLOW_ARGS] \
--bucket "hypothetical-bucket" \
--bucket-prefix "test-run"
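For comparison, the volume-mount approach (option 1 above) might look like the following. The output directory name is illustrative, and the docker command is shown commented out because it requires Docker and valid credentials:

```bash
# Sketch of the volume-mount approach: outputs land in ./autorift-run.
mkdir -p autorift-run

# docker run -it --rm \
#     -w /tmp -v "${PWD}/autorift-run":/tmp \
#     -e EARTHDATA_USERNAME=[YOUR_USERNAME_HERE] \
#     -e EARTHDATA_PASSWORD=[YOUR_PASSWORD_HERE] \
#     ghcr.io/asfhyp3/hyp3-autorift:latest \
#     ++process hyp3_autorift [WORKFLOW_ARGS]
```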
Background
HyP3 is broken into two components: the cloud architecture/API that manages the processing of HyP3 workflows and Docker container plugins that contain scientific workflows that produce new science products from a variety of data sources (see figure below for the full HyP3 architecture).

The cloud infrastructure-as-code for HyP3 can be found in the main HyP3 repository, while this repository contains a plugin that can be used for feature tracking processing with autoRIFT.
License
The HyP3 autoRIFT plugin is licensed under the BSD 3-Clause license. See the LICENSE file for more details. Some files from nasa-jpl/autoRIFT have been vendored in src/hyp3_autorift/vend and retain their upstream Apache 2.0 license. Please see the README in that directory for details.
Code of conduct
We strive to create a welcoming and inclusive community for all contributors to HyP3 autoRIFT. As such, all contributors to this project are expected to adhere to our code of conduct.
Please see our CODE_OF_CONDUCT.md for the full code of conduct text.
Contributing
Contributions to the HyP3 autoRIFT plugin are welcome! If you would like to contribute, please submit a pull request on the GitHub repository.
Contact Us
Want to talk about HyP3 autoRIFT? We would love to hear from you!
Found a bug? Want to request a feature? Open an issue
General questions? Suggestions? Or just want to talk to the team? Open a discussion
Owner
- Name: HyP3
- Login: ASFHyP3
- Kind: organization
- Location: Fairbanks, AK
- Website: https://hyp3-docs.asf.alaska.edu/
- Twitter: ASFHyP3
- Repositories: 36
- Profile: https://github.com/ASFHyP3
Alaska Satellite Facility's Hybrid Pluggable Processing Pipeline
Citation (CITATION.cff)
cff-version: 1.2.0
title: A HyP3 plugin for feature tracking processing with AutoRIFT
message: >-
If you use this software, please cite it using the
metadata from this file.
type: software
license: BSD-3-Clause
keywords:
- feature tracking
- optical
- radar
- satellite imagery
- surface displacement
- glacier velocity
- earthquake displacement
- landslide
- remote sensing
- ice displacement
authors:
- given-names: Joseph
name-particle: H
family-names: Kennedy
orcid: 'https://orcid.org/0000-0002-9348-693X'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Andrew
family-names: Player
orcid: 'https://orcid.org/0009-0008-9736-7314'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Mario
family-names: Angarita
orcid: 'https://orcid.org/0000-0001-7455-2455'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Johnston
family-names: Andrew
orcid: 'https://orcid.org/0009-0008-4317-3995'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Forrest
family-names: Williams
orcid: 'https://orcid.org/0000-0001-8721-6020'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Jacquelyn
family-names: Smale
orcid: 'https://orcid.org/0000-0002-2749-5010'
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Jake
family-names: Herrmann
# orcid: ''
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Jiang
family-names: Zhu
# orcid: ''
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: James
family-names: Rine
# orcid: ''
affiliation: >-
Alaska Satellite Facility,
Geophysical Institute,
University of Alaska Fairbanks,
Fairbanks, AK 99775, USA
- given-names: Alex
family-names: Gardner
orcid: 'https://orcid.org/0000-0002-8394-8889'
affiliation: >-
Jet Propulsion Laboratory,
California Institute of Technology,
Pasadena, CA 91109, USA
GitHub Events
Total
- Create event: 57
- Issues event: 16
- Release event: 4
- Watch event: 1
- Delete event: 53
- Issue comment event: 31
- Push event: 238
- Pull request review comment event: 24
- Pull request event: 121
- Pull request review event: 59
- Fork event: 1
Last Year
- Create event: 57
- Issues event: 16
- Release event: 4
- Watch event: 1
- Delete event: 53
- Issue comment event: 31
- Push event: 238
- Pull request review comment event: 24
- Pull request event: 121
- Pull request review event: 59
- Fork event: 1
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 27
- Total pull requests: 289
- Average time to close issues: 3 months
- Average time to close pull requests: 14 days
- Total issue authors: 7
- Total pull request authors: 12
- Average comments per issue: 0.89
- Average comments per pull request: 0.48
- Merged pull requests: 244
- Bot issues: 0
- Bot pull requests: 38
Past Year
- Issues: 4
- Pull requests: 66
- Average time to close issues: about 2 months
- Average time to close pull requests: 9 days
- Issue authors: 3
- Pull request authors: 7
- Average comments per issue: 1.0
- Average comments per pull request: 0.21
- Merged pull requests: 40
- Bot issues: 0
- Bot pull requests: 28
Top Authors
Issue Authors
- jhkennedy (18)
- wangshuaicumt (3)
- Jlrine2 (2)
- cumtwangshuai (1)
- geoxlt (1)
- asjohnston-asf (1)
- alex-s-gardner (1)
Pull Request Authors
- jhkennedy (153)
- dependabot[bot] (38)
- asjohnston-asf (24)
- mfangaritav (19)
- AndrewPlayer3 (16)
- forrestfwilliams (10)
- jtherrmann (10)
- Jlrine2 (6)
- tools-bot (5)
- jacquelynsmale (4)
- cirrusasf (3)
- kmarnoult (1)
Dependencies
- condaforge/mambaforge latest build
- autorift 1.5.0.*
- boto3
- botocore
- build
- flake8
- flake8-blind-except
- flake8-builtins
- flake8-import-order
- gdal >=3
- h5netcdf
- hyp3lib >=3,<4
- isce2 2.6.1.dev7.*
- matplotlib-base
- netcdf4
- numpy <1.24
- opencv
- pillow
- pip
- pyproj
- pytest
- pytest-console-scripts
- pytest-cov
- python >=3.8,<3.10
- requests
- responses
- scipy
- setuptools >=61
- setuptools_scm >=6.2
- xarray