https://github.com/chenhongyiyang/egoposeformer
[ECCV 2024] EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation
Science Score: 13.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (found codemeta.json file)
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 11.7%, to scientific vocabulary)
Basic Info
Statistics
- Stars: 8
- Watchers: 1
- Forks: 0
- Open Issues: 1
- Releases: 0
Metadata Files
README.md
EgoPoseFormer
This repository contains the official PyTorch implementation of our ECCV 2024 paper: EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation.
Usage
Environment Setup
```shell
conda create -n egoposeformer python=3.10 -y
source activate egoposeformer

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 -f https://download.pytorch.org/whl/torch_stable.html
pip install pytorch-lightning==2.1.0
pip install numba==0.56.4
pip install numpy==1.23.5
pip install mmcv-full==1.6.0

git clone https://github.com/ChenhongyiYang/egoposeformer.git
cd EgoPoseFormer
pip install -e .
```
Dataset Setup
We provide support for our main dataset, UnrealEgo. Please refer to its official instructions to download the dataset; specifically, you only need to download the UnrealEgoData_impl split. You also need to download pelvis_pos.pkl, which is extracted from the UnrealEgo metadata and is used to compute the 3D-to-2D projection. The file structure should be:
```
EgoPoseFormer
|-- configs
|   |-- pose_estimation
|   |-- ...
|-- data
|   |-- unrealego
|   |   |-- unrealego_impl
|   |   |   |-- ArchVisInterior_ArchVis_RT
|   |   |   |-- ...
|   |   |-- pelvis_pos.pkl
|   |   |-- train.txt
|   |   |-- validation.txt
|   |   |-- test.txt
```
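Before launching training, it can help to check that the layout above is in place. The following helper is a hypothetical sketch (not part of the repository); the paths it checks come directly from the directory tree shown above:

```python
# Hypothetical helper: verify that the UnrealEgo files described in the
# README's directory tree exist under a given data root.
from pathlib import Path

# Required entries, relative to the `data/` directory, taken from the tree above.
REQUIRED = [
    "unrealego/unrealego_impl",
    "unrealego/pelvis_pos.pkl",
    "unrealego/train.txt",
    "unrealego/validation.txt",
    "unrealego/test.txt",
]

def missing_entries(data_root):
    """Return the required dataset entries that are absent under data_root."""
    root = Path(data_root)
    return [rel for rel in REQUIRED if not (root / rel).exists()]
```

Running `missing_entries("data")` from the repository root should return an empty list once the dataset is set up.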
Training and Testing
You can easily run experiments using the following commands:

```shell
# train
python run.py fit --config $CONFIG

# test
python run.py test --config $CONFIG --ckpt_path $PATH
```

For example, you can run a full UnrealEgo experiment by:

```shell
# 2D heatmap pre-training
python run.py fit --config ./configs/unrealego_r18_heatmap.yaml

# training EgoPoseFormer
# Note: you will need to put the pre-trained encoder path into
# the encoder_pretrained entry in the config file
python run.py fit --config ./configs/unrealego_r18_pose3d.yaml

# testing EgoPoseFormer
python run.py test --config ./configs/unrealego_r18_pose3d.yaml --ckpt_path path/to/ckpt
```
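The `encoder_pretrained` entry mentioned in the note above might look like the following in the 3D pose config. Only the entry name comes from the README; the surrounding keys are illustrative assumptions:

```yaml
# Hypothetical config excerpt: only `encoder_pretrained` is named in the
# README note above; the surrounding structure is an assumption.
model:
  encoder_pretrained: path/to/heatmap_pretrain.ckpt  # checkpoint from the 2D heatmap stage
```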
Results
| Backbone  | MPJPE | PA-MPJPE | Config           | Weights |
|:---------:|:-----:|:--------:|:----------------:|:-------:|
| ResNet-18 | 34.5  | 33.4     | Pre-train / Pose | Link    |
Note: The numbers are measured using newly trained models, so they are slightly different from the numbers reported in the paper.
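For reference, MPJPE and PA-MPJPE in the table are conventionally computed as below. This is a generic NumPy sketch of the standard metric definitions, not code from this repository:

```python
# Standard pose-error metrics for a single pose of J joints, shape (J, 3).
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance per joint."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """MPJPE after Procrustes alignment (rotation, scale, translation) of pred to gt."""
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g
    # Optimal rotation from the SVD of the cross-covariance matrix.
    U, s, Vt = np.linalg.svd(p.T @ g)
    R = U @ Vt
    # Fix a possible reflection so R is a proper rotation.
    if np.linalg.det(R) < 0:
        U[:, -1] *= -1
        s[-1] *= -1
        R = U @ Vt
    scale = s.sum() / (p ** 2).sum()
    aligned = scale * (p @ R) + mu_g
    return mpjpe(aligned, gt)
```

PA-MPJPE discounts global rotation, scale, and translation, which is why it is at most the MPJPE for the same prediction.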
Citation
```bibtex
@inproceedings{yang2024egoposeformer,
  title={EgoPoseFormer: A Simple Baseline for Stereo Egocentric 3D Human Pose Estimation},
  author={Yang, Chenhongyi and Tkach, Anastasia and Hampali, Shreyas and Zhang, Linguang and Crowley, Elliot J and Keskin, Cem},
  booktitle={European Conference on Computer Vision},
  year={2024},
  organization={Springer}
}
```
Acknowledgement
This codebase is partially inspired by the UnrealEgo implementation.
Owner
- Name: Chenhongyi Yang
- Login: ChenhongyiYang
- Kind: user
- Location: Zurich, Switzerland
- Company: Meta
- Website: chenhongyiyang.com
- Repositories: 4
- Profile: https://github.com/ChenhongyiYang
Research Scientist at Meta Reality Labs
GitHub Events
Total
- Issues event: 4
- Watch event: 5
- Issue comment event: 5
- Push event: 1
- Fork event: 1
Last Year
- Issues event: 4
- Watch event: 5
- Issue comment event: 5
- Push event: 1
- Fork event: 1