https://github.com/ctu-vras/monoforce

[IROS 2024] [ICML 2024 Workshop Differentiable Almost Everything] MonoForce: Learnable Image-conditioned Physics Engine


Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, ieee.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (7.2%) to scientific vocabulary

Keywords

deep-learning differentiable-physics mobile-robots physics-informed-learning robot-terrain-interaction traversability-estimation traversability-mapping
Last synced: 5 months ago · JSON representation

Repository

[IROS 2024] [ICML 2024 Workshop Differentiable Almost Everything] MonoForce: Learnable Image-conditioned Physics Engine

Basic Info
  • Host: GitHub
  • Owner: ctu-vras
  • License: bsd-3-clause
  • Language: Jupyter Notebook
  • Default Branch: noetic
  • Homepage:
  • Size: 458 MB
Statistics
  • Stars: 76
  • Watchers: 9
  • Forks: 8
  • Open Issues: 1
  • Releases: 7
Topics
deep-learning differentiable-physics mobile-robots physics-informed-learning robot-terrain-interaction traversability-estimation traversability-mapping
Created over 2 years ago · Last pushed 9 months ago
Metadata Files
Readme License

README.md

MonoForce: Learnable Image-conditioned Physics Engine

> [!NOTE]
> An updated version is available at ctu-vras/fusionforce.

Badges in the original README link to the IROS 2024 paper, arXiv, the ICML 2024 Differentiable Almost Everything workshop, videos, a poster, and data.

Robot-terrain interaction prediction from RGB camera images as input:

- predicted trajectory,
- terrain shape and properties,
- interaction forces and contacts.

Examples of predicted trajectories and autonomous traversal through vegetation:

(video links)


Running

The MonoForce pipeline consists of the Terrain Encoder and the Physics Engine. Given input RGB images and camera calibration, the Terrain Encoder predicts terrain properties. The differentiable Physics Engine then simulates the robot trajectory and interaction forces on the predicted terrain for a provided control sequence. Refer to the monoforce/examples folder for implementation details.
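The two-stage flow can be sketched as follows. All function names, shapes, and the toy physics below are illustrative stand-ins, not MonoForce's actual API:

```python
# Illustrative two-stage pipeline: a stub "terrain encoder" followed by a
# toy physics rollout. Everything here is hypothetical and greatly simplified.

def encode_terrain(image):
    """Stub encoder: map an 'image' (2D list of brightness values)
    to a per-cell elevation estimate."""
    return [[0.1 * px for px in row] for row in image]

def simulate(elevation, controls, dt=0.1):
    """Toy physics engine: integrate a point robot's x-position under
    velocity commands, reading the height from the terrain grid."""
    x, traj = 0.0, []
    for v in controls:
        x += v * dt
        col = min(int(x), len(elevation[0]) - 1)
        z = elevation[0][col]          # terrain height under the robot
        traj.append((round(x, 3), z))
    return traj

image = [[1, 2, 3, 4]]
elevation = encode_terrain(image)
trajectory = simulate(elevation, controls=[1.0] * 5)
```

The real pipeline operates on multi-camera RGB input and predicts several terrain channels, but the encoder-then-simulate structure is the same.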

Please run the following command to explore the MonoForce pipeline:

```commandline
cd monoforce/
python scripts/run.py --img-paths IMG1_PATH IMG2_PATH ... IMGN_PATH --cameras CAM1 CAM2 ... CAMN --calibration-path CALIB_PATH
```

For example, to test the model with the provided images from the ROUGH dataset:

```commandline
cd monoforce/scripts/
./run.sh
```

Please refer to the installation instructions to download the pre-trained model weights.

ROS Integration

We provide ROS nodes for both the trained Terrain Encoder model and the Differentiable Physics module. They are integrated into the launch file:

```commandline
roslaunch monoforce monoforce.launch
```

Training

The model predicts the following terrain properties:

- Elevation: the terrain shape.
- Friction: the friction coefficient between the robot and the terrain.
- Stiffness: the terrain stiffness.
- Damping: the terrain damping.
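The properties above can be combined in a simple spring-damper contact model, a common choice for terrain interaction. This is a generic textbook sketch, not necessarily the exact formulation MonoForce uses:

```python
from dataclasses import dataclass

@dataclass
class TerrainCell:
    """One cell of a predicted terrain-property map (illustrative)."""
    elevation: float   # terrain height [m]
    friction: float    # friction coefficient (dimensionless)
    stiffness: float   # normal spring constant [N/m]
    damping: float     # normal damping [N*s/m]

def contact_force(cell, robot_z, robot_vz):
    """Normal force from a spring-damper contact model; zero when the
    robot body is above the terrain surface (no contact)."""
    penetration = cell.elevation - robot_z
    if penetration <= 0.0:
        return 0.0
    # Spring pushes up with penetration depth; damper opposes vertical motion.
    return cell.stiffness * penetration - cell.damping * robot_vz

cell = TerrainCell(elevation=0.2, friction=0.6, stiffness=1000.0, damping=50.0)
f = contact_force(cell, robot_z=0.1, robot_vz=-0.5)  # penetrating, moving down
```

With a 0.1 m penetration and downward velocity, both the spring and damper terms push upward, yielding a positive normal force.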

An example of the predicted elevation and friction maps (projected to camera images):

(video link)

One can see that the model predicts higher friction values for road areas and lower values for grass, where the robot has less traction.

To train the model, please run:

```commandline
cd monoforce/scripts/
python train.py
```

Please refer to the train_friction_head_with_pretrained_terrain_encoder.ipynb notebook for an example of learning terrain properties with the pretrained Terrain Encoder model and the differentiable physics loss.

Navigation

The navigation method uses MonoForce to predict terrain properties and candidate robot trajectories from RGB images and control inputs. The package serves as a robot-terrain interaction and path-planning pipeline.

(video link)

We provide a differentiable physics model for robot-terrain interaction prediction, implemented in PyTorch. Please refer to the diff_physics.ipynb notebook for an example of trajectory prediction.
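Because the physics engine is differentiable, a trajectory loss can be backpropagated to terrain parameters such as friction. The following toy 1D illustration shows the idea in pure Python, with a closed-form gradient checked against finite differences; it is not the repository's model:

```python
# Toy differentiable rollout: a point mass slides with Coulomb-like
# deceleration proportional to the friction coefficient mu. The final
# position is a smooth function of mu, so a trajectory loss has a
# well-defined gradient w.r.t. the terrain parameter.

G, DT, N = 9.81, 0.1, 10   # gravity [m/s^2], time step [s], number of steps

def rollout(mu, v0=2.0):
    """Integrate the sliding mass; return its final x-position."""
    x, v = 0.0, v0
    for _ in range(N):
        v -= mu * G * DT   # friction decelerates the mass
        x += v * DT
    return x

def loss(mu, target=1.5):
    """Squared distance between the final position and a target."""
    return (rollout(mu) - target) ** 2

def grad_analytic(mu, target=1.5):
    # rollout(mu) = N*v0*DT - mu*G*DT^2 * N(N+1)/2, hence:
    dx_dmu = -G * DT * DT * N * (N + 1) / 2
    return 2.0 * (rollout(mu) - target) * dx_dmu

# Finite-difference check that the analytic gradient is correct.
eps, mu = 1e-6, 0.3
fd = (loss(mu + eps) - loss(mu - eps)) / (2 * eps)
```

In MonoForce this role is played by PyTorch autograd, which propagates gradients of the physics loss back into the Terrain Encoder's predictions.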

Navigation consists of the following stages:

- Terrain prediction: the Terrain Encoder estimates terrain properties.
- Trajectory simulation: the Physics Engine shoots candidate robot trajectories.
- Trajectory selection: the trajectory with the smallest cost, based on robot-terrain interaction forces, is selected.
- Control: the robot is controlled to follow the selected trajectory.
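The shoot-and-select loop can be sketched as follows. The cost function and terrain model here are illustrative placeholders, not the project's actual force-based cost:

```python
# Sketch of trajectory shooting and selection: simulate several candidate
# control sequences, score each by an accumulated interaction penalty,
# and pick the cheapest one.
import math

def simulate_cost(controls, terrain_bump=0.5):
    """Toy cost: driving fast over a bumpy patch is penalized quadratically."""
    cost, x = 0.0, 0.0
    for v in controls:
        x += v * 0.1
        bump = terrain_bump * math.sin(x)   # stand-in for terrain roughness
        cost += (v * bump) ** 2             # stand-in for interaction force
    return cost

candidates = [
    [1.0] * 10,   # go fast
    [0.5] * 10,   # go slow
    [0.2] * 10,   # crawl
]
costs = [simulate_cost(c) for c in candidates]
best = candidates[costs.index(min(costs))]
```

In the real pipeline, the candidates come from sampled control sequences, the cost comes from the simulated robot-terrain interaction forces, and the selected trajectory is handed to the controller.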

Citation

Please consider citing these papers if you find the work relevant to your research:

```bibtex
@inproceedings{agishev2024monoforce,
  title={MonoForce: Self-supervised Learning of Physics-informed Model for Predicting Robot-terrain Interaction},
  author={Ruslan Agishev and Karel Zimmermann and Vladim{\'\i}r Kubelka and Martin Pecka and Tom{\'a}{\v s} Svoboda},
  booktitle={IEEE/RSJ International Conference on Intelligent Robots and Systems - IROS},
  year={2024},
  eprint={2309.09007},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2309.09007},
  doi={10.1109/IROS58592.2024.10801353},
}
```

```bibtex
@inproceedings{agishev2024endtoend,
  title={End-to-end Differentiable Model of Robot-terrain Interactions},
  author={Ruslan Agishev and Vladim{\'\i}r Kubelka and Martin Pecka and Tomas Svoboda and Karel Zimmermann},
  booktitle={ICML 2024 Workshop on Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators},
  year={2024},
  url={https://openreview.net/forum?id=XuVysF8Aon}
}
```

Owner

  • Name: Vision for Robotics and Autonomous Systems
  • Login: ctu-vras
  • Kind: organization
  • Location: Prague

Research group at Czech Technical University in Prague (CTU), Faculty of Electrical Engineering, Department of Cybernetics

GitHub Events

Total
  • Create event: 13
  • Issues event: 1
  • Release event: 4
  • Watch event: 39
  • Delete event: 9
  • Issue comment event: 3
  • Member event: 2
  • Push event: 233
  • Fork event: 5
Last Year
  • Create event: 13
  • Issues event: 1
  • Release event: 4
  • Watch event: 39
  • Delete event: 9
  • Issue comment event: 3
  • Member event: 2
  • Push event: 233
  • Fork event: 5

Dependencies

setup.py pypi
singularity/requirements.txt pypi
  • Pillow ==10.0.0
  • albumentations ==1.2.1
  • configparser ==5.0.2
  • defusedxml ==0.7.1
  • empy ==3.3.4
  • gnupg ==2.3.1
  • matplotlib ==3.4.3
  • open3d ==0.10.0.0
  • opencv_python ==4.6.0
  • pathlib ==1.0.1
  • psutil ==5.8.0
  • pyransac3d ==0.6.0
  • python-dateutil ==2.8.2
  • pyyaml ==6.0
  • ransac ==1.0.4
  • rospkg ==1.3.0
  • scikit-image ==0.18.1
  • scikit-learn ==1.0
  • scipy ==1.8.1
  • segmentation_models_pytorch ==0.2.1
  • setuptools ==58.0.4
  • six ==1.15.0
  • sklearn ==0.0
  • torchdiffeq ==0.2.3
  • torchvision ==0.13.1
  • tqdm ==4.62.3
  • yacs ==0.1.6