https://github.com/ami-iit/paper_darvish_2022_humanoids_action-kindyn-predicition

Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 1 DOI reference(s) in README
  • Academic publication links
    Links to: arxiv.org, ieee.org, zenodo.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (10.9%) to scientific vocabulary
Last synced: 7 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: ami-iit
  • License: MIT
  • Language: Python
  • Default Branch: main
  • Size: 67.4 KB
Statistics
  • Stars: 1
  • Watchers: 15
  • Forks: 1
  • Open Issues: 1
  • Releases: 0
Created about 4 years ago · Last pushed about 3 years ago
Metadata Files
Readme License

README.md

Simultaneous Action Recognition and Human Whole-Body Motion and Dynamics Prediction from Wearable Sensors

K. Darvish, S. Ivaldi and D. Pucci, "Simultaneous Action Recognition and Human Whole-Body Motion and Dynamics Prediction from Wearable Sensors," 2022 IEEE-RAS International Conference on Humanoid Robots (Humanoids), Ginowan, Japan, 2022.

https://user-images.githubusercontent.com/17707730/230998370-6b9cd792-85b9-4c73-9bde-6fda4ced1d8f.mp4

Data | Installation | Running | Paper | arXiv | Video

Data

To run the scripts, please download the required datasets and models provided in https://zenodo.org/record/7731386#.ZBABX9LMJhE. After downloading the data, extract the zip file in the root directory of this repo. The data folder will have the following structure:

```
├── data
│   ├── README.md
│   ├── annotated-data
│   │   ├── ...
│   ├── models
│   │   ├── ...
│   ├── raw-data
│   │   ├── ...
│   ├── wearable-data
│   │   ├── ...
```

Installation

Requirements

  • Ubuntu 20.04.5 LTS (Focal Fossa)

Installation: mamba & robotology-superbuild

  • Install mamba if you do not have it:

    • follow the instructions provided here
  • run the following commands to create the environment for this code:

```sh
cd <to the repo>
mamba env create -f environment.yml
mamba activate motion-prediction
```

  • If you do not have robotology-superbuild installed on your system, follow the instructions here to install it inside the motion-prediction mamba environment.

    • Activate the following profiles in robotology-superbuild if not already:
      • ROBOTOLOGY_ENABLE_CORE
      • ROBOTOLOGY_ENABLE_DYNAMICS
      • ROBOTOLOGY_ENABLE_HUMAN_DYNAMICS
      • ROBOTOLOGY_USES_PYTHON
    • More information about the installation of robotology-superbuild can be found in its GitHub repo.
    • remember to source robotology-superbuild with source <robotology-superbuild path>/build/install/share/robotology-superbuild/setup.sh.
  • in an activated and sourced environment, run the following commands to ensure your environment is set up correctly:

```sh
python
```

```python
import tensorflow as tf
tf.__version__
import yarp
```

N.B. Thereafter, all the terminals should be activated, sourced, and in the root folder of this repo.

Installation of the project

  • build and test the python project:

```sh
cd <motion-prediction path>
pip install -e .
pytest
```

  • build and install the C++ modules:

```sh
mkdir build
cd build
ccmake ../
# update the CMAKE_INSTALL_PREFIX to your desired directory;
# a suggestion is to set it to the <robotology-superbuild path>/build/install directory
make install
```

Running

There are three phases in running the project: annotation, training, and testing.

Annotation

Collect all the required data column-wise at the desired frequency

  • run the yarp server:

```sh
yarpserver --write
```

  • replay the collected wearable data using yarpdataplayer:

```sh
yarpdataplayer --withExtraTimeCol 2
```

  • run the IK solver to stream human states:

```sh
yarprobotinterface --config TransformServer.xml
yarprobotinterface --config HumanStateProvider.xml
```

  • run human motion data acquisition to collect human state and human dynamics data (feet force/torque interaction data) at the desired frequency (25 Hz, i.e. a period of 0.04 s):

```sh
humanDataPreparationModule --from humanDataAcquisitionForLogging.ini
```

  • At the end of this stage, you should have a file containing time, human states, and interaction forces/torques, similar to the data file data/raw-data/Dataset_2021_08_19_11_31_13.txt.

Annotate the data

  • run human motion data acquisition to annotate the data and stream the vectorized human states (set the path to the file saved in the previous step via the filePathToRead variable in src/humanMotionDataAcquisition/app/humanDataAcquisitionForAnnotation.ini; remember to build and install the project afterwards):

```sh
humanDataPreparationModule --from humanDataAcquisitionForAnnotation.ini
```

  • in a new terminal, run the human motion prediction visualizer:

```sh
humanPredictionVisualizerModule --from HumanVisualizer.ini
```

  • At the end of this stage, you should have a file containing columns with time, human states, interaction forces/torques, and annotations, similar to the data file data/annotated-data/Dataset_2021_08_19_20_06_39.txt.

Training

  • to train a GMoE model, run the following script:

```sh
python scripts/train.py
```

  • remember to set the data_path variable to the path of the annotated data.
  • you can modify the model and its parameters before training.
  • save and close the plots that appear during training, as they block the process.
  • at the end of this script, you will see the results of the LSTM and GMoE models and the paths to their saved models.
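Independent of this repo's exact architecture, the core GMoE idea is a gating network that produces softmax weights over several expert predictors and blends their outputs. A pure-NumPy sketch of that combination step (all names and shapes here are illustrative, not the repo's model):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_combine(gate_logits, expert_outputs):
    """Blend expert predictions with softmax gate weights.

    gate_logits    : (batch, n_experts)           raw gating-network outputs
    expert_outputs : (batch, n_experts, out_dim)  per-expert predictions
    returns        : (batch, out_dim)             gate-weighted combination
    """
    w = softmax(gate_logits, axis=1)              # weights per sample sum to 1
    return (w[:, :, None] * expert_outputs).sum(axis=1)
```

With strongly peaked gate logits the output follows a single expert; with flat logits it averages all experts.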

Testing and Animation for Realtime Applications

Real-time scenario using wearables

  • stream human wearable data, either online or from collected data:

```sh
yarpdataplayer --withExtraTimeCol 2
```

  • run the IK solver:

```sh
yarprobotinterface --config TransformServer.xml
yarprobotinterface --config HumanStateProvider.xml
```

  • run the following command to vectorize the outputs:

```sh
humanDataPreparationModule --from humanDataStreamingTestOnline.ini
```

Online scenario using logged data

  • run the following command to vectorize the outputs:

```sh
humanDataPreparationModule --from humanDataStreamingTestFromLoggedFile.ini
```

Running the test code for real-time prediction

  • run the model for testing:

```sh
python scripts/test_moe.py
```

Visualization of outputs

  • run the following shell script to visualize the outputs:

```sh
sh scripts/run_animations.sh
```

Citing this work

If you find the work useful, please consider citing:

```bibtex
@INPROCEEDINGS{Darvish2022Simultaneous,
  author={Darvish, Kourosh and Ivaldi, Serena and Pucci, Daniele},
  booktitle={2022 IEEE-RAS 21st International Conference on Humanoid Robots (Humanoids)},
  title={Simultaneous Action Recognition and Human Whole-Body Motion and Dynamics Prediction from Wearable Sensors},
  year={2022},
  volume={},
  number={},
  pages={488-495},
  doi={10.1109/Humanoids53995.2022.10000122}}
```

Maintainer

This repository is maintained by:

| | | |
|:---:|:---:|:---:|
| | @kouroshD | personal web page |

Owner

  • Name: Artificial and Mechanical Intelligence
  • Login: ami-iit
  • Kind: organization
  • Location: Italy

Committers

Last synced: 10 months ago

All Time
  • Total Commits: 6
  • Total Committers: 1
  • Avg Commits per committer: 6.0
  • Development Distribution Score (DDS): 0.0
Past Year
  • Commits: 0
  • Committers: 0
  • Avg Commits per committer: 0.0
  • Development Distribution Score (DDS): 0.0
Top Committers
Name Email Commits
Kourosh Darvish k****h@g****m 6

Issues and Pull Requests

Last synced: 10 months ago

All Time
  • Total issues: 1
  • Total pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Total issue authors: 1
  • Total pull request authors: 0
  • Average comments per issue: 12.0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 0
  • Average time to close issues: N/A
  • Average time to close pull requests: N/A
  • Issue authors: 0
  • Pull request authors: 0
  • Average comments per issue: 0
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • Zweisteine96 (1)
Pull Request Authors

Dependencies

environment.yml conda
  • flake8
  • ipynbname
  • jupyterlab
  • keras
  • keras-applications
  • keras-preprocessing
  • matplotlib
  • numpy
  • oauthlib
  • opt-einsum
  • pandas
  • pip
  • pre-commit
  • pybind11
  • pydot
  • pytest
  • pytest-cov
  • python
  • scikit-learn
  • scipy
  • seaborn
  • setuptools
  • tensorboard
  • tensorflow
  • tensorflow-estimator