https://github.com/facebookresearch/habitat-lab
A modular high-level library to train embodied AI agents across a variety of tasks and environments.
Science Score: 46.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ✓ Academic publication links: links to arxiv.org
- ✓ Committers with academic emails: 11 of 75 committers (14.7%) from academic institutions
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (15.0%) to scientific vocabulary
Repository
Basic Info
- Host: GitHub
- Owner: facebookresearch
- License: mit
- Language: Python
- Default Branch: main
- Homepage: https://aihabitat.org/
- Size: 271 MB
Statistics
- Stars: 2,529
- Watchers: 46
- Forks: 582
- Open Issues: 371
- Releases: 19
Topics
Metadata Files
README.md
Habitat-Lab
Habitat-Lab is a modular high-level library for end-to-end development in embodied AI. It is designed to train agents to perform a wide variety of embodied AI tasks in indoor environments, as well as develop agents that can interact with humans in performing these tasks.
Towards this goal, Habitat-Lab is designed to support the following features:
- Flexible task definitions: allowing users to train agents in a wide variety of single and multi-agent tasks (e.g. navigation, rearrangement, instruction following, question answering, human following), as well as define novel tasks.
- Diverse embodied agents: configuring and instantiating a diverse set of embodied agents, including commercial robots and humanoids, specifying their sensors and capabilities.
- Training and evaluating agents: providing algorithms for single and multi-agent training (via imitation or reinforcement learning, or no learning at all as in SensePlanAct pipelines), as well as tools to benchmark their performance on the defined tasks using standard metrics.
- Human in the loop interaction: providing a framework for humans to interact with the simulator, enabling the collection of embodied data or interaction with trained agents.
Habitat-Lab uses Habitat-Sim as the core simulator. For documentation refer here.
Citing Habitat
If you use the Habitat platform in your research, please cite the Habitat 1.0, Habitat 2.0, and Habitat 3.0 papers:
```
@misc{puig2023habitat3,
  title         = {Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots},
  author        = {Xavi Puig and Eric Undersander and Andrew Szot and Mikael Dallaire Cote and Ruslan Partsey and Jimmy Yang and Ruta Desai and Alexander William Clegg and Michal Hlavac and Tiffany Min and Theo Gervet and Vladimír Vondruš and Vincent-Pierre Berges and John Turner and Oleksandr Maksymets and Zsolt Kira and Mrinal Kalakrishnan and Jitendra Malik and Devendra Singh Chaplot and Unnat Jain and Dhruv Batra and Akshara Rai and Roozbeh Mottaghi},
  year          = {2023},
  archivePrefix = {arXiv},
}

@inproceedings{szot2021habitat,
  title     = {Habitat 2.0: Training Home Assistants to Rearrange their Habitat},
  author    = {Andrew Szot and Alex Clegg and Eric Undersander and Erik Wijmans and Yili Zhao and John Turner and Noah Maestre and Mustafa Mukadam and Devendra Chaplot and Oleksandr Maksymets and Aaron Gokaslan and Vladimir Vondrus and Sameer Dharur and Franziska Meier and Wojciech Galuba and Angel Chang and Zsolt Kira and Vladlen Koltun and Jitendra Malik and Manolis Savva and Dhruv Batra},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
  year      = {2021}
}

@inproceedings{habitat19iccv,
  title     = {Habitat: {A} {P}latform for {E}mbodied {AI} {R}esearch},
  author    = {Manolis Savva and Abhishek Kadian and Oleksandr Maksymets and Yili Zhao and Erik Wijmans and Bhavana Jain and Julian Straub and Jia Liu and Vladlen Koltun and Jitendra Malik and Devi Parikh and Dhruv Batra},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year      = {2019}
}
```
Installation
- Preparing conda env

Assuming you have conda installed, let's prepare a conda env:

```bash
# We require python>=3.9 and cmake>=3.14
conda create -n habitat python=3.9 cmake=3.14.0
conda activate habitat
```

- conda install habitat-sim

To install habitat-sim with bullet physics:

```bash
conda install habitat-sim withbullet -c conda-forge -c aihabitat
```

Note, for newer features added after the most recent release, you may need to install `aihabitat-nightly`. See Habitat-Sim's installation instructions for more details.

- pip install habitat-lab stable version:

```bash
git clone --branch stable https://github.com/facebookresearch/habitat-lab.git
cd habitat-lab
pip install -e habitat-lab  # install habitat_lab
```

- Install habitat-baselines.

The command above will install only the core of Habitat-Lab. To include habitat_baselines along with all additional requirements, use the command below after installing habitat-lab:

```bash
pip install -e habitat-baselines  # install habitat_baselines
```
Testing
- Let's download some 3D assets using Habitat-Sim's python data download utility:

  - Download (testing) 3D scenes:

    ```bash
    python -m habitat_sim.utils.datasets_download --uids habitat_test_scenes --data-path data/
    ```

    Note that these testing scenes do not provide semantic annotations.

  - Download point-goal navigation episodes for the test scenes:

    ```bash
    python -m habitat_sim.utils.datasets_download --uids habitat_test_pointnav_dataset --data-path data/
    ```
Non-interactive testing: Test the Pick task by running the example pick task script:

```bash
python examples/example.py
```

which uses `habitat-lab/habitat/config/benchmark/rearrange/skills/pick.yaml` for configuration of the task and agent. The script roughly does this:

```python
import gym
import habitat.gym

# Load embodied AI task (RearrangePick) and a pre-specified virtual robot
env = gym.make("HabitatRenderPick-v0")
observations = env.reset()
terminal = False

# Step through environment with random actions
while not terminal:
    observations, reward, terminal, info = env.step(env.action_space.sample())
```
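The loop above follows the standard Gym `reset`/`step` protocol (the pre-0.26 four-tuple API, matching the `gym >=0.22.0,<0.23.1` pin in this repo's dependencies). As a self-contained illustration, the same loop can be exercised without any simulator installed; `StubEnv` here is a stand-in written for this sketch, not part of Habitat:

```python
# Self-contained sketch of the Gym-style reset/step protocol the loop above
# relies on; StubEnv is a stand-in, not Habitat itself.
class StubEnv:
    def __init__(self, max_steps=5):
        self.max_steps = max_steps
        self.t = 0

    def reset(self):
        self.t = 0
        return {"rgb": None}  # observation placeholder

    def step(self, action):
        # Returns (observation, reward, terminal, info) -- the pre-0.26 Gym API
        self.t += 1
        terminal = self.t >= self.max_steps
        return {"rgb": None}, 0.0, terminal, {"steps": self.t}

env = StubEnv()
observations = env.reset()
terminal = False
while not terminal:
    observations, reward, terminal, info = env.step(None)
print(info["steps"])  # 5
```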
To modify some of the configurations of the environment, you can also use the `habitat.gym.make_gym_from_config` method, which creates a habitat environment from a configuration:

```python
config = habitat.get_config(
    "benchmark/rearrange/skills/pick.yaml",
    overrides=["habitat.environment.max_episode_steps=20"],
)
env = habitat.gym.make_gym_from_config(config)
```

If you want to know more about what the different configuration key overrides do, you can use this reference.
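Overrides like `habitat.environment.max_episode_steps=20` are Hydra/OmegaConf dot-path assignments. A minimal plain-Python sketch of how such a string updates a nested config (the `apply_override` helper and the config dict are illustrative, not Habitat's API):

```python
# Minimal sketch (no Hydra) of how a dot-path override string like
# "habitat.environment.max_episode_steps=20" updates a nested config dict.
def apply_override(cfg, override):
    path, _, value = override.partition("=")
    keys = path.split(".")
    node = cfg
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    # Coerce plain integers; everything else stays a string in this sketch
    node[keys[-1]] = int(value) if value.isdigit() else value
    return cfg

cfg = {"habitat": {"environment": {"max_episode_steps": 500}}}
apply_override(cfg, "habitat.environment.max_episode_steps=20")
print(cfg["habitat"]["environment"]["max_episode_steps"])  # 20
```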
See `examples/register_new_sensors_and_measures.py` for an example of how to extend habitat-lab from outside the source code.

Interactive testing: Use your keyboard and mouse to control a Fetch robot in a ReplicaCAD environment:

```bash
# Pygame for interactive visualization, pybullet for inverse kinematics
pip install pygame==2.0.1 pybullet==3.0.4

# Interactive play script
python examples/interactive_play.py --never-end
```
Use I/J/K/L keys to move the robot base forward/left/backward/right and W/A/S/D to move the arm end-effector forward/left/backward/right and E/Q to move the arm up/down. The arm can be difficult to control via end-effector control. More details in documentation. Try to move the base and the arm to touch the red bowl on the table. Have fun!
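The bindings described above can be summarized as a key-to-command table (illustrative only; the real script reads pygame events, and the command names below are hypothetical labels, not Habitat identifiers):

```python
# Illustrative mapping of the interactive_play.py controls described above.
BASE_KEYS = {"i": "base_forward", "k": "base_backward",
             "j": "base_left", "l": "base_right"}
ARM_KEYS = {"w": "ee_forward", "s": "ee_backward",
            "a": "ee_left", "d": "ee_right",
            "e": "ee_up", "q": "ee_down"}

def key_to_command(key):
    # Unbound keys fall through to a no-op
    return BASE_KEYS.get(key) or ARM_KEYS.get(key) or "noop"

print(key_to_command("i"))  # base_forward
print(key_to_command("q"))  # ee_down
```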
Note: Interactive testing currently fails on Ubuntu 20.04 with an error: X Error of failed request: BadAccess (attempt to access private resource denied). We are working on fixing this, and will update instructions once we have a fix. The script works without errors on macOS.
Debugging an environment issue
Our vectorized environments are very fast, but they are not very verbose. When using VectorEnv some errors may be silenced, resulting in process hanging or multiprocessing errors that are hard to interpret. We recommend setting the environment variable HABITAT_ENV_DEBUG to 1 when debugging (export HABITAT_ENV_DEBUG=1) as this will use the slower, but more verbose ThreadedVectorEnv class. Do not forget to reset HABITAT_ENV_DEBUG (unset HABITAT_ENV_DEBUG) when you are done debugging since VectorEnv is much faster than ThreadedVectorEnv.
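The switch can be pictured as follows; `VectorEnv` and `ThreadedVectorEnv` are the class names from the text above, while the `pick_vector_env_name` helper is illustrative, not a Habitat API:

```python
import os

# Sketch of the selection logic described above: use the verbose
# ThreadedVectorEnv when HABITAT_ENV_DEBUG=1 is set, else the fast VectorEnv.
def pick_vector_env_name():
    if os.environ.get("HABITAT_ENV_DEBUG", "0") == "1":
        return "ThreadedVectorEnv"
    return "VectorEnv"

os.environ["HABITAT_ENV_DEBUG"] = "1"   # export HABITAT_ENV_DEBUG=1
print(pick_vector_env_name())           # ThreadedVectorEnv
os.environ.pop("HABITAT_ENV_DEBUG")     # unset HABITAT_ENV_DEBUG
print(pick_vector_env_name())           # VectorEnv
```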
Documentation
Browse the online Habitat-Lab documentation and the extensive tutorial on how to train your agents with Habitat. For Habitat 2.0, use this quickstart guide.
Docker Setup
We provide docker containers for Habitat, updated approximately once per year for the Habitat Challenge. This works on machines with an NVIDIA GPU and requires users to install nvidia-docker. To set up the habitat stack using docker, follow the steps below:
Pull the habitat docker image:

```bash
docker pull fairembodied/habitat-challenge:testing_2022_habitat_base_docker
```

Start an interactive bash session inside the habitat docker:

```bash
docker run --runtime=nvidia -it fairembodied/habitat-challenge:testing_2022_habitat_base_docker
```

Activate the habitat conda environment:

```bash
conda init; source ~/.bashrc; source activate habitat
```

Run the testing scripts as above:

```bash
cd habitat-lab; python examples/example.py
```

This should print an output like:

```
Agent acting inside environment.
Episode finished after 200 steps.
```
Questions?
Can't find the answer to your question? Look through our common issues or ask the developers and community on our Discussions forum.
Datasets
Common task and episode datasets used with Habitat-Lab.
Baselines
Habitat-Lab includes reinforcement learning (via PPO) baselines. For running PPO training on sample data and more details, refer to habitat_baselines/README.md.
ROS-X-Habitat
ROS-X-Habitat (https://github.com/ericchen321/rosxhabitat) is a framework that bridges the AI Habitat platform (Habitat Lab + Habitat Sim) with other robotics resources via ROS. ROS-X-Habitat places emphasis on 1) leveraging Habitat Sim v2's physics-based simulation capability and 2) allowing roboticists to access simulation assets from ROS. The work has also been made public as a paper.
Note that ROS-X-Habitat was developed, and is maintained by the Lab for Computational Intelligence at UBC; it has not yet been officially supported by the Habitat Lab team. Please refer to the framework's repository for docs and discussions.
License
Habitat-Lab is MIT licensed. See the LICENSE file for details.
The trained models and the task datasets are considered data derived from the corresponding scene datasets.
- Matterport3D based task datasets and trained models are distributed with Matterport3D Terms of Use and under CC BY-NC-SA 3.0 US license.
- Gibson based task datasets, the code for generating such datasets, and trained models are distributed with Gibson Terms of Use and under CC BY-NC-SA 3.0 US license.
Owner
- Name: Meta Research
- Login: facebookresearch
- Kind: organization
- Location: Menlo Park, California
- Website: https://opensource.fb.com
- Repositories: 1,060
- Profile: https://github.com/facebookresearch
Committers
Last synced: 9 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Mikaël Dallaire Côté | 1****c | 145 |
| Alexander Clegg | a****g@g****m | 129 |
| Erik Wijmans | e****w@g****u | 88 |
| Vincent-Pierre BERGES | 2****e | 75 |
| Oleksandr | m****o@g****m | 51 |
| Aaron Gokaslan | a****n@f****m | 46 |
| Andrew Szot | A****t | 31 |
| Ruslan | p****2@g****m | 31 |
| Xavier Puig | x****f@g****m | 28 |
| Jimmy Yang | 5****g | 23 |
| dhruvbatra | d****a | 23 |
| JasonJiazhiZhang | 2****g | 20 |
| Eric Undersander | e****r@m****m | 19 |
| Abhishek Kadian | a****n@g****m | 16 |
| Naoki Harrison Yokoyama | n****3@i****u | 15 |
| Vladimír Vondruš | m****a@c****z | 13 |
| John Turner | 7****s@g****m | 12 |
| danielgordon10 | d****0@g****m | 9 |
| Mukul Khanna | m****a@g****m | 8 |
| Karmesh Yadav | k****v@f****m | 6 |
| Vivian Auduong | a****g@f****m | 6 |
| Karmesh Yadav | y****h@g****m | 6 |
| Santhosh Kumar Ramakrishnan | 3****2 | 6 |
| naokiyokoyama | n****7@g****m | 5 |
| Ram Ramrakhya | r****1@g****m | 5 |
| Sam Henry | h****r@m****m | 5 |
| Laikh Tewari | l****h@f****m | 4 |
| Facebook Community Bot | f****t | 4 |
| matsuren | m****n | 4 |
| Jacob Krantz | k****a@o****u | 4 |
| and 45 more... | ||
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 6 months ago
All Time
- Total issues: 168
- Total pull requests: 502
- Average time to close issues: about 1 month
- Average time to close pull requests: 16 days
- Total issue authors: 129
- Total pull request authors: 35
- Average comments per issue: 1.48
- Average comments per pull request: 0.3
- Merged pull requests: 346
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 49
- Pull requests: 130
- Average time to close issues: 24 days
- Average time to close pull requests: 8 days
- Issue authors: 41
- Pull request authors: 15
- Average comments per issue: 0.27
- Average comments per pull request: 0.29
- Merged pull requests: 83
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- zhai-create (5)
- realjoshqsun (4)
- alre5639 (4)
- zoeyliu1999 (3)
- edwardjjj (3)
- albertcity (2)
- Jinghan11 (2)
- Li-ChangHao (2)
- Yuxin916 (2)
- aleflabo (2)
- Kai-X-Org (2)
- jzhzhang (2)
- vaibhavoutat (2)
- Benson722 (2)
- liliang18273158269 (2)
Pull Request Authors
- 0mdc (213)
- aclegg3 (139)
- jimmytyyang (27)
- eundersander (24)
- xavierpuigf (20)
- henrysamer (12)
- jturner65 (8)
- danieltmeta (7)
- Ram81 (5)
- ASzot (4)
- zephirefaith (4)
- Achuthankrishna (3)
- MatthewChang (3)
- yvsriram (2)
- AnandSingh-0619 (2)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 3
- Total downloads: pypi 2,222 last-month
- Total docker downloads: 70
- Total dependent packages: 0 (may contain duplicates)
- Total dependent repositories: 5 (may contain duplicates)
- Total versions: 41
- Total maintainers: 1
pypi.org: habitat-baselines
Habitat-Baselines: Embodied AI baselines.
- Homepage: https://aihabitat.org
- Documentation: https://habitat-baselines.readthedocs.io/
- License: MIT License
- Latest release: 0.3.3 (published about 1 year ago)
Rankings
Maintainers (1)
proxy.golang.org: github.com/facebookresearch/habitat-lab
- Documentation: https://pkg.go.dev/github.com/facebookresearch/habitat-lab#section-documentation
- License: mit
- Latest release: v0.3.3 (published about 1 year ago)
Rankings
pypi.org: habitat-lab
Habitat-Lab: a modular high-level library for end-to-end development in Embodied AI.
- Homepage: https://aihabitat.org
- Documentation: https://habitat-lab.readthedocs.io/
- License: MIT License
- Latest release: 0.3.3 (published about 1 year ago)
Rankings
Maintainers (1)
Dependencies
- nvidia/cudagl 10.1-devel-ubuntu16.04 build
- lmdb >=0.98
- webdataset ==0.1.40
- ifcfg *
- moviepy >=1.0.1
- protobuf ==3.20.1
- tensorboard ==2.8.0
- torch >=1.3.1
- torchvision *
- faster_fifo ==1.4.2
- threadpoolctl ==3.1.0
- faster_fifo >=1.4.2
- threadpoolctl >=3.1.0
- attrs >=19.1.0
- gym >=0.22.0,<0.23.1
- hydra-core >=1.2.0
- imageio >=2.2.0
- imageio-ffmpeg >=0.2.0
- numba >=0.44.0
- numpy >=1.20.0
- numpy-quaternion >=2019.3.18.14.33.20
- omegaconf >=2.2.3
- opencv-python >=3.3.0
- pickle5 *
- scipy >=1.10.1
- tqdm >=4.0.0