OfflineMOT
OfflineMOT: A Python package for multiple objects detection and tracking from bird view stationary drone videos - Published in JOSS (2022)
Science Score: 100.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ✓ DOI references: found 7 DOI reference(s) in README and JOSS metadata
- ✓ Academic publication links: links to joss.theoj.org, zenodo.org
- ✓ Committers with academic emails: 2 of 4 committers (50.0%) from academic institutions
- ○ Institutional organization owner
- ✓ JOSS paper metadata: published in Journal of Open Source Software
Keywords
Scientific Fields
Repository
Multiple traffic entities detection and tracking from bird-view drone stationary videos https://engyasin.github.io/Offline_MOT/
Basic Info
Statistics
- Stars: 12
- Watchers: 1
- Forks: 2
- Open Issues: 1
- Releases: 12
Topics
Metadata Files
Readme.md
Multiple objects detection and tracking from bird view stationary drone videos
OfflineMOT is a package for multi-object tracking from bird's-eye-view stationary videos. Accuracy has priority over runtime in this package, so it is better suited for offline processing than for real-time applications, hence the name of the package.
Update 21/2/2023: New changes have been made to the tracking, such as using GOTURN and the option to use a Kalman filter. Unfortunately, the documentation is not updated yet, but you can control the new options from the configuration file.
Update 27/3/2023: The ability for manual intervention by pressing the s key to save the current tracking state has been added. This is useful when the tracking is not working well and you want to correct it manually. The documentation is not yet updated.
A pretrained YOLO network is used for detection in this package, and it should be trained separately. The network included with this library is YOLOv4 in a PyTorch format (trained to detect pedestrians, cyclists and cars). The loading and running of the YOLO model is done with the help of scripts taken from this project (all of them are in the offlinemot/tool subfolder).
For training YOLOv4 from scratch, the same repo is useful. Additionally, the Darknet code can be used to train YOLO, and the resulting trained file can then be converted to the PyTorch .pth format as described in the docs.
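As a rough sketch of that conversion step (assuming the darknet2pytorch helper shipped with the referenced pytorch-YOLOv4 tools; the import path and file names here are assumptions, not part of this package's documented API):

```python
import torch
from tool.darknet2pytorch import Darknet  # helper from the pytorch-YOLOv4 tools (assumed import path)

# Load the Darknet-trained configuration and weights,
# then re-save the weights in .pth format for use with OfflineMOT.
model = Darknet('yolov4-custom.cfg')          # placeholder Darknet config
model.load_weights('yolov4-custom.weights')   # placeholder Darknet weights
torch.save(model.state_dict(), 'yolov4-custom.pth')
```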
Example output for a sample video, taken from the Vehicle-Crowd Interaction (VCI) - CITR Dataset:

This example shows some minor problems because the scene, the background and the drone's camera are outside the detection network's training set (never seen before by the detection network).
However, the application of this project (including the YOLO network training) was targeted at a cyclists dataset of videos [to be cited later].
Installation
The package can be installed on Python 3.x simply using the pip command:
pip install offlinemot
For developers
To work on the latest development branch, one can simply clone the repo
git clone https://github.com/engyasin/Offline_MOT
and then from the repo root run:
pip install -e .
Citation Info
If you use this software in your work, please cite it as follows:
@article{Yousif2022,
doi = {10.21105/joss.04099},
url = {https://doi.org/10.21105/joss.04099},
year = {2022},
publisher = {The Open Journal},
volume = {7},
number = {74},
pages = {4099},
author = {Yasin Maan Yousif and Awad Mukbil and Jörg P. Müller},
title = {OfflineMOT: A Python package for multiple objects detection and tracking from bird view stationary drone videos},
journal = {Journal of Open Source Software}
}
Documentation
The documentation includes some examples and guides for running this package, and it is available at https://engyasin.github.io/Offline_MOT/
Tutorials in Jupyter notebook format are also available here
A technical report can be found here
Getting Started
After installing the library, and in order to test the example provided with it, the following lines can be used as Python commands:
```python
In [1]: import offlinemot

In [2]: from offlinemot.config import configs

In [3]: cfg = configs() # if you have an available configuration file '.ini', you can pass it

In [4]: cfg.print_summary() # show the current values and sections

In [5]: cfg['detect_every_N'] = 3

In [6]: cfg.print_section('Detection') # show the parameters of a single section

In [7]: cfg['detect_thresh'] = 15

In [8]: offlinemot.core.extract_paths(config=cfg) # no input to run the example video

In [9]: cfg.write('newconfigfile.ini') # to be loaded for similar videos
```

Lines 2 through 7 import and change the parameters for running the program.
The first time line 8 is run, the example network model will be downloaded (around 250MB), and a window showing the example video with the tracked objects will appear.
Line 9 saves the current set of parameters into the provided file name.
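That saved configuration file can then be loaded back for similar videos; a small sketch, assuming the file written in line 9 above:

```python
from offlinemot.config import configs

# reload the parameters saved earlier with cfg.write('newconfigfile.ini')
cfg = configs('newconfigfile.ini')
cfg.print_summary()  # verify the loaded values and sections
```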
The tracked objects will be surrounded by boxes in 5 different colors. Each color has a specific meaning:
- Green: Pedestrian is detected.
- Blue: Cyclist is detected.
- Black: Car is detected.
- Red: The tracked object has failed the tracking step for the current frame.
- White: The object is moving but still not classified to a class.
Of course, for a different case, the colors can be changed from the configs class attribute (colors_map). This also depends on the number of classes to predict.
To control this parameter and many others, the values can be assigned to the configs instance:
```python
cfg = configs() #input can be named file, or empty for the default values.
cfg['colors_map'] = [(255,0,0),(0,0,255),(0,255,0)]
```
Note: It's highly recommended to set all the parameters when running on a new video. A detailed description of their meaning is available in the config file. Additionally, a complete example of parameter tuning is available in the documentation here
Running
Then, to run it on a new video, the command is:

```python
offlinemot.core.extract_paths('path_to_video', config=cfg)
```

[directory of the videos, leave empty to run the example video]

To show the result on the same video after post-processing, use the command:

```python
offlinemot.show_results.show_result('path_to_same_video', config=cfg)
```

[directory of the videos, leave empty to run the example video]
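Putting both steps together, a minimal end-to-end sketch for a new video could look like the following (the video path is a placeholder, and the function and parameter names follow the usage above):

```python
from offlinemot.config import configs
from offlinemot import core, show_results

cfg = configs()              # or configs('newconfigfile.ini') to reuse saved parameters
cfg['detect_every_N'] = 3    # example value: run the detector every 3 frames

video = 'my_drone_video.mp4' # placeholder path to a bird's-eye-view stationary video

core.extract_paths(video, config=cfg)        # detection + tracking pass over the whole video
show_results.show_result(video, config=cfg)  # replay the video with the post-processed tracks
```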
Finally, to change the YOLO network used in the package, the complete paths to 3 files need to be assigned through the configs class, as shown in the sketch after this list:
- .pth for the model weights: cfg['model_name'] = 'directory'
- .cfg for the YOLO configuration: cfg['model_config'] = 'directory'
- .names for a simple text file containing the names of the classes: cfg['class_file_name'] = 'directory'
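For example, a short sketch of pointing the package at a custom network (the file paths below are placeholders):

```python
from offlinemot.config import configs
from offlinemot import core

cfg = configs()  # start from the default parameters

# placeholder paths; point them at your own trained files
cfg['model_name'] = 'models/custom_yolov4.pth'          # model weights
cfg['model_config'] = 'models/custom_yolov4.cfg'        # YOLO configuration
cfg['class_file_name'] = 'models/custom_classes.names'  # class names, one per line

core.extract_paths('path_to_video', config=cfg)
```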
Use cases
This project can be used for:
Traffic trajectory extraction from videos (it was originally built to extract trajectories for a cyclists dataset, recorded at TU Clausthal, for traffic modelling research).
Tracking other objects (like animals) from bird's eye view in an offline manner.
Testing
There are a number of unit tests for this project. If development of the package is intended, they can be run after cloning this repo with the command:
$ pytest -v ./offlinemot/tests
The pytest library needs to be installed for the previous command.
Additionally, upon pushes and pull requests, a GitHub action runs pytest as well.
Support
If you would like to contribute a feature or a bug fix, please take a look at the contribution instructions page. It has further details.
Alternatively, you can contribute by creating an issue for a problem encountered when running the program. If your issue is about the accuracy of the results (such as not detecting or failing to track some objects), please tag the issue with logic error. Please also attach some images or GIF files showing how the error happened during and after running the video.
Stars
Please star this repository if you find it useful or use it as part of your research.
License
OfflineMOT is free software and is licensed under the MIT License. Copyright (c) 2022, Yasin Yousif
Owner
- Name: Yasin M. Yousif
- Login: engyasin
- Kind: user
- Website: https://engyasin.github.io/
- Twitter: YasinYousif001
- Repositories: 3
- Profile: https://github.com/engyasin
JOSS Publication
OfflineMOT: A Python package for multiple objects detection and tracking from bird view stationary drone videos
Authors
Institut für Informatik, Technische Universität Clausthal 38678, Clausthal-Zellerfeld, Germany
Tags
Multiple objects tracking · Traffic trajectories · Objects detection · Drone video analysis
Citation (CITATION.cff)
# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!
cff-version: 1.2.0
title: >-
  OfflineMOT: A Python Package for multiple objects
  detection and tracking from bird view stationary
  drone videos
message: 'If you use this software, please cite it as below.'
type: software
authors:
  - given-names: Yasin
    name-particle: Maan
    family-names: Yousif
    email: yy33@tu-clausthal.de
    orcid: 'https://orcid.org/0000-0002-5282-7259'
    affiliation: >-
      Institut für Informatik, Technische Universität
      Clausthal
  - given-names: Awad
    family-names: Mukbil
    affiliation: >-
      Institut für Informatik, Technische Universität
      Clausthal
  - given-names: Jörg
    name-particle: P.
    family-names: Müller
    affiliation: >-
      Institut für Informatik, Technische Universität
      Clausthal
identifiers:
  - type: doi
    value: 10.5281/zenodo.6569417
repository-code: 'https://github.com/engyasin/Offline_MOT'
url: 'https://github.com/engyasin/Offline_MOT'
GitHub Events
Total
- Watch event: 1
Last Year
- Watch event: 1
Committers
Last synced: 5 months ago
Top Committers
| Name | Email | Commits |
|---|---|---|
| Yasin Yousif | y****f@t****e | 84 |
| Yasin M. Yousif | e****f@h****m | 1 |
| Hugo Ledoux | h****x@t****l | 1 |
| Arfon Smith | a****n | 1 |
Committer Domains (Top 20 + Academic)
Issues and Pull Requests
Last synced: 4 months ago
All Time
- Total issues: 1
- Total pull requests: 2
- Average time to close issues: N/A
- Average time to close pull requests: about 4 hours
- Total issue authors: 1
- Total pull request authors: 2
- Average comments per issue: 5.0
- Average comments per pull request: 0.0
- Merged pull requests: 2
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 0
- Pull requests: 0
- Average time to close issues: N/A
- Average time to close pull requests: N/A
- Issue authors: 0
- Pull request authors: 0
- Average comments per issue: 0
- Average comments per pull request: 0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- M-Colley (1)
Pull Request Authors
- arfon (1)
- hugoledoux (1)
Top Labels
Issue Labels
Pull Request Labels
Packages
- Total packages: 1
- Total downloads: 12 last-month (pypi)
- Total dependent packages: 0
- Total dependent repositories: 1
- Total versions: 7
- Total maintainers: 1
pypi.org: offlinemot
- Homepage: https://github.com/engyasin/Offline_MOT
- Documentation: https://offlinemot.readthedocs.io/
- License: MIT
- Latest release: 1.2.1 (published almost 3 years ago)
Rankings
Maintainers (1)
Dependencies
- gdown *
- numpy *
- opencv-contrib-python *
- scikit-image *
- scipy *
- torch *
- gdown *
- get *
- numpy *
- opencv-contrib-python *
- scikit-image *
- scipy *
- torch *
- actions/checkout v2 composite
- actions/upload-artifact v1 composite
- openjournals/openjournals-draft-action master composite
- actions/checkout v2 composite
- actions/setup-python v2 composite
- codecov/codecov-action v2 composite