petrack

This is a mirror. For more information/issue reporting please use the source repo:

https://github.com/ped-dyn-emp/petrack

Science Score: 49.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 14 DOI reference(s) in README
  • Academic publication links
    Links to: zenodo.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.3%) to scientific vocabulary
Last synced: 6 months ago

Repository

Basic Info
Statistics
  • Stars: 4
  • Watchers: 2
  • Forks: 3
  • Open Issues: 0
  • Releases: 0
Created over 4 years ago · Last pushed 6 months ago
Metadata Files
Readme Changelog Contributing License Code of conduct Zenodo

ReadMe.md

PeTrack

[Badges: documentation status, pipeline status, latest release, download, DOI, license, Contributor Covenant]

[PeTrack logo]

Reliable empirical data on the dynamics inside crowds are needed to improve safety and comfort for pedestrians and to design models that reflect the real dynamics. PeTrack (Pedestrian Tracking) automatically extracts accurate trajectories of marked pedestrians from video recordings. The joint trajectories of all pedestrians provide data such as velocity, flow, and density at any time and position, so extensive experimental series with large numbers of persons can be analyzed. Individual codes enable personalized trajectories carrying static information about each participant (e.g. age, gender).
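As a hedged illustration of the kind of analysis the paragraph above mentions (not part of PeTrack itself), per-pedestrian speed can be derived from an exported trajectory by finite differences, assuming positions are available as per-frame (x, y) coordinates in metres and the frame rate is known:

```python
import numpy as np

def speeds(points, fps):
    """Finite-difference speed (m/s) along one trajectory.

    points: sequence of per-frame (x, y) positions in metres.
    fps: frames per second of the recording.
    """
    pts = np.asarray(points, dtype=float)
    # displacement between consecutive frames
    step = np.diff(pts, axis=0)
    # distance per frame times frames per second gives m/s
    return np.linalg.norm(step, axis=1) * fps

# A pedestrian moving 0.05 m per frame at 25 fps walks at 1.25 m/s.
traj = [(0.05 * i, 0.0) for i in range(10)]
print(speeds(traj, fps=25))
```

Flow and density follow similarly by counting trajectories crossing a line or occupying an area per unit time; the column layout of real PeTrack exports may differ from this sketch.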

The program handles wide-angle lenses and high pedestrian densities; lens distortion and the perspective view are taken into account. The procedure comprises calibration, recognition, tracking, and height detection. Different kinds of markers (e.g. with height information, head direction, or an individual code) are implemented. With a stereo camera, more accurate height measurements and also markerless tracking are possible.
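The lens-distortion correction mentioned above is conventionally modelled with radial distortion coefficients, as in OpenCV's camera model (OpenCV is one of PeTrack's dependencies). The following is a minimal NumPy sketch of the radial part of that model applied to normalized image coordinates; the coefficients here are made up for illustration and are not PeTrack's:

```python
import numpy as np

def distort_radial(xy, k1, k2):
    """Apply the radial part of the Brown-Conrady distortion model
    to normalized image coordinates xy of shape (N, 2)."""
    xy = np.asarray(xy, dtype=float)
    # squared distance of each point from the optical axis
    r2 = np.sum(xy ** 2, axis=1, keepdims=True)
    # radial scaling factor 1 + k1*r^2 + k2*r^4
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2
    return xy * factor

# With zero coefficients the mapping is the identity.
pts = np.array([[0.1, 0.2], [-0.3, 0.4]])
print(distort_radial(pts, 0.0, 0.0))
```

Undistortion inverts this mapping (typically iteratively); in practice one would use OpenCV's calibration routines rather than hand-rolled code.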

Download & installation:

We publish installers for Windows; visit this webpage to download the current installer.

How to use PeTrack

To learn how to use PeTrack, please have a look at our documentation here.

Note that PeTrack can only be used if certain steps are taken during the planning and execution of the experiments. This includes, but is not limited to, calibration and the selection of hats; see Planning of Experiments.

After all of that, enjoy successful recognition and tracking!

[Picture of pedestrians with colored hats, with detected heads and tracked past trajectories indicated]

Some documentation is built into PeTrack itself. For example, the help menu lists all keybindings. For running in a "batch mode", PeTrack has a CLI whose options are listed by calling `petrack -help`.

Tutorial & demo

A small demo project is provided in the demo folder. It contains all necessary files for setting up a project with PeTrack. You can download these files via this link. A tutorial leading through all steps can be found in the Documentation. The intermediate project files for each step are also included in the demo folder.

How to cite

Please cite the general paper and the corresponding software version in your publications if PeTrack helps your research.

General Paper:

- Boltes, M. and Seyfried, A.: Collecting Pedestrian Trajectories. In: Neurocomputing, Special Issue on Behaviours in Video, vol. 100, pp. 127-133 (2013)

BibTeX

```
@article{BOLTES2013127,
  title    = {Collecting pedestrian trajectories},
  journal  = {Neurocomputing},
  volume   = {100},
  pages    = {127-133},
  year     = {2013},
  note     = {Special issue: Behaviours in video},
  issn     = {0925-2312},
  doi      = {10.1016/j.neucom.2012.01.036},
  author   = {Maik Boltes and Armin Seyfried},
  keywords = {Pedestrian detection, Laboratory experiment},
}
```

Software Version:

BibTeX for current version

```
@software{boltes_2025_15119517,
  author    = {Boltes, Maik and Kilic, Deniz and Schrödter, Tobias and
               Arens, Tobias and Dreßen, Luke and Adrian, Juliane and
               Boomers, Ann Katrin and Kandler, Alica and Küpper, Mira and
               Graf, Arne and Salden, Daniel and Brualla, Ricardo Martin and
               Häger, Paul and Hillebrand, Daniel and Lieberenz, Paul and
               Klein, Janine},
  title     = {PeTrack},
  month     = apr,
  year      = 2025,
  publisher = {Zenodo},
  version   = {v1.0},
  doi       = {10.5281/zenodo.15119517},
  url       = {https://doi.org/10.5281/zenodo.15119517},
}
```

To find the version you are using, check the About menu or the terminal output of PeTrack. To cite PeTrack as software without specifying a version, use the DOI 10.5281/zenodo.5078176.

License

This project is licensed under the terms of the GPLv3 license. For further information see LICENSE.

Changelog:

See CHANGELOG.md

Dependencies:

  • Qwt (https://qwt.sf.net)
  • Qt (https://www.qt.io/)
  • OpenCV (https://opencv.org/)

Owner

  • Name: ped-dyn-emp
  • Login: ped-dyn-emp
  • Kind: organization

GitHub Events

Total
  • Release event: 3
  • Watch event: 1
  • Member event: 1
  • Issue comment event: 5
  • Push event: 65
  • Pull request review event: 2
  • Pull request review comment event: 1
  • Pull request event: 7
  • Create event: 1
Last Year
  • Release event: 3
  • Watch event: 1
  • Member event: 1
  • Issue comment event: 5
  • Push event: 65
  • Pull request review event: 2
  • Pull request review comment event: 1
  • Pull request event: 7
  • Create event: 1

Dependencies

container/ubuntu/Dockerfile docker
  • ubuntu 22.04 build