mot_with_pmmm

A novel Position and Multi-step Memory Matching (PMMM) module to enhance long-term association accuracy by integrating positional cues with multi-frame historical context.

https://github.com/huangsir12/mot_with_pmmm

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.9%) to scientific vocabulary
Last synced: 6 months ago

Repository


Basic Info
  • Host: GitHub
  • Owner: Huangsir12
  • License: gpl-3.0
  • Language: Python
  • Default Branch: master
  • Size: 13.1 MB
Statistics
  • Stars: 0
  • Watchers: 0
  • Forks: 0
  • Open Issues: 2
  • Releases: 0
Created 9 months ago · Last pushed 7 months ago
Metadata Files
Readme Funding License Citation

README.md

MOTWITHPMMM


Position and Multi-step Memory Matching (PMMM) module for enhancing long-term association accuracy in Multi-Object Tracking (MOT).

📌 Overview

This repository contains the implementation of PMMM - a novel module that integrates positional cues with multi-frame historical context to improve long-term association accuracy in multi-object tracking scenarios.

Key features:
  • Position-aware memory matching
  • Multi-step historical context integration
  • Enhanced long-term association
  • [Add other key features]
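The repository does not spell out its matching rule, but the core idea of fusing positional cues with a multi-frame appearance memory can be sketched as follows. This is a minimal NumPy illustration, not the module's actual implementation: the weight `alpha`, the center-distance scale, and the use of mean cosine similarity over the memory are all assumptions.

```python
import numpy as np

def fused_cost(track_boxes, track_memories, det_boxes, det_feats, alpha=0.5):
    """Combine a positional cost (box-center distance) with an appearance cost
    averaged over each track's stored history of feature vectors."""
    n_t, n_d = len(track_boxes), len(det_boxes)
    cost = np.zeros((n_t, n_d))
    for i in range(n_t):
        tc = track_boxes[i][:2]                      # positional cue: box center
        mem = np.mean(track_memories[i], axis=0)     # multi-step memory: mean feature
        mem = mem / (np.linalg.norm(mem) + 1e-12)
        for j in range(n_d):
            dc = det_boxes[j][:2]
            pos_cost = np.linalg.norm(tc - dc) / 100.0   # scale chosen arbitrarily here
            f = det_feats[j] / (np.linalg.norm(det_feats[j]) + 1e-12)
            app_cost = 1.0 - float(mem @ f)              # 1 - cosine similarity
            cost[i, j] = alpha * pos_cost + (1 - alpha) * app_cost
    return cost
```

A cost matrix like this would typically be fed to a Hungarian or greedy assignment step; low entries pair a detection with the track whose recent positions and remembered appearance it best matches.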

Quick Start

Prerequisites

  • Python 3.8+
  • PyTorch 2.0+
  • [Other dependencies]

Installation

```bash
git clone https://github.com/Huangsir12/MOT_WITH_PMMM.git
cd MOT_WITH_PMMM
conda create -n pmmm python=3.9
conda activate pmmm   # activate the environment before installing
pip install -r requirements.txt
```

🛠️ Usage

Training

First, train the appearance feature representation (ReID) model:

```bash
conda activate pmmm
cd bpbreid/torchreid
python ./scripts/main.py --config-file configs/bpbreid/bpbreid_train.yaml
```

Then train the detection model. Train YOLOv10x on the COCO8 dataset (or another custom dataset in YOLO format) for 100 epochs at image size 640. The training device can be specified with the `device` argument; if none is passed, GPU `device=0` is used when available, otherwise `device='cpu'`. See the Arguments section below for the full list of training arguments.

```bash
# Build a new model from YAML and start training from scratch
yolo detect train data=coco8.yaml model=yolo11n.yaml epochs=100 imgsz=640

# Start training from a pretrained *.pt model
yolo detect train data=coco8.yaml model=yolo11n.pt epochs=100 imgsz=640

# Build a new model from YAML, transfer pretrained weights to it, and start training
yolo detect train data=coco8.yaml model=yolo11n.yaml pretrained=yolo11n.pt epochs=100 imgsz=640
```

Evaluation

Evaluate a combination of detector, tracking method, and ReID model on a standard MOT dataset or your own custom one. The detection model, ReID model, tracking method, and benchmark (MOT17, MOT20, or a custom dataset) can all be changed:

```bash
python tracking/val.py --yolo-model WEIGHTS/your_trained.pt --reid-model WEIGHTS/your_trained.pt --tracking_method botsort --source DATA/datasets/Emporium/train --ues_pmmm True
```

Inference

Track without the PMMM module:

```bash
cd MOT_WITH_PMMM
python tracking/track.py --yolo-model WEIGHTS/your_trained.pt --reid-model WEIGHTS/your_trained.pt --tracking_method botsort
```

Track with the PMMM module:

```bash
python tracking/track_with_pmmm.py --yolo-model WEIGHTS/your_trained.pt --reid-model WEIGHTS/your_trained.pt --tracking_method botsort
```

🧩 PMMM Module Architecture

(Figure: PMMM architecture diagram)

The PMMM module consists of:
  • Position-aware Branch
  • Multi-step Memory Bank
  • Attention-based Cross-frame Matching mechanism
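The three components above are not documented in detail in this README. As a rough illustration, a multi-step memory bank that stores recent (position, feature) pairs per track and scores new detections against that history might look like the sketch below; the class name, method names, and the distance-discounted scoring rule are hypothetical, not taken from the repository.

```python
from collections import deque
import numpy as np

class MultiStepMemoryBank:
    """Keeps the last `max_steps` (center, feature) pairs for each track
    and scores a new detection against the whole stored history."""

    def __init__(self, max_steps=5):
        self.max_steps = max_steps
        self.memory = {}  # track_id -> deque of (center, feature)

    def update(self, track_id, center, feature):
        # A bounded deque gives FIFO behavior: old steps fall out automatically.
        if track_id not in self.memory:
            self.memory[track_id] = deque(maxlen=self.max_steps)
        self.memory[track_id].append(
            (np.asarray(center, float), np.asarray(feature, float)))

    def score(self, track_id, det_center, det_feature):
        """Higher is better: mean cosine similarity over the memory,
        discounted by the distance to the most recent stored position."""
        hist = self.memory[track_id]
        f = np.asarray(det_feature, float)
        f = f / (np.linalg.norm(f) + 1e-12)
        sims = [float((g / (np.linalg.norm(g) + 1e-12)) @ f) for _, g in hist]
        dist = np.linalg.norm(hist[-1][0] - np.asarray(det_center, float))
        return float(np.mean(sims)) / (1.0 + dist)
```

The fixed-size deque is one simple way to realize "multi-step" memory: it trades a little recall of very old appearances for robustness against a single bad frame corrupting the track's representation.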

📈 Performance

Benchmark Results

| Dataset | MOTA ↑ | IDF1 ↑ | FP ↓ | FN ↓ | IDs ↓ |
|---------|--------|--------|------|------|-------|
| MOT17   | xx     | xx     | xx   | xx   | xx    |
| MOT20   | xx     | xx     | xx   | xx   | xx    |
| Custom  | xx     | xx     | xx   | xx   | xx    |
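For reference, MOTA aggregates the three error counts reported in the table. This is the standard CLEAR-MOT definition, not anything specific to this repository:

```python
def mota(fn, fp, ids, num_gt):
    """CLEAR-MOT accuracy: 1 - (misses + false positives + ID switches) / GT boxes.
    `num_gt` is the total number of ground-truth boxes across all frames."""
    return 1.0 - (fn + fp + ids) / num_gt

# e.g. 100 misses, 50 false positives, 10 ID switches over 1000 GT boxes:
print(mota(100, 50, 10, 1000))  # 0.84
```

Note that MOTA can be negative when the error counts exceed the number of ground-truth boxes, and it weights ID switches no more heavily than any other error, which is why IDF1 is reported alongside it.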

🔗 References

This project builds upon these excellent works:

  1. FairMOT (GitHub)

    • Used the JDE-based framework as our baseline
    • Modified the original detection head implementation
  2. TransTrack (GitHub)

    • Adapted parts of the attention mechanism
    • Inspired our memory bank design

We sincerely thank the original authors for their work.

📜 Citation

If you use this work in your research, please cite:

```bibtex
@article{yourcitation,
  title={MOTWITHPMMM: Position and Multi-step Memory Matching for Long-term Association},
  author={Your Name},
  journal={Journal or Conference Name},
  year={2023}
}
```

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

✉️ Contact

For questions or suggestions, please contact: huangming -@qq.com

Project Link: https://github.com/Huangsir12/MOTWITHPMMM

Owner

  • Name: HuangBaoming
  • Login: Huangsir12
  • Kind: user

My name is HuangBaoming. I hold a master's degree in Management Science and Engineering from Harbin Institute of Technology, and my research interests are NLP and CV.

Citation (CITATION.cff)

cff-version: 12.0.8
preferred-citation:
  type: software
  message: "If you use Yolo Tracking, please cite it as below."
  authors:
  - family-names: Broström
    given-names: Mikel
  title: "BoxMOT: pluggable SOTA tracking modules for object detection, segmentation and pose estimation models"
  version: 12.0.8
  doi: https://zenodo.org/record/7629840
  date-released: 2024-6
  license: AGPL-3.0
  url: "https://github.com/mikel-brostrom/boxmot"

GitHub Events

Total
  • Delete event: 4
  • Issue comment event: 5
  • Push event: 8
  • Pull request event: 17
  • Create event: 13
Last Year
  • Delete event: 4
  • Issue comment event: 5
  • Push event: 8
  • Pull request event: 17
  • Create event: 13

Dependencies

.github/workflows/benchmark.yml actions
  • actions/cache v4 composite
  • actions/checkout v4 composite
  • actions/download-artifact v4 composite
  • actions/setup-python v5 composite
  • actions/upload-artifact v4 composite
  • peter-evans/create-pull-request v7 composite
.github/workflows/ci.yml actions
  • actions/checkout v4 composite
  • actions/setup-python v5 composite
.github/workflows/docker.yml actions
  • actions/checkout v4 composite
  • docker/build-push-action v6 composite
  • docker/login-action v3 composite
.github/workflows/publish.yml actions
  • actions/checkout v4 composite
  • actions/create-release v1 composite
  • actions/setup-python v5 composite
.github/workflows/stale.yml actions
  • actions/stale v9 composite
.github/workflows/update.yml actions
  • actions/checkout v4 composite
  • actions/setup-python v5 composite
  • peter-evans/create-pull-request v7 composite
.github/workflows/your-workflow.yml actions
Dockerfile docker
  • pytorch/pytorch 2.3.1-cuda11.8-cudnn8-runtime build
bpbreid/.ipynb_checkpoints/requirements-checkpoint.txt pypi
  • Cython *
  • Pillow *
  • albumentations *
  • deepdiff *
  • flake8 *
  • future *
  • gdown *
  • h5py *
  • isort *
  • matplotlib *
  • monai *
  • numpy *
  • opencv-python *
  • pandas *
  • scipy *
  • six *
  • tabulate *
  • tb-nightly *
  • torchmetrics ==0.10.3
  • wandb *
  • yacs *
  • yapf *
bpbreid/.ipynb_checkpoints/requirements_labels-checkpoint.txt pypi
  • openpifpaf *
bpbreid/docs/requirements.txt pypi
  • recommonmark *
  • sphinx *
  • sphinx-markdown-tables *
  • sphinx-rtd-theme *
  • sphinxcontrib-napoleon *
  • sphinxcontrib-websupport *
bpbreid/pyproject.toml pypi
bpbreid/requirements.txt pypi
  • Cython *
  • Pillow *
  • albumentations *
  • deepdiff *
  • flake8 *
  • future *
  • gdown *
  • h5py *
  • isort *
  • matplotlib *
  • monai *
  • numpy *
  • opencv-python *
  • pandas *
  • scipy *
  • six *
  • tabulate *
  • tb-nightly *
  • torchmetrics ==0.10.3
  • wandb *
  • yacs *
  • yapf *
bpbreid/requirements_labels.txt pypi
  • openpifpaf *
bpbreid/setup.py pypi
bpbreid/torchreid/metrics/rank_cylib/setup.py pypi
pyproject.toml pypi
uv.lock pypi
  • 133 dependencies