radtrack
Dataset for Range-Azimuth-Doppler Based Radar Multi-Object Tracking
Science Score: 67.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 1 DOI reference(s) in README
- ✓ Academic publication links: Links to arxiv.org, ieee.org
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (7.7%) to scientific vocabulary
Keywords
Repository
Basic Info
Statistics
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
RADTrack
RADTrack is a modified version of the RADDet dataset that incorporates the Multi-Object Tracking (MOT) use case. RADTrack reorganizes the RADDet data into sequences, each corresponding to the time and location at which the data was recorded. Labeled object IDs allow these objects to be tracked using Range-Azimuth-Doppler (RAD) data.
With the RADTrack dataset, you can:
- Train and evaluate object detection models on Range-Azimuth, Range-Doppler, and Cartesian radar data
- Train and evaluate object classification models on Range-Azimuth, Range-Doppler, and Cartesian radar data
- Train and evaluate object tracking models on Range-Azimuth, Range-Doppler, and Cartesian radar data
Paper
- This dataset is introduced and used by RadarMOTR, which was accepted at the 2024 International Radar Conference (RADAR).
- Accepted Preprint
- IEEE Xplore Version
Radar (stationary) Dataset for Dynamic Road Users
Dataset link
Dataset format
The folder structure and additional metadata in RADTrack are designed to mirror the MOT20 format. This similarity enables seamless integration with standard evaluation tools like TrackEval, requiring only minor adjustments to the ground truth (GT) data format.
|-- {RADTrack ROOT}
| |-- radtrack-train
| | |-- radtrack0001
| | | |-- RAD
| | | | |-- 000001.npy
| | | | |-- 000002.npy
| | | | |-- ...
| | | |-- stereo_image
| | | | |-- 000001.jpg
| | | | |-- 000002.jpg
| | | | |-- ...
| | | |-- gt
| | | | |-- gt.json
| | | |-- frame_mapping.txt
| | | |-- seqinfo.ini
| | |-- ...
| |-- radtrack-val
| | |-- ...
| |-- radtrack-test
| | |-- ...
| |-- seqmaps
| | |-- radtrack-train.txt
| | |-- radtrack-val.txt
| | |-- radtrack-test.txt
| |-- sensors_para
| | |-- registration_matrix
| | | |-- ...
| | |-- stereo_para
| | | |-- ...
| | |-- radar_config.json
Dataset details
The RADTrack dataset comprises 10,158 frames, organized into 24 sequences.
To ensure consistency, the same radar configuration was used throughout the entire data collection process.
The specifics of the data capture settings are summarized below and can also be found in the sensors_para/radar_config.json file.
```jsonc
{
    "designed_frequency": 76.8, // GHz
    "config_frequency": 77, // GHz
    "maximum_range": 50, // m
    "range_size": 256,
    "azimuth_size": 256,
    "doppler_size": 64,
    "range_resolution": 0.1953125, // m/bin
    "angular_resolution": 0.006135923, // radian/bin
    "velocity_resolution": 0.41968030701528203 // (m/s)/bin
}
```
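As an illustration of how these resolutions relate bins to physical quantities, the sketch below converts a RAD bin index into range, azimuth, and radial velocity. The zero-offset conventions (range starting at 0 m, azimuth and Doppler axes centered) are assumptions for illustration, not taken from the dataset documentation:

```python
# Sketch: map a (range_bin, azimuth_bin, doppler_bin) index to physical units
# using the resolutions from sensors_para/radar_config.json.
# Assumed conventions (not documented here): range starts at 0 m; the azimuth
# and Doppler axes are centered, i.e. bin size/2 corresponds to 0.

RANGE_RESOLUTION = 0.1953125                # m/bin
ANGULAR_RESOLUTION = 0.006135923            # radian/bin
VELOCITY_RESOLUTION = 0.41968030701528203   # (m/s)/bin
AZIMUTH_SIZE = 256
DOPPLER_SIZE = 64

def bin_to_physical(range_bin: int, azimuth_bin: int, doppler_bin: int):
    """Return (range in m, azimuth in rad, radial velocity in m/s)."""
    r = range_bin * RANGE_RESOLUTION
    az = (azimuth_bin - AZIMUTH_SIZE / 2) * ANGULAR_RESOLUTION
    v = (doppler_bin - DOPPLER_SIZE / 2) * VELOCITY_RESOLUTION
    return r, az, v

print(bin_to_physical(128, 128, 32))  # -> (25.0, 0.0, 0.0), the cube center
```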
The dataset consists of 6 classes and various input and ground truth formats. Below is a summary of the information stored in the dataset.
- RAD: 3D-FFT radar data stored as a matrix of `complex64` numbers with dimensions (256, 256, 64) in NumPy format.
- stereo_image: Two rectified stereo images.
- gt: A JSON file containing a list of dictionaries. Each dictionary corresponds to a specific frame and includes a list of entries, one per radar object present in that frame. The dictionary has the following keys:
  - `classes`: A list of class labels. There are 6 classes: `person`, `bicycle`, `car`, `motorcycle`, `bus`, and `truck`.
  - `boxes`: A list of bounding box coordinates in the format `[x_center, y_center, z_center, w, h, d]`, where `x` represents the Range axis, `y` the Angle axis, and `z` the Doppler axis.
  - `cart_boxes`: A list of Cartesian bounding box coordinates in the format `[y_center, x_center, h, w]`.
  - `ids`: A list of object IDs.
- sensors_para: Includes `stereo_para` for stereo depth estimation and `registration_matrix` for cross-sensor registration.
- seqmaps: A CSV file for each split, indicating which sequences are included.
- frame_mapping.txt: A CSV file containing a mapping of matching frames between the RADDet and RADTrack datasets for backwards compatibility.
- seqinfo.ini: A file containing meta information about each sequence, such as its length.
> [!NOTE]
> The `stereo_para` folder includes `left_maps.npy` and `right_maps.npy`, which are derived from `cv2.initUndistortRectifyMap(...)` and contain maps in both `x` and `y` directions. All other matrices are derived from `cv2.stereoRectify(...)`.
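Based on the directory layout and `gt.json` format described above, a single frame and its labels can be loaded roughly as follows. This is a sketch: `load_frame` is a hypothetical helper, and the assumption that frame `000001.npy` corresponds to the first entry of `gt.json` is inferred from the layout, not stated by the dataset documentation.

```python
import json
from pathlib import Path

import numpy as np

def load_frame(sequence_dir: Path, frame_idx: int):
    """Load the RAD cube and ground-truth entry for one frame of a sequence.

    Assumes the layout described above: RAD/<frame>.npy holds the complex64
    (256, 256, 64) cube and gt/gt.json holds one dict per frame. Frame files
    are assumed to be 1-based (000001.npy maps to gt[0]).
    """
    rad = np.load(sequence_dir / "RAD" / f"{frame_idx:06d}.npy")
    with open(sequence_dir / "gt" / "gt.json") as f:
        gt = json.load(f)  # list of per-frame dicts
    return rad, gt[frame_idx - 1]

# Example (paths are illustrative):
# rad, labels = load_frame(Path("radtrack-train/radtrack0001"), 1)
# rad.shape -> (256, 256, 64); labels["classes"], labels["boxes"], labels["ids"]
```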
Dataset splits
The dataset is divided into three splits: 70% for training, 20% for validation, and 10% for testing.
If you need to modify these splits, you can merge them by copying the sequence directories into the desired split folder and updating the corresponding seqmaps files.
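After moving sequence directories around, each seqmaps file must list exactly the sequences present in its split folder. A small sketch of regenerating one, assuming the MOT-style seqmaps layout (a `name` header followed by one sequence name per line, as consumed by TrackEval); `write_seqmap` is a hypothetical helper, not part of the dataset tooling:

```python
from pathlib import Path

def write_seqmap(split_dir: Path, seqmap_path: Path) -> list[str]:
    """Write a MOT-style seqmap listing every sequence directory in a split.

    Assumes the seqmap format is a 'name' header followed by one sequence
    name per line.
    """
    # Every subdirectory of the split folder is treated as one sequence.
    sequences = sorted(p.name for p in split_dir.iterdir() if p.is_dir())
    seqmap_path.write_text("name\n" + "\n".join(sequences) + "\n")
    return sequences

# Example (paths are illustrative):
# write_seqmap(Path("radtrack-train"), Path("seqmaps/radtrack-train.txt"))
```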
Statistics
| Statistics                        | Test  | Train | Validation | Overall |
| --------------------------------- | :---: | :---: | :--------: | :-----: |
| Frames                            | 585   | 7528  | 2045       | 10158   |
| Objects                           | 88    | 1313  | 295        | 1696    |
| Frames with one object            | 104   | 1273  | 499        | 1876    |
| Frames with two objects           | 166   | 1918  | 758        | 2842    |
| Frames with three objects         | 116   | 2071  | 441        | 2628    |
| Frames with four objects          | 87    | 1249  | 238        | 1574    |
| Frames with five objects          | 85    | 694   | 71         | 850     |
| Frames with six objects           | 26    | 252   | 35         | 313     |
| Frames with seven or more objects | 1     | 71    | 3          | 75      |
| Classes     | Test  | Train | Validation | Overall |
| ----------- | :---: | :---: | :--------: | :-----: |
| cars        | 64    | 926   | 203        | 1193    |
| trucks      | 13    | 202   | 45         | 260     |
| buses       | 0     | 10    | 1          | 11      |
| bicycles    | 2     | 24    | 9          | 35      |
| persons     | 9     | 153   | 36         | 198     |
| motorcycles | 0     | 4     | 1          | 5       |
Dataset license
The tools and code are licensed under MIT. The dataset is licensed under Creative Commons Attribution 4.0 International License.
RADTrack Labeling/Viewing Tool
The labeling tool for annotating object IDs and classes is located in the labeling_tool folder. It can also be used as a dataset viewer.
Installation
Requirements:
1. Clone repository

   ```sh
   git clone https://github.com/madeit07/RADTrack.git
   cd RADTrack/labeling_tool
   ```

2. Create virtual environment (e.g. using conda)

   ```sh
   conda create -n radtrack python=3.11
   conda activate radtrack
   ```

3. Install other requirements

   ```sh
   pip install -r requirements.txt
   ```

4. (Optional) If you want to use the scripts in the `tools` folder, install those requirements as well:

   ```sh
   pip install -r ../tools/requirements.txt
   conda install ffmpeg # Only for tools/visualize.py
   ```
Run Labeling Tool
Execute the Python script in a console:

```sh
python radtrack_labeler.py
```
Under File -> New..., create a new project.
After the project is opened, all RAD data is processed and cached so that scrubbing through the images is responsive. This is done only once and can take a long time.
Default project
If you don't want to load the project every time you start the tool, you can set a default project which gets automatically loaded on startup.
Create a new project and configure it. Name it `Default` and save it under `labeling_tool\projects\`.
Keybinds
Playback Controls:
| Action              | Keybind | Alternative Keybind |
| ------------------- | :-----: | :-----------------: |
| Next frame          | D       | →                   |
| Previous frame      | A       | ←                   |
| Jump to first frame | Q       | Home                |
| Jump to last frame  | E       | End                 |
| Next sequence       | S       | ↓                   |
| Previous sequence   | W       | ↑                   |
Labeling:
> [!WARNING]
> Alternative keybinds for playback control do not work while an object is selected. They are used to control the ID field.
| Action                                     | Keybind            |
| ------------------------------------------ | :----------------: |
| Select next object                         | F                  |
| Select previous object                     | Ctrl + F           |
| Deselect object                            | R                  |
| Set ID in current frame                    | Enter              |
| Set ID in all frames                       | Ctrl + Enter       |
| Set ID in current and all following frames | Alt + Enter        |
| Set ID in current and all previous frames  | Ctrl + Alt + Enter |
Utilities:
| Action                    | Keybind  |
| ------------------------- | :------: |
| Backup ground truth files | Ctrl + B |
| Reorder IDs in sequence   | Ctrl + R |
> [!NOTE]
> To activate the Enter keybinds, the focus must be in the ID field.
Credits
The original RADDet dataset, on which RADTrack is based, was created by Ao Zhang, Farzan Erlik Nowruzi, and Robert Laganiere from the University of Ottawa and Sensorcortek Inc.
Citation
Please use the following citation when using the dataset:
```bib
@inproceedings{RadarMOTR2024,
    author    = {Dell, Martin and Bradfisch, Wolfgang and Schober, Steffen and Klöck, Clemens},
    title     = {{RadarMOTR: Multi-Object Tracking with Transformers on Range-Doppler Maps}},
    booktitle = {2024 International Radar Conference (RADAR)},
    year      = {2024},
    pages     = {1-6},
    doi       = {10.1109/RADAR58436.2024.10994166}
}
```
Owner
- Name: Martin Dell
- Login: madeit07
- Kind: user
- Location: Germany
- Repositories: 1
- Profile: https://github.com/madeit07
Student in Applied Computer Science (M. Sc.) at Esslingen University of Applied Sciences
Citation (CITATION.cff)
```yaml
cff-version: 1.2.0
message: "If you use this dataset, please cite it as below."
preferred-citation:
  type: conference-paper
  authors:
    - family-names: "Dell"
      given-names: "Martin"
    - family-names: "Bradfisch"
      given-names: "Wolfgang"
    - family-names: "Schober"
      given-names: "Steffen"
    - family-names: "Klöck"
      given-names: "Clemens"
  title: "RadarMOTR: Multi-Object Tracking with Transformers on Range-Doppler Maps"
  year: 2024
  collection-title: "2024 International Radar Conference (RADAR)"
  doi: 10.1109/RADAR58436.2024.10994166
references:
  - authors:
      - family-names: "Zhang"
        given-names: "Ao"
      - family-names: "Nowruzi"
        given-names: "Farzan Erlik"
      - family-names: "Laganiere"
        given-names: "Robert"
    type: conference-paper
    title: "RADDet: Range-Azimuth-Doppler based Radar Object Detection for Dynamic Road Users"
    year: 2021
    collection-title: "2021 18th Conference on Robots and Vision (CRV)"
```
GitHub Events
Total
- Watch event: 2
- Push event: 2
Last Year
- Watch event: 2
- Push event: 2