https://github.com/clydemcqueen/ardusub_log_tools
A collection of log analysis tools for working with ArduSub vehicles
Science Score: 13.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (12.8%) to scientific vocabulary
Repository
Basic Info
Statistics
- Stars: 2
- Watchers: 5
- Forks: 1
- Open Issues: 8
- Releases: 0
Metadata Files
README.md
ArduSub Log Tools 
This is a collection of log analysis tools for working with ArduSub vehicles.
All tools support file globbing and recursion on Linux.
Examples:
~~~
tool.py --help
tool.py *.tlog
tool.py --recurse directory
tool.py --recurse .
~~~
Requirements
ardusub_log_tools requires Python 3.10 or 3.11. Other requirements are listed in requirements.txt.
A note on BAD_DATA messages
Some BlueOS-generated messages in QGC-generated tlog files may have bad CRC values. These messages will show up as type=BAD_DATA. See this discussion for the cause(s) and fix(es). A simple workaround is to set the `MAV_IGNORE_CRC` environment variable:
~~~
export MAV_IGNORE_CRC=1
show_types.py *.tlog
~~~
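The same workaround works from inside a Python script; pymavlink reads this environment variable when it validates message CRCs, so it must be set before any messages are parsed (a sketch based on the shell workaround above; the file name in the comment is hypothetical):

```python
import os

# Set MAV_IGNORE_CRC before pymavlink parses any messages; with it set,
# messages with bad CRCs decode normally instead of arriving as BAD_DATA.
os.environ["MAV_IGNORE_CRC"] = "1"

# from pymavlink import mavutil                      # import after the variable is set
# conn = mavutil.mavlink_connection("example.tlog")  # hypothetical file name
```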
Segments
Some of the tlog tools support --keep start_time,end_time,name options, a way to specify which parts of the
file you are interested in processing. Only the messages between start_time and end_time are processed; the rest
of the file is ignored.
The timestamps must be specified in Unix time (seconds since January 1st, 1970 UTC).
If you provide multiple tlog files they are logically concatenated, which allows a segment to span multiple files.
Tools that generate files will generate one file per segment, and the name of the segment will appear in the file name.
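Because --keep takes Unix timestamps, the easiest way to build a segment is to convert a known UTC wall-clock time with Python's standard library. A sketch (these values reproduce the transect1 segment used in the script below):

```python
from datetime import datetime, timezone

# Unix timestamps for a segment from 21:13:30 to 21:24:35 UTC on 2023-09-15.
start = int(datetime(2023, 9, 15, 21, 13, 30, tzinfo=timezone.utc).timestamp())
end = int(datetime(2023, 9, 15, 21, 24, 35, tzinfo=timezone.utc).timestamp())

# A --keep option is two timestamps and a name.
keep = f"--keep {start},{end},transect1"
print(keep)  # --keep 1694812410,1694813075,transect1
```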
The following bash script shows how 4 segments are processed from several tlog files:
~~~
#!/bin/bash

export SEGMENT1="--keep 1694812410,1694813075,transect1"
export SEGMENT2="--keep 1694813405,1694814090,transect2"
export SEGMENT3="--keep 1694814995,1694815740,transect3"
export SEGMENT4="--keep 1694816175,1694816945,transect4"

export SEGMENTS="$SEGMENT1 $SEGMENT2 $SEGMENT3 $SEGMENT4"

echo "Segments are:"
echo $SEGMENTS

echo "#############"
echo "Exploding (tlog_merge.py)"
echo "#############"
tlog_merge.py --types GLOBAL_POSITION_INT,GPS_GLOBAL_ORIGIN,VISION_POSITION_DELTA,LOCAL_POSITION_NED \
  --no-merge --explode --rate $SEGMENTS *.tlog

echo "#############"
echo "Making maps (tlog_map_maker)"
echo "#############"
tlog_map_maker.py $SEGMENTS *.tlog

echo "#############"
echo "Plotting position (plot_local_position)"
echo "#############"
plot_local_position.py $SEGMENTS *.tlog
~~~
Mapping and plotting
tlog_map_maker.py
~~~
$ tlog_map_maker.py --help
usage: tlog_map_maker.py [-h] [-r] [-k KEEP] [-v] [--lat LAT] [--lon LON] [--zoom ZOOM] [--types TYPES] [--hdop-max HDOP_MAX] path [path ...]

Read tlog files and build Leaflet (interactive HTML) maps from GPS coordinates.

By default, read these messages and draw them bottom-to-top:
    GPS_INPUT -- sensor data sent from ugps-extension to ArduSub, light grey line
    GPS_RAW_INT -- sensor data sent from ArduSub to QGC, slightly darker grey line
    GLOBAL_POSITION_INT -- the filtered position estimate, blue line

Supports segments.

positional arguments:
  path

options:
  -h, --help            show this help message and exit
  -r, --recurse         enter directories looking for files
  -k KEEP, --keep KEEP  process just these segments; a segment is 2 timestamps and a name, e.g., start,end,s1
  -v, --verbose         print a lot more information
  --lat LAT             center the map at this latitude, default is mean of all points
  --lon LON             center the map at this longitude, default is mean of all points
  --zoom ZOOM           initial zoom, default is 18
  --types TYPES         comma separated list of message types
  --hdop-max HDOP_MAX   reject GPS_INPUT messages where hdop exceeds this limit, default 100.0 (no limit)
~~~
map_maker.py
~~~
$ map_maker.py --help
usage: map_maker.py [-h] [-r] [-v] [--lat LAT] [--lon LON] [--zoom ZOOM] path [path ...]

Read csv and txt files and build Leaflet (interactive HTML) maps from GPS coordinates.

For csv files:
    Latitude column header should be 'gps.lat' or 'lat'
    Longitude column header should be 'gps.lon' or 'lon'

For txt files:
    Look for NMEA 0183 GGA messages of the form $[A-Z]+ at the end of a line of text

positional arguments:
  path

options:
  -h, --help     show this help message and exit
  -r, --recurse  enter directories looking for csv and txt files
  -v, --verbose  print a lot more information
  --lat LAT      center the map at this latitude, default is mean of all points
  --lon LON      center the map at this longitude, default is mean of all points
  --zoom ZOOM    initial zoom, default is 18
~~~
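The txt handling above (a GGA sentence at the end of a line) can be sketched with the standard library; the regex and field arithmetic below follow the NMEA 0183 GGA layout but are illustrative, not map_maker.py's actual code:

```python
import re

def gga_lat_lon(line: str):
    """Extract (lat, lon) in decimal degrees from a GGA sentence that ends a
    line of text; returns None if no GGA sentence is found. Assumes a fix is
    present (non-empty lat/lon fields)."""
    m = re.search(r"\$[A-Z]+GGA,[^*\s]*(\*[0-9A-Fa-f]{2})?$", line)
    if not m:
        return None
    fields = m.group(0).split(",")
    # GGA fields: 2 = lat ddmm.mmmm, 3 = N/S, 4 = lon dddmm.mmmm, 5 = E/W
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    if fields[5] == "W":
        lon = -lon
    return lat, lon
```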
plot_local_position.py
plot_local_position.py scans MAVLink (.tlog) files for LOCAL_POSITION_NED messages and plots the (x,y) positions. The plots are saved as PDF files.
~~~
$ plot_local_position.py --help
usage: plot_local_position.py [-h] [-r] [-k KEEP] path [path ...]

Look for LOCAL_POSITION_NED messages in tlog files, plot x and y, and write PDF files.

Supports segments.

positional arguments:
  path

options:
  -h, --help            show this help message and exit
  -r, --recurse         enter directories looking for files
  -k KEEP, --keep KEEP  process just these segments; a segment is 2 timestamps and a name, e.g., start,end,s1
~~~
Generating csv files
tlog_merge.py
Read MAVLink messages from a tlog file (telemetry log) and merge the messages into a single, wide csv file. The merge operation does a forward-fill (data is copied from the previous row), so the resulting merged csv file may be substantially larger than the sum of the per-type csv files.
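The forward-fill can be sketched in plain Python (a simplified illustration of the merge idea, not tlog_merge.py's implementation): rows from every per-type table are interleaved by timestamp, and each merged row repeats the most recent value seen for every column:

```python
def merge_ffill(tables: dict) -> list:
    """tables maps message type -> list of (timestamp, {field: value}) rows.
    Returns one wide table; missing columns are forward-filled from the
    previous row, so every row carries the latest value of every field."""
    events = sorted(
        ((ts, name, fields) for name, rows in tables.items() for ts, fields in rows),
        key=lambda e: e[0],
    )
    merged, last = [], {}
    for ts, name, fields in events:
        # Qualify field names by message type, as in the per-type csv files.
        last.update({f"{name}.{key}": val for key, val in fields.items()})
        merged.append({"timestamp": ts, **last})
    return merged
```

The merged table has one row per input message with every column repeated, which is why the merged csv can be much larger than the per-type files combined.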
This tool can also write multiple csv files, one per type, using the --explode option.
This example will open a tlog file, read in a set of tables that I find useful, calculate the data rates and report on any gaps in the data, and then write a series of csv files. Each csv file will correspond to a table type, a sysid and a compid, so you may see multiple csv files per table type:
~~~
$ tlog_merge.py --explode --no-merge --rate --split-source '2023-09-04 09-03-22.tlog'
Processing 1 files
Looking for these types: ['AHRS', 'AHRS2', 'ATTITUDE', 'BATTERY_STATUS', 'EKF_STATUS_REPORT', 'GLOBAL_POSITION_INT', 'GLOBAL_VISION_POSITION_ESTIMATE', 'GPS2_RAW', 'GPS_GLOBAL_ORIGIN', 'GPS_RAW_INT', 'HEARTBEAT', 'LOCAL_POSITION_NED', 'POWER_STATUS', 'RANGEFINDER', 'RAW_IMU', 'RC_CHANNELS', 'SCALED_IMU2',
'SCALED_PRESSURE', 'SCALED_PRESSURE2', 'SERVO_OUTPUT_RAW', 'SET_GPS_GLOBAL_ORIGIN', 'SYS_STATUS', 'SYSTEM_TIME', 'TIMESYNC', 'VFR_HUD', 'VISION_POSITION_DELTA']
Reading 2023-09-04 09-03-22.tlog
WIRE_PROTOCOL_VERSION 2.0
Parsing messages
...
$ ls *.csv
'2023-09-04 09-03-22_AHRS_1_1.csv'                '2023-09-04 09-03-22_HEARTBEAT_1_1.csv'           '2023-09-04 09-03-22_SERVO_OUTPUT_RAW_1_1.csv'
'2023-09-04 09-03-22_AHRS2_1_1.csv'               '2023-09-04 09-03-22_HEARTBEAT_255_190.csv'       '2023-09-04 09-03-22_SET_GPS_GLOBAL_ORIGIN_1_197.csv'
'2023-09-04 09-03-22_ATTITUDE_1_1.csv'            '2023-09-04 09-03-22_LOCAL_POSITION_NED_1_1.csv'  '2023-09-04 09-03-22_SYS_STATUS_1_1.csv'
'2023-09-04 09-03-22_BATTERY_STATUS_1_1.csv'      '2023-09-04 09-03-22_POWER_STATUS_1_1.csv'        '2023-09-04 09-03-22_SYSTEM_TIME_1_1.csv'
'2023-09-04 09-03-22_EKF_STATUS_REPORT_1_1.csv'   '2023-09-04 09-03-22_RANGEFINDER_1_1.csv'         '2023-09-04 09-03-22_SYSTEM_TIME_255_190.csv'
'2023-09-04 09-03-22_GLOBAL_POSITION_INT_1_1.csv' '2023-09-04 09-03-22_RAW_IMU_1_1.csv'             '2023-09-04 09-03-22_TIMESYNC_1_1.csv'
'2023-09-04 09-03-22_GPS_GLOBAL_ORIGIN_1_1.csv'   '2023-09-04 09-03-22_RC_CHANNELS_1_1.csv'         '2023-09-04 09-03-22_VFR_HUD_1_1.csv'
'2023-09-04 09-03-22_HEARTBEAT_1_194.csv'         '2023-09-04 09-03-22_SCALED_IMU2_1_1.csv'         '2023-09-04 09-03-22_VISION_POSITION_DELTA_1_197.csv'
'2023-09-04 09-03-22_HEARTBEAT_1_197.csv'         '2023-09-04 09-03-22_SCALED_PRESSURE_1_1.csv'
~~~
Parameters:
~~~
$ tlog_merge.py --help
usage: tlog_merge.py [-h] [-r] [-k KEEP] [-v] [--explode] [--no-merge] [--types TYPES] [--max-msgs MAX_MSGS] [--max-rows MAX_ROWS] [--rate] [--sysid SYSID] [--compid COMPID] [--split-source] [--system-time] [--surftrak] path [path ...]

Read MAVLink messages from a tlog file (telemetry log) and merge the messages into a single, wide csv file. The merge operation does a forward-fill (data is copied from the previous row), so the resulting merged csv file may be substantially larger than the sum of the per-type csv files.

HEARTBEAT.mode is a combination of HEARTBEAT.base_mode and HEARTBEAT.custom_mode with these values:
    -10  disarmed
      0  armed, stabilize
      1  armed, acro
      2  armed, alt_hold
      3  armed, auto
      4  armed, guided
      7  armed, circle
      9  armed, surface
     16  armed, poshold
     19  armed, manual
     20  armed, motor detect
     21  armed, rng_hold

Supports segments.

positional arguments:
  path

options:
  -h, --help            show this help message and exit
  -r, --recurse         enter directories looking for files
  -k KEEP, --keep KEEP  process just these segments; a segment is 2 timestamps and a name, e.g., start,end,s1
  -v, --verbose         print a lot more information
  --explode             write a csv file for each message type
  --no-merge            do not merge tables, useful if you also select --explode
  --types TYPES         comma separated list of message types, the default is a set of useful types
  --max-msgs MAX_MSGS   stop after processing this number of messages (default 500K)
  --max-rows MAX_ROWS   stop if the merged table exceeds this number of rows (default 500K)
  --rate                calculate rate for each message type
  --sysid SYSID         select source system id (default is all source systems)
  --compid COMPID       select source component id (default is all source components)
  --split-source        split messages by source (sysid, compid)
  --system-time         experimental: use ArduSub SYSTEM_TIME.time_boot_ms rather than QGC timestamp
  --surftrak            experimental: surftrak-specific analysis, see code
~~~
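The HEARTBEAT.mode encoding in the help text can be sketched as follows; the armed bit is MAVLink's standard MAV_MODE_FLAG_SAFETY_ARMED (0x80), and the value table is copied from the help text. This is one reading of the described encoding, not the tool's exact code:

```python
MAV_MODE_FLAG_SAFETY_ARMED = 0x80  # standard MAVLink base_mode armed bit

# ArduSub custom_mode values, as listed in the tlog_merge.py help text.
ARDUSUB_MODES = {
    0: "stabilize", 1: "acro", 2: "alt_hold", 3: "auto", 4: "guided",
    7: "circle", 9: "surface", 16: "poshold", 19: "manual",
    20: "motor detect", 21: "rng_hold",
}

def heartbeat_mode(base_mode: int, custom_mode: int) -> int:
    """Combine base_mode and custom_mode as the help text describes:
    -10 when disarmed, otherwise the ArduSub custom_mode number."""
    if not base_mode & MAV_MODE_FLAG_SAFETY_ARMED:
        return -10
    return custom_mode
```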
BIN_merge.py
Similar to tlog_merge.py, BIN_merge.py opens Dataflash (.BIN) files and merges the messages into a single, wide csv file. The merge operation does a forward-fill (data is copied from the previous row), so the resulting merged csv file may be substantially larger than the sum of the per-type csv files.
Parameters:
~~~
$ BIN_merge.py -h
usage: BIN_merge.py [-h] [-r] [-v] [--explode] [--no-merge] [--types TYPES] [--max-msgs MAX_MSGS] [--max-rows MAX_ROWS] path [path ...]

Read ArduSub dataflash messages from a BIN file and merge the messages into a single, wide csv file. The merge operation does a forward-fill (data is copied from the previous row), so the resulting merged csv file may be substantially larger than the sum of the per-type csv files.

BIN_merge.py can also write multiple csv files, one per type, using the --explode option.

You can examine the contents of a single table using the --explode, --no-merge and --types options:
    BIN_merge.py --explode --no-merge --types GPS 000011.BIN

positional arguments:
  path

options:
  -h, --help           show this help message and exit
  -r, --recurse        enter directories looking for BIN files
  -v, --verbose        print a lot more information
  --explode            write a csv file for each message type
  --no-merge           do not merge tables, useful if you also select --explode
  --types TYPES        comma separated list of message types, the default is a small set of useful types
  --max-msgs MAX_MSGS  stop after processing this number of messages (default 500K)
  --max-rows MAX_ROWS  stop if the merged table exceeds this number of rows (default 500K)
  --raw                show all records; default is to drop BARO records where id==0
~~~
General tools
show_types.py
show_types opens MAVLink (.tlog) and Dataflash (.BIN) files and counts records by type.
Sample output:
~~~
$ show_types.py 100.BIN
Processing 1 files
Reading 100.BIN
824 AHR2 Backup AHRS data
3 ARM Arming status changes
824 ATT Canonical vehicle attitude
1652 BARO Gathered Barometer data
826 BAT Gathered battery data
824 CTRL Attitude Control oscillation monitor diagnostics
826 CTUN Control Tuning information
82 DSF Onboard logging statistics
83 DU32 Generic 32-bit-unsigned-integer storage
3 ERR Specifically coded error messages
6 EV Specifically coded event messages
153 FMT Message defining the format of messages in this file
153 FMTU Message defining units and multipliers used for fields of other messages
826 FTN Filter Tuning Message - per motor
193 GPA GPS accuracy information
193 GPS Information received from GNSS systems attached to the autopilot
4124 IMU Inertial Measurement Unit data
79 IOMC TODO -- update tool
826 MAG Information received from compasses
246 MAV GCS MAVLink link statistics
4 MODE vehicle control mode information
824 MOTB Motor mixer information
68 MSG Textual messages
14 MULT Message mapping from single character to numeric multiplier
2 ORGN Vehicle navigation origin or other notable position
902 PARM parameter value
7 PM autopilot system performance and general data dumping ground
479 POS Canonical vehicle position [lat/lon/alt]
826 POWR TODO -- update tool
29 PSCD Position Control Down
824 RATE Desired and achieved vehicle attitude rates. Not logged in Fixed Wing Plane modes.
824 RCI2 (More) RC input channels to vehicle
824 RCIN RC input channels to vehicle
824 RCOU Servo channel output values 1 to 14
34 UNIT Message mapping from single character to SI unit
1648 VIBE Processed (acceleration) vibration information
1648 XKF1 EKF3 estimator outputs
1648 XKF2 EKF3 estimator secondary outputs
1648 XKF3 EKF3 innovations
1648 XKF4 EKF3 variances
824 XKF5 EKF3 Sensor innovations (primary core) and general dumping ground
1648 XKFS EKF3 sensor selection
1648 XKQ EKF3 quaternion defining the rotation from NED to XYZ (autopilot) axes
16 XKT EKF3 timing information
164 XKV1 EKF3 State variances (primary core)
164 XKV2 more EKF3 State Variances (primary core)
542 XKY0 EKF Yaw Estimator States
542 XKY1 EKF Yaw Estimator Innovations
~~~
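The per-type counting that show_types.py reports boils down to a collections.Counter keyed on the message type name. A minimal sketch (with pymavlink, a project dependency, the names would come from msg.get_type() on each record read from a log connection; the list here is illustrative):

```python
from collections import Counter

def count_types(type_names) -> Counter:
    """Count log records by message type name, as show_types.py reports."""
    return Counter(type_names)

# With pymavlink, type_names would be produced roughly like this (sketch):
#   conn = mavutil.mavlink_connection("100.BIN")
#   ... msg.get_type() for each message read from conn ...
counts = count_types(["ATT", "IMU", "IMU", "GPS", "IMU", "BARO", "BARO"])
```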
Parameters:
~~~
$ show_types.py --help
usage: show_types.py [-h] [-r] path [path ...]

Read messages from tlog (telemetry) and BIN (dataflash) logs and report on the message types found.

positional arguments:
  path

options:
  -h, --help     show this help message and exit
  -r, --recurse  enter directories looking for tlog and BIN files
~~~
tlog_info.py
Read tlog files and report on a few interesting things.
~~~
$ tlog_info.py --help
usage: tlog_info.py [-h] [-r] [-k KEEP] path [path ...]

Read MAVLink messages from a tlog file (telemetry log) and report on a few interesting things.

Supports segments.

positional arguments:
  path

options:
  -h, --help            show this help message and exit
  -r, --recurse         enter directories looking for files
  -k KEEP, --keep KEEP  process just these segments; a segment is 2 timestamps and a name, e.g., start,end,s1
~~~
BIN_info.py
BIN_info opens Dataflash (.BIN) files and reports on a few interesting things.
Sample output:
~~~
$ BIN_info.py *.BIN
Processing 2 files
Results for 100.BIN
193 GPS records, gps_week is always 0, no datetime information
List of messages, with counts:
     3  ArduSub V4.1.0 (f2af3c7e)
     3  ChibiOS: 93e6e03d
     1  EKF3 IMU0 stopped aiding
     1  EKF3 IMU1 stopped aiding
     2  Frame: VECTORED_6DOF
     1  GPS 1: specified as MAV
     2  IMU0: fast sampling enabled 8.0kHz/1.0kHz
     3  Lost manual control
    44  MYGCS: 255, heartbeat lost
     1  New mission
     1  Param space used: 951/3840
     3  Pixhawk1 0049001F 34395111 30323935
     1  RC Protocol: None
     2  RCOut: PWM:1-12

Results for 102.BIN
6111 GPS records, gps_week is always 0, no datetime information
List of messages, with counts:
     2  #Gain is 30%
     4  #Gain is 40%
     3  #Gain is 50%
     2  #Gain is 60%
     1  #Gain is 70%
     1  ArduSub V4.1.0 (f2af3c7e)
     1  ChibiOS: 93e6e03d
     1  EKF3 IMU0 stopped aiding
     1  EKF3 IMU1 stopped aiding
     1  EKF3 lane switch 1
     1  GPS 1: specified as MAV
     1  Lost manual control
     1  MYGCS: 255, heartbeat lost
     1  New mission
     1  Param space used: 951/3840
     1  Pixhawk1 0049001F 34395111 30323935
     1  RC Protocol: None
~~~
Parameters:
~~~
$ BIN_info.py -h
usage: BIN_info.py [-h] [-r] path [path ...]

Read dataflash messages from a BIN file and report on a few interesting things.

positional arguments:
  path

options:
  -h, --help     show this help message and exit
  -r, --recurse  enter directories looking for BIN files
~~~
tlog_param.py
Generate QGC-compatible parameter files from tlog files.
~~~
$ tlog_param.py --help
usage: tlog_param.py [-h] [-r] [-c] path [path ...]

Read MAVLink PARAM_VALUE messages from a tlog file (telemetry log), reconstruct the parameter state of a vehicle, and write the parameters to a QGC-compatible params file.

positional arguments:
  path

options:
  -h, --help     show this help message and exit
  -r, --recurse  enter directories looking for tlog files
  -c, --changes  only show changes across files, do not write *.params files
~~~
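Reconstructing the parameter state from PARAM_VALUE messages amounts to keeping the last value seen for each parameter name. A minimal sketch (the input tuples, the fixed ptype, and the output column layout are illustrative assumptions, not the tool's exact code):

```python
def reconstruct_params(param_values):
    """param_values: iterable of (param_id, param_value) pairs taken from
    PARAM_VALUE messages in log order. The last value seen for each name
    wins, which reconstructs the vehicle's final parameter state."""
    params = {}
    for name, value in param_values:
        params[name] = value
    return params

def to_params_lines(params, sysid=1, compid=1, ptype=9):
    """Render tab-separated lines roughly in the shape of a QGC .params file
    (sysid, compid, name, value, type). The exact column layout and the
    fixed ptype here are assumptions for illustration."""
    return [f"{sysid}\t{compid}\t{name}\t{value}\t{ptype}"
            for name, value in sorted(params.items())]
```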
Tools for working with Waterlinked UGPS systems
wl_ugps_logger.py
~~~
$ wl_ugps_logger.py -h
usage: wl_ugps_logger.py [-h] [--url URL] [--filtered] [--raw] [--locator] [--g2] [--all] [--rate RATE]

Get position data from the Water Linked UGPS API and write it to one or more csv files.

To run in the field, capturing all outputs:
    wl_ugps_logger.py --all

To test with the demo server, capturing all outputs:
    wl_ugps_logger.py --url https://demo.waterlinked.com --all

options:
  -h, --help   show this help message and exit
  --url URL    URL of UGPS topside unit
  --filtered   log position/acoustic/filtered
  --raw        log position/acoustic/raw
  --locator    log position/global (locator)
  --g2         log position/master (G2 box)
  --all        log everything
  --rate RATE  polling rate
~~~
wl_ugps_process.py
~~~
$ wl_ugps_process.py -h
usage: wl_ugps_process.py [-h] [-r] [--lat LAT] [--lon LON] [--heading HEADING] [--zoom ZOOM] path [path ...]

Generate a folium map from a log generated by wl_ugps_logger.py.

positional arguments:
  path

options:
  -h, --help         show this help message and exit
  -r, --recurse      enter directories looking for csv files
  --lat LAT          WL UGPS antenna latitude
  --lon LON          WL UGPS antenna longitude
  --heading HEADING  WL UGPS antenna heading
  --zoom ZOOM        initial zoom, default is 18
~~~
MAVLink debugging tools
These might be useful when debugging tlog files or pymavlink.
- tlog_scan.py: report on any pymavlink crashes
- tlog_bad_data.py: report on BAD_DATA messages
- tlog_backwards.py: read the timestamps and note when time appears to go backwards
Timestamp notes
For a QGC-generated tlog file, msg._timestamp is the UNIX system time when the message was logged, to the nearest ms.
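A msg._timestamp value therefore converts directly with the standard library; the sample value below is one of the segment timestamps used earlier in this README, with an illustrative millisecond fraction added:

```python
from datetime import datetime, timezone

# msg._timestamp in a QGC tlog is Unix time in seconds (float, ms precision).
stamp = 1694812410.123
when = datetime.fromtimestamp(stamp, tz=timezone.utc)
print(when.isoformat(timespec="milliseconds"))  # 2023-09-15T21:13:30.123+00:00
```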
References:
* QGC log file format
* QGC code
* pymavlink code
Other tools
MAVExplorer.py
A terrific tool. Some nifty things it can do with tlog files:
* map GPS_INPUT GPS_RAW_INT GLOBAL_POSITION_INT is basically the same as tlog_map_maker.py

Nifty things it can do with BIN files:
* map GPS POS will show a map comparing the GPS inputs to EKF outputs
References
- https://github.com/waterlinked/examples
- https://github.com/tridge/log_analysis
Owner
- Name: Clyde McQueen
- Login: clydemcqueen
- Kind: user
- Location: Seattle
- Repositories: 14
- Profile: https://github.com/clydemcqueen
Xoogler ramping up on all things computer vision, ROS and maritime robotics.
GitHub Events
Total
- Issues event: 1
- Delete event: 1
- Push event: 8
- Pull request event: 2
Last Year
- Issues event: 1
- Delete event: 1
- Push event: 8
- Pull request event: 2
Issues and Pull Requests
Last synced: 7 months ago
All Time
- Total issues: 1
- Total pull requests: 1
- Average time to close issues: N/A
- Average time to close pull requests: less than a minute
- Total issue authors: 1
- Total pull request authors: 1
- Average comments per issue: 0.0
- Average comments per pull request: 0.0
- Merged pull requests: 1
- Bot issues: 0
- Bot pull requests: 0
Past Year
- Issues: 1
- Pull requests: 1
- Average time to close issues: N/A
- Average time to close pull requests: less than a minute
- Issue authors: 1
- Pull request authors: 1
- Average comments per issue: 0.0
- Average comments per pull request: 0.0
- Merged pull requests: 1
- Bot issues: 0
- Bot pull requests: 0
Top Authors
Issue Authors
- clydemcqueen (1)
Pull Request Authors
- clydemcqueen (1)
Top Labels
Issue Labels
Pull Request Labels
Dependencies
- actions/checkout v3 composite
- actions/setup-python v4 composite
- folium *
- matplotlib *
- numpy *
- pandas *
- pymavlink *
- pynmea2 *
- pytest *
- requests *