wildwing
Autonomous UAS Software for In Situ Imageomics Missions
Science Score: 75.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: Found CITATION.cff file
- ✓ codemeta.json file: Found codemeta.json file
- ✓ .zenodo.json file: Found .zenodo.json file
- ✓ DOI references: Found 9 DOI reference(s) in README
- ✓ Academic publication links: Links to zenodo.org
- ○ Academic email domains
- ✓ Institutional organization owner: Organization imageomics has institutional domain (imageomics.osu.edu)
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: Low similarity (12.2%) to scientific vocabulary
Keywords
Scientific Fields
Repository
Autonomous UAS Software for In Situ Imageomics Missions
Basic Info
- Host: GitHub
- Owner: Imageomics
- License: MIT
- Language: HTML
- Default Branch: main
- Homepage: https://imageomics.github.io/wildwing/
- Size: 26.3 MB
Statistics
- Stars: 2
- Watchers: 4
- Forks: 0
- Open Issues: 14
- Releases: 1
Topics
Metadata Files
README.md
wildwing 
WildWing: An open-source, autonomous and affordable UAS for animal behaviour video monitoring

This repo contains software to autonomously track group-living animals using Parrot Anafi drones.
Figure: Overview of the WildWing framework.
The WildWing Unmanned Aerial System (UAS) consists of three components: a Parrot Anafi drone, open-source control software, and a laptop equipped with a GPU. The control software, hosted on the laptop, connects the drone to the autonomous navigation policy and lets users monitor the system live during deployment. The navigation policy analyzes video frames with computer vision models and determines the next commands to send to the drone.
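As a rough illustration of this frame-to-command loop, the sketch below shows one way such a policy could be structured. It is a minimal sketch, assuming the ultralytics YOLO package; the function name, model weights, and thresholds are illustrative placeholders, not the actual navigation.py implementation.

```python
# Hypothetical sketch of a frame-to-command navigation policy (illustrative only;
# see navigation.py for the real implementation).
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # placeholder detection model

def next_command(frame, xdistnosubject=3.0, ydist=2.0):
    """Return a (dx, dy, dz) move in metres based on where a subject sits in the frame."""
    boxes = model(frame)[0].boxes
    if len(boxes) == 0:
        return (xdistnosubject, 0.0, 0.0)      # no subject detected: move forward to search
    x1, _, x2, _ = boxes.xyxy[0].tolist()      # first detection, pixel coordinates
    cx = (x1 + x2) / 2 / frame.shape[1]        # horizontal centre of the box, 0..1
    if cx < 0.4:                               # subject drifting left: strafe left
        return (0.0, -ydist, 0.0)
    if cx > 0.6:                               # subject drifting right: strafe right
        return (0.0, ydist, 0.0)
    return (0.0, 0.0, 0.0)                     # subject roughly centred: hold position
```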
Paper and Dataset
Read our paper here and find the dataset here.
Instructions for use
See the WildWing wiki for detailed instructions on how to deploy the WildWing system.
Hardware Requirements
This tool requires a Parrot Anafi drone and its controller, and a laptop running Ubuntu 22.04.4 on x86_64 architecture.
Software Requirements
This tool requires a laptop running Ubuntu 22.04.4 with VLC media player and a text editor, such as nano, emacs, or vi. Visual Studio Code may also be used.
- See requirements.txt for required packages.
- Details on the control software here: SoftwarePilot
- Optional: a smartphone with the FreeFlight 6 app to connect the drone and controller.
Step 1: Set-up hardware
- On the laptop, create a conda environment from the requirements.txt file by running the following command. Note: you only need to create this conda environment once.

  `conda create --name wildwing --file requirements.txt`

- Connect the drone and controller. Power on the drone and the Parrot Skycontroller. Plug the drone and the controller together with a USB-A (controller) to USB-C (drone) cable. The LED light on the controller will turn blue once connected, and you can then unplug the controller from the drone. Optional: you may also follow the instructions in the FreeFlight app to connect the drone and controller.
- Plug the controller into the laptop using a USB-C cable.
- Using VLC media player, connect to the drone live stream: from the “Media” menu of VLC, select “Open network stream” and enter “rtsp://192.168.53.1/live” in the Network URL field. (A quick programmatic check of the same stream is sketched below.)
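The documented stream URL can also be checked from a script. The snippet below is an optional sanity check, not part of the official setup, and assumes opencv-python is installed.

```python
# Optional sanity check of the drone live stream (assumes opencv-python; the official
# WildWing instructions use VLC for monitoring).
import cv2

cap = cv2.VideoCapture("rtsp://192.168.53.1/live")
ok, frame = cap.read()
print("stream reachable:", ok, "| frame shape:", frame.shape if ok else None)
cap.release()
```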
Step 2: Initialize software parameters
Initialize the following parameters in the Python scripts. You can use these parameters to customize the mission for specific weather conditions, species, and habitats (illustrative values are sketched after this list).
- controller.py
  - DURATION: number of sections of the autonomous tracking mission to execute
- navigation.py
  - xdist: move +/- X meters in the forward/backward plane
  - xdistnosubject: move +/- X meters forward if no subject is detected
  - ydist: move +/- X meters in the left/right plane
  - zdist: move +/- X meters in the up/down plane
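The values below are purely illustrative defaults, assuming the parameters are plain module-level variables; check controller.py and navigation.py for the actual defaults and layout.

```python
# controller.py (illustrative value only)
DURATION = 10          # number of sections of the autonomous tracking mission to execute

# navigation.py (illustrative values only)
xdist = 2.0            # move +/- X meters in the forward/backward plane
xdistnosubject = 3.0   # move X meters forward if no subject is detected
ydist = 2.0            # move +/- X meters in the left/right plane
zdist = 1.0            # move +/- X meters in the up/down plane
```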
Step 3: Launch drone
- Place the drone in an area that is clear of obstructions.

Execute launch.sh from the command line. This will launch the drone and initialize the log and telemetry files that record the mission data.

`./launch.sh`

Optional: manually maneuver the drone to a higher altitude using the hand-held controller.
Step 4: Monitor the system
- Monitor the drone's PoV using the VLC livestream
- Monitor the YOLO model output and telemetry data in /missions/missionrecordYYYYMMDD_HHMMSS/
- Check logs in /log/outputsYYYYMMDDHHMMSS.log
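For convenience during monitoring, the newest mission directory and log file can be located with a short helper. The path patterns below follow the names above and assume the mission and log folders sit under the current working directory.

```python
# Locate the newest mission outputs and log file (path patterns follow the README;
# adjust the base paths if your layout differs).
from pathlib import Path

missions = sorted(Path("missions").glob("missionrecord*"))
logs = sorted(Path("log").glob("outputs*.log"))
print("latest mission outputs:", missions[-1] if missions else "none yet")
print("latest log file:", logs[-1] if logs else "none yet")
```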
Step 5: End mission
Once the mission duration is complete, you may continue the autonomous tracking mission or land the drone using the remote controller. To continue the mission without first landing, comment out the takeoff line in controller.py, shown below, and save the file. Then run launch.sh from the terminal to start the new mission.
```
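# Comment out this takeoff call when continuing a mission that is already airborne.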
drone.piloting.takeoff()
```
Step 6: Analyze video data
The control software saves the video recordings, telemetry data, and YOLO outputs for each mission. See the WildWing deployment data on Zenodo for example outputs.
To automatically label video data with behavior, we recommend using KABR tools.
To analyze the telemetry, use the data analysis notebook.
Figure: Mapped mission telemetry.
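As a hedged example of that kind of analysis, the snippet below loads a mission's telemetry with pandas. The file name, column layout, and mission directory are assumptions; consult the data analysis notebook and the Zenodo deployment data for the actual schema.

```python
# Minimal telemetry-loading sketch (file name and columns are assumptions; see the
# data analysis notebook for the actual schema).
import pandas as pd

mission_dir = "missions/missionrecordYYYYMMDD_HHMMSS"   # replace with a real mission folder
telemetry = pd.read_csv(f"{mission_dir}/telemetry.csv")
print(telemetry.head())
print(telemetry.describe())
```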
Acknowledgements
This work was supported by the Imageomics Institute, which is funded by the US National Science Foundation's Harnessing the Data Revolution (HDR) program under Award #2118240 (Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning). Additional support was also provided by the AI Institute for Intelligent Cyberinfrastructure with Computational Learning in the Environment (ICICLE), which is funded by the US National Science Foundation under Award #2112606. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
The data was gathered at The Wilds, a private, non-profit conservation center located on nearly 10,000 acres of reclaimed coal-mine land in southeastern Ohio. The Wilds is home to rare and endangered species from around the world living in natural, open-range habitats. Data collection was conducted with the approval of The Wilds Science Committee.
Owner
- Name: Imageomics Institute
- Login: Imageomics
- Kind: organization
- Website: https://imageomics.osu.edu
- Twitter: imageomics
- Repositories: 4
- Profile: https://github.com/Imageomics
Citation (CITATION.cff)
cff-version: 1.0.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: Jenna
    given-names: Kline
    orcid: https://orcid.org/0009-0006-7301-5774
  - family-names: Irizarry
    given-names: Kevyn
  - family-names: Zhong
    given-names: Alison
title: "wildwing"
version: 1.0.0
doi: 10.5281/zenodo.1234
date-released: 2025-02-20
url: "https://github.com/Imageomics/wildwing"
GitHub Events
Total
- Create event: 4
- Issues event: 1
- Release event: 1
- Watch event: 4
- Delete event: 1
- Issue comment event: 1
- Member event: 1
- Public event: 1
- Push event: 3
- Pull request event: 4
- Fork event: 1
Last Year
- Create event: 4
- Issues event: 1
- Release event: 1
- Watch event: 4
- Delete event: 1
- Issue comment event: 1
- Member event: 1
- Public event: 1
- Push event: 3
- Pull request event: 4
- Fork event: 1
Issues and Pull Requests
Last synced: 4 months ago
All Time
- Total issues: 1
- Total pull requests: 3
- Average time to close issues: 5 months
- Average time to close pull requests: N/A
- Total issue authors: 1
- Total pull request authors: 1
- Average comments per issue: 0.0
- Average comments per pull request: 0.0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 3
Past Year
- Issues: 1
- Pull requests: 3
- Average time to close issues: 5 months
- Average time to close pull requests: N/A
- Issue authors: 1
- Pull request authors: 1
- Average comments per issue: 0.0
- Average comments per pull request: 0.0
- Merged pull requests: 0
- Bot issues: 0
- Bot pull requests: 3
Top Authors
Issue Authors
- jennamk14 (1)
Pull Request Authors
- dependabot[bot] (3)