Science Score: 44.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ✓ CITATION.cff file: found CITATION.cff file
- ✓ codemeta.json file: found codemeta.json file
- ✓ .zenodo.json file: found .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (16.4%) to scientific vocabulary
Keywords
Repository
A mirror of https://gitlab.com/birder/birder
Basic Info
Statistics
- Stars: 8
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Releases: 0
Topics
Metadata Files
README.md
Birder
An open-source computer vision framework for wildlife image analysis, featuring state-of-the-art models for species classification and detection.
Introduction
Birder is an open-source computer vision framework designed for wildlife imagery analysis, offering robust classification and detection capabilities for various species. While initially developed with a focus on avian species, the framework's architecture and methodologies are applicable to a wide range of wildlife computer vision tasks. This project leverages deep neural networks to provide models that can handle real-world data challenges in natural environments.
For comprehensive documentation, tutorials, and more, visit the main documentation at docs/README.md.
The project features:
- A diverse collection of classification and detection models
- Support for self-supervised pre-training
- Knowledge distillation training (teacher-student); a minimal sketch follows this list
- Custom utilities and data augmentation techniques
- Comprehensive training scripts
- Advanced error analysis tools
- Extensive documentation and tutorials (hopefully...)
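As a concrete illustration of the teacher-student distillation mentioned in the list above, here is a minimal sketch in plain PyTorch. It is not Birder's training code; the torchvision architectures, temperature, and loss weighting are illustrative assumptions.

```python
# Minimal teacher-student distillation step in plain PyTorch.
# Illustrative only; this is not Birder's implementation.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18, resnet50

num_classes = 10
teacher = resnet50(weights=None, num_classes=num_classes).eval()  # stands in for a trained teacher
student = resnet18(weights=None, num_classes=num_classes)
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-4)

temperature = 4.0  # softens the teacher's output distribution (assumed value)
alpha = 0.5        # balance between distillation and hard-label loss (assumed value)

def distillation_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)

    # KL divergence between softened distributions, scaled by T^2 as in Hinton et al.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature**2
    hard_loss = F.cross_entropy(student_logits, labels)
    loss = alpha * soft_loss + (1.0 - alpha) * hard_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# One toy step on random data, just to show the shapes involved
distillation_step(torch.randn(4, 3, 224, 224), torch.randint(0, num_classes, (4,)))
```

The softened-logits KL term transfers the teacher's class similarities, while the cross-entropy term keeps the student anchored to the ground-truth labels.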
Unlike projects that aim to reproduce ImageNet training results from common papers, Birder is tailored specifically for practical applications in wildlife monitoring, conservation efforts, ecological research, and nature photography.
As Ross Wightman eloquently stated in the timm README:
> The work of many others is present here. I've tried to make sure all source material is acknowledged via links to github, arXiv papers, etc. in the README, documentation, and code docstrings. Please let me know if I missed anything.
The same principle applies to Birder. We stand on the shoulders of giants in the fields of computer vision, machine learning, and ecology. We've made every effort to acknowledge and credit the work that has influenced and contributed to this project. If you believe we've missed any attributions, please let us know by opening an issue.
Setup
Ensure PyTorch 2.7 is installed on your system
Install the latest Birder version:
```sh
pip install birder
```
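To confirm the environment is set up as expected, a quick version check can help. The sketch below only assumes that PyTorch and the birder package were installed with pip as shown above.

```python
# Quick sanity check after `pip install birder`
from importlib.metadata import version

import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("Birder:", version("birder"))  # PyPI distribution name
```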
For detailed installation options, including source installation, refer to our Setup Guide.
Getting Started

Check out the Birder Colab notebook for an interactive tutorial.
If you prefer a local setup, follow the installation instructions below.
Once Birder is installed, you can start exploring its capabilities.
Birder provides pre-trained models that you can download using the download-model tool.
To download a model, use the following command:
```sh
python -m birder.tools download-model mvit_v2_t_il-all
```
Create a data directory and download an example image:
```sh
mkdir data
wget https://huggingface.co/spaces/birder-project/birder-image-classification/resolve/main/Eurasian%20teal.jpeg -O data/img_001.jpeg
```
To classify bird images, use the birder-predict script as follows:
```sh
birder-predict -n mvit_v2_t -t il-all --show data/img_001.jpeg
```
For more options and detailed usage of the prediction tool, run:
```sh
birder-predict --help
```
For more detailed usage instructions and examples, please refer to our documentation.
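Under the hood, a classification run follows the usual load-preprocess-forward-softmax pipeline. The sketch below shows that generic pattern with a stock torchvision model rather than Birder's own API, so treat it purely as an illustration of the steps a prediction run performs.

```python
# Generic image-classification pipeline in plain PyTorch/torchvision.
# This is NOT Birder's API; it only illustrates the steps a prediction run performs.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.models import resnet18

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = resnet18(weights="IMAGENET1K_V1").eval()  # stand-in for a downloaded classifier

image = Image.open("data/img_001.jpeg").convert("RGB")  # the example image from above
batch = preprocess(image).unsqueeze(0)                   # add a batch dimension

with torch.no_grad():
    probabilities = torch.softmax(model(batch), dim=1)

top_prob, top_class = probabilities[0].topk(3)
print(list(zip(top_class.tolist(), top_prob.tolist())))
```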
Pre-trained Models
Birder provides a comprehensive suite of pre-trained models for wildlife species classification, with current models specialized for avian species recognition.
To explore the full range of available pre-trained models, use the list-models tool:
```sh
python -m birder.tools list-models --pretrained
```
This command displays a catalog of models ready for download.
Model Nomenclature
The naming convention for Birder models encapsulates key information about their architecture and training approach.
Architecture: The first part of the model name indicates the core neural network structure (e.g., MobileNet, ResNet).
Training indicators:
- intermediate: Signifies models that underwent a two-stage training process, beginning with a large-scale weakly labeled dataset before fine-tuning on the primary dataset
- mim: Indicates models that leveraged self-supervised pre-training techniques, primarily Masked Autoencoder (MAE), prior to supervised training
Other tags:
- quantized: Model that has been quantized to reduce the computational and memory costs of running inference
- reparameterized: Model that has been restructured to simplify its architecture for optimized inference performance
Epoch Number (optional): The last part of the model name may include an underscore followed by a number (e.g., 0, 200), which represents the epoch.
For instance, mnasnet_1_0_intermediate_300 represents a MnasNet model with an alpha value of 1.0 that underwent intermediate training, with weights taken from epoch 300.
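Since the convention is regular, model names can be taken apart mechanically. The helper below is only an illustration based on the description in this section, not a Birder utility, and real checkpoint names may deviate from it.

```python
# Illustrative parser for the naming convention described above.
# Not a Birder utility; the split rules simply follow this README section.
TAGS = {"intermediate", "mim", "quantized", "reparameterized"}

def parse_model_name(name: str) -> dict:
    parts = name.split("_")

    # Optional trailing epoch number
    epoch = int(parts.pop()) if parts[-1].isdigit() else None

    # Collect known training/other tags from the end of the name
    tags = []
    while parts and parts[-1] in TAGS:
        tags.append(parts.pop())
    tags.reverse()

    return {"architecture": "_".join(parts), "tags": tags, "epoch": epoch}

print(parse_model_name("mvit_v2_t"))
print(parse_model_name("mnasnet_1_0_intermediate_300"))
```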
Self-supervised Image Pre-training
Our pre-training process utilizes a diverse collection of image datasets, combining general imagery with wildlife-specific content. This approach allows our models to learn rich, general-purpose visual representations before fine-tuning on specific classification tasks.
The pre-training dataset is composed of a mix of general images and bird-specific imagery to improve downstream performance on bird classification tasks.
For detailed information about these datasets, including descriptions, citations, and licensing details, please refer to docs/public_datasets.md.
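For readers unfamiliar with the masked-image-modeling approach behind the mim-tagged models, the sketch below shows the core MAE idea in plain PyTorch: split the image into patches, encode only a small random subset, and reconstruct the masked patches. It is a conceptual sketch, not Birder's pre-training code; the patch size, mask ratio, and the tiny linear encoder/decoder are stand-ins.

```python
# Conceptual sketch of MAE-style masked image modeling (not Birder's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

patch_size, mask_ratio, dim = 16, 0.75, 128
img = torch.randn(1, 3, 224, 224)

# Patchify: (1, 3, 224, 224) -> (1, 196, 768)
patches = img.unfold(2, patch_size, patch_size).unfold(3, patch_size, patch_size)
patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, -1, 3 * patch_size * patch_size)
n = patches.shape[1]
n_keep = int(n * (1 - mask_ratio))

# Random masking: only a small subset of patches stays visible to the encoder
perm = torch.randperm(n)
visible_idx, masked_idx = perm[:n_keep], perm[n_keep:]

# Tiny stand-ins for the real transformer encoder/decoder
encoder = nn.Sequential(nn.Linear(patches.shape[-1], dim), nn.GELU(), nn.Linear(dim, dim))
decoder = nn.Linear(dim, patches.shape[-1])
mask_token = nn.Parameter(torch.zeros(1, 1, dim))

latent = encoder(patches[:, visible_idx])  # encode only the visible patches

# Re-assemble the full sequence: encoded visible tokens plus learned mask tokens
full = torch.zeros(1, n, dim)
full[:, visible_idx] = latent
full[:, masked_idx] = mask_token.expand(1, masked_idx.numel(), dim)

# Reconstruct pixel values and compute the loss only on the masked patches
pred = decoder(full)
loss = F.mse_loss(pred[:, masked_idx], patches[:, masked_idx])
loss.backward()
```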
Detection
Detection features are currently under development and will be available in future releases.
Project Status and Contributions
Birder is currently a personal project in active development. As the sole developer, I am focused on building and refining the core functionalities of the framework. At this time, I am not actively seeking external contributors.
However, I greatly appreciate the interest and support from the community. If you have suggestions, find bugs, or want to provide feedback, please feel free to:
- Open an issue in the project's issue tracker
- Use the project and share your experiences
- Star the repository if you find it useful
While I may not be able to incorporate external contributions at this stage, your input is valuable and helps shape the direction of Birder. I'll update this section if the contribution policy changes in the future.
Thank you for your understanding and interest in Birder!
Licenses
Code
The code in this project is licensed under Apache 2.0. See LICENSE for details. Some code is adapted from other projects; notices with links to the original sources appear at the top of the relevant file or next to the specific class or function. It is your responsibility to ensure compliance with these licenses and with the conditions of any dependent licenses.
If you think we've missed a reference or a license, please create an issue.
Pretrained Weights
Some of the pretrained weights available here are pretrained on ImageNet. ImageNet was released for non-commercial research purposes only (https://image-net.org/download). It's not clear what the implications of that are for the use of pretrained weights from that dataset. It's best to seek legal advice if you intend to use the pretrained weights in a commercial product.
Disclaimer
If you intend to use Birder, its pretrained weights, or any associated datasets in a commercial product, we strongly recommend seeking legal advice to ensure compliance with all relevant licenses and terms of use.
It's the user's responsibility to ensure that their use of this project, including any pretrained weights or datasets, complies with all applicable licenses and legal requirements.
Acknowledgments
Birder owes much to the work of others in computer vision, machine learning, and ornithology.
Special thanks to:
Ross Wightman: His work on PyTorch Image Models (timm) greatly inspired the design and approach of Birder.
Image Contributors:
- Yaron Schmid, from YS Wildlife, for their generous donations of bird photographs.
This project also benefits from numerous open-source libraries and ornithological resources.
If any attribution is missing, please open an issue to let us know.
Owner
- Name: Birder Project
- Login: birder-project
- Kind: organization
- Repositories: 1
- Profile: https://github.com/birder-project
Citation (CITATION.cff)
```yaml
# This CITATION.cff file was generated with cffinit.
cff-version: 1.2.0
title: Birder
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Ofer
    family-names: Hasson
repository-code: 'https://gitlab.com/birder/birder'
url: 'https://birder.gitlab.io/birder/'
repository-artifact: 'https://pypi.org/project/birder/'
abstract: >-
  An open-source computer vision framework for wildlife
  image analysis, focusing on avian species.
keywords:
  - computer-vision
  - image-classification
  - object-detection
  - self-supervised-learning
license: Apache-2.0
```
GitHub Events
Total
- Issues event: 4
- Watch event: 12
- Delete event: 44
- Issue comment event: 2
- Push event: 250
- Create event: 92
Last Year
- Issues event: 4
- Watch event: 12
- Delete event: 44
- Issue comment event: 2
- Push event: 250
- Create event: 92
Dependencies
- MonkeyType * development
- altair * development
- bandit * development
- black * development
- build * development
- bumpver * development
- captum * development
- coverage * development
- flake8 * development
- flake8-pep585 * development
- invoke * development
- ipython * development
- isort * development
- mkdocs * development
- mypy * development
- parameterized * development
- pylint * development
- requests * development
- setuptools * development
- tensorboard * development
- torchprofile ==0.0.4 development
- twine * development
- types-requests * development
- wheel * development
- ipykernel *
- ipywidgets *
- jupyterlab *
- jupyterlab-code-formatter *
- jupytext *
- nbconvert *
- torch ==2.4.1
- torchaudio ==2.4.1
- torchvision ==0.19.1
- Pillow *
- matplotlib *
- numpy *
- onnx *
- polars *
- pyarrow *
- pycocotools *
- rich *
- scikit-learn *
- scipy *
- torch-model-archiver *
- torch-workflow-archiver *
- torchinfo *
- torchmetrics *
- torchserve *
- tqdm *
- urllib3 *
- webdataset *
- Jinja2 * development
- MonkeyType * development
- altair * development
- bandit * development
- black * development
- build * development
- bumpver * development
- captum * development
- coverage * development
- flake8 * development
- flake8-pep585 * development
- invoke * development
- ipython * development
- isort * development
- mkdocs * development
- mkdocs-exclude * development
- mypy * development
- parameterized * development
- pylint * development
- requests * development
- safetensors * development
- setuptools * development
- torchprofile ==0.0.4 development
- twine * development
- types-requests * development
- wheel * development
- huggingface_hub *
- transformers *