DeepOF

DeepOF: a Python package for supervised and unsupervised pattern recognition in mice motion tracking data - Published in JOSS (2023)

https://github.com/mlfpm/deepof

Science Score: 93.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
    Found 9 DOI reference(s) in README and JOSS metadata
  • Academic publication links
    Links to: joss.theoj.org
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
    Published in Journal of Open Source Software

Scientific Fields

Mathematics / Computer Science - 88% confidence
Economics / Social Sciences - 85% confidence
Artificial Intelligence and Machine Learning / Computer Science - 45% confidence
Last synced: 4 months ago

Repository

DeepLabCut-based data analysis package including pose estimation and representation-learning-mediated behavior recognition

Basic Info
  • Host: GitHub
  • Owner: mlfpm
  • License: mit
  • Language: Python
  • Default Branch: master
  • Size: 612 MB
Statistics
  • Stars: 45
  • Watchers: 4
  • Forks: 6
  • Open Issues: 13
  • Releases: 0
Created over 5 years ago · Last pushed 4 months ago
Metadata Files
Readme · Contributing · License · Code of conduct

README.md



A suite for postprocessing time-series extracted from videos of freely moving rodents using DeepLabCut and SLEAP.

You can use this package to either extract pre-defined motifs from the time series (such as time-in-zone, climbing, basic social interactions) or to embed your data into a sequence-aware latent space to extract meaningful motifs in an unsupervised way! Both of these can be used within the package, for example, to automatically compare user-defined experimental groups. The package is compatible with single and multi-animal DLC 2.X, and SLEAP projects.

How do I start?

Installation:

The easiest way to install DeepOF is to use pip. Create and activate a virtual environment with Python >=3.9 and <3.11, for example using conda:

```bash
conda create -n deepof python=3.9
```

Then, activate the environment and install DeepOF:

```bash
conda activate deepof
pip install deepof
```

Alternatively, you can download our pre-built Docker image, which contains all compatible dependencies:

```bash
# download the latest available image
docker pull lucasmiranda42/deepof:latest

# run the image in interactive mode, enabling you to open python and import deepof
docker run -it lucasmiranda42/deepof
```

Or use poetry:

```bash
# after installing poetry and cloning the DeepOF repository, just run
poetry install  # from the main directory
```

NOTE: installation via pip is currently not compatible with Apple Silicon. If you'd like to install DeepOF on such machines, please use either poetry or Docker. You should also install hdf5 using Homebrew, as described in this issue.

Before we delve in:

DeepOF relies heavily on DeepLabCut and SLEAP output. Thorough tutorials on how to get started with pose estimation using DLC can be found here, and for SLEAP here. Once your videos are processed and tagged, you can use DeepOF to extract and annotate your motion-tracking time series. While many features in DeepOF work regardless of the set of labels used, we currently recommend using videos recorded from a top-down perspective and following our recommended set of labels (which can be found in the full documentation). Pre-trained models following this scheme, capable of recognizing either C57Bl6 mice alone or C57Bl6 and CD1 mice, can be downloaded from our repository.

Basic usage:

The main module with which you'll interact is called deepof.data. Let's import it and create a project:

```python
import deepof.data

my_deepof_project = deepof.data.Project(
    project_path=".",                       # Path where to create project files
    video_path="/path/to/videos",           # Path to DLC or SLEAP tracked videos
    table_path="/path/to/tables",           # Path to DLC or SLEAP output
    project_name="my_deepof_project",       # Name of the current project
    exp_conditions={exp_ID: exp_condition}  # Dictionary containing one or more experimental conditions per provided video
)
```

This command will create a deepof.data.Project object storing all the necessary information to start. There are many parameters that we can set here, but let's stick to the basics for now.
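For instance, exp_conditions can be a plain dictionary mapping experiment IDs to condition labels. A minimal sketch, where the IDs ("mouse_01", "mouse_02") and labels ("control", "stress") are hypothetical placeholders and are assumed to match your video and table file names:

```python
import deepof.data

# Hypothetical experiment IDs and condition labels; replace with your own.
exp_conditions = {
    "mouse_01": "control",
    "mouse_02": "stress",
}

my_deepof_project = deepof.data.Project(
    project_path=".",
    video_path="/path/to/videos",
    table_path="/path/to/tables",
    project_name="my_deepof_project",
    exp_conditions=exp_conditions,
)
```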

Once you have this, you can run your project using the .create() method, which will do quite a lot of computing under the hood (load your data, smooth your trajectories, compute distances, angles, and areas between body parts, and save all results to disk). The returned object belongs to the deepof.data.Coordinates class.

```python
my_deepof_project = my_deepof_project.create(verbose=True)
```

Once you have this, you can do several things! But let's first explore how the results of those computations mentioned are stored. To extract trajectories, distances, angles and/or areas, you can respectively type:

```python
my_deepof_project_coords = my_deepof_project.get_coords(center="Center", polar=False, align="Nose", speed=0)
my_deepof_project_dists = my_deepof_project.get_distances(speed=0)
my_deepof_project_angles = my_deepof_project.get_angles(speed=0)
my_deepof_project_areas = my_deepof_project.get_areas(speed=0)
```

Here, the data are stored as deepof.data.TableDict instances. These are very similar to Python dictionaries with experiment IDs as keys and pandas.DataFrame objects as values, with a few extra methods for convenience. Looking at the parameters in the code block above: center centers your data (it can be either a boolean or one of the body parts in your model, in which case the coordinate origin is fixed to the position of that point); polar makes the .get_coords() method return polar instead of Cartesian coordinates; and speed indicates the derivation level to apply (0 is position-based, 1 speed, 2 acceleration, 3 jerk, etc.). Regarding align and align_inplace, they take care of aligning the animal position to the y Cartesian axis: if we center the data to "Center" and set align="Nose", align_inplace=True, all frames in the video will be aligned in a way that keeps the Center-Nose axis fixed. This is useful to constrain the set of movements that one can extract with our unsupervised methods.
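To make these options more concrete, here is a small sketch of some parameter combinations using the methods and arguments described above (variable names are illustrative, and defaults are assumed for the remaining parameters):

```python
# Polar instead of Cartesian coordinates
polar_coords = my_deepof_project.get_coords(polar=True)

# First derivative of position (speed) instead of position itself
speed_coords = my_deepof_project.get_coords(speed=1)

# Coordinates centered on "Center" and aligned so the Center-Nose axis stays fixed
aligned_coords = my_deepof_project.get_coords(center="Center", align="Nose", align_inplace=True)
```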

As mentioned above, the two main analyses that you can run are supervised and unsupervised. They are executed by the .supervised_annotation() and .deep_unsupervised_embedding() methods of the deepof.data.Coordinates class, respectively.

```python
supervised_annot = my_deepof_project.supervised_annotation()
gmvae_embedding = my_deepof_project.deep_unsupervised_embedding()
```

The former returns a deepof.data.TableDict object, with a pandas.DataFrame per experiment containing a series of annotations. The latter is a bit more complicated: it returns a series of objects that depend on the model selected (we offer three flavours of deep clustering models), and allow for further analysis comparing cluster expression and dynamics.
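Since the supervised output behaves much like a dictionary of pandas.DataFrame objects, a minimal way to inspect it could look as follows (a sketch, assuming the standard dictionary interface; the actual annotation columns depend on your project):

```python
# Peek at the supervised annotations for each experiment
for exp_id, annotations in supervised_annot.items():
    print(exp_id, annotations.shape)
    print(annotations.head())  # one row per frame, one column per annotated behavior
```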

That's it for this (very basic) introduction. Check out the tutorials and full documentation for details!


Cite us!

If you use DeepOF for your research, please consider citing the following:

```bibtex
@article{DeepOF:JOSS,
  title   = {{DeepOF: a Python package for supervised and unsupervised pattern recognition in mice motion tracking data}},
  year    = {2023},
  journal = {Journal of Open Source Software},
  author  = {Miranda, Lucas and Bordes, Joeri and P{\"{u}}tz, Benno and Schmidt, Mathias V. and M{\"{u}}ller-Myhsok, Bertram},
  number  = {86},
  month   = {6},
  pages   = {5394},
  volume  = {8},
  url     = {https://joss.theoj.org/papers/10.21105/joss.05394},
  doi     = {10.21105/JOSS.05394},
  issn    = {2475-9066}
}
```

```bibtex
@article{DeepOF:NCOMMS,
  doi       = {10.1038/s41467-023-40040-3},
  url       = {https://doi.org/10.1038/s41467-023-40040-3},
  year      = {2023},
  month     = jul,
  publisher = {Springer Science and Business Media {LLC}},
  volume    = {14},
  number    = {1},
  author    = {Joeri Bordes and Lucas Miranda and Maya Reinhardt and Sowmya Narayan and Jakob Hartmann and Emily L. Newman and Lea Maria Brix and Lotte van Doeselaar and Clara Engelhardt and Larissa Dillmann and Shiladitya Mitra and Kerry J. Ressler and Benno P\"{u}tz and Felix Agakov and Bertram M\"{u}ller-Myhsok and Mathias V. Schmidt},
  title     = {Automatically annotated motion tracking identifies a distinct social behavioral profile following chronic social defeat stress},
  journal   = {Nature Communications}
}
```

All data and code used to generate the results in the NCOMMS paper are available here (password: DeepOF2023).


Issues

If you encounter any problems while using this package, please open an issue in the issue tracker.


Contributions

We welcome contributions from the community! If you want to contribute to this project, please check out our contribution guidelines.


This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 813533.


Owner

  • Name: MLFPM
  • Login: mlfpm
  • Kind: organization

GitHub repository of the H2020 ITN 'Machine Learning Frontiers in Precision Medicine'

JOSS Publication

DeepOF: a Python package for supervised and unsupervised pattern recognition in mice motion tracking data
Published
June 12, 2023
Volume 8, Issue 86, Page 5394
Authors
Lucas Miranda ORCID
Research Group Statistical Genetics, Max Planck Institute of Psychiatry, Munich, Germany
Joeri Bordes ORCID
Research Group Neurobiology of Stress Resilience, Max Planck Institute of Psychiatry, Munich, Germany
Benno Pütz ORCID
Research Group Statistical Genetics, Max Planck Institute of Psychiatry, Munich, Germany
Mathias V. Schmidt ORCID
Research Group Neurobiology of Stress Resilience, Max Planck Institute of Psychiatry, Munich, Germany
Bertram Müller-Myhsok ORCID
Research Group Statistical Genetics, Max Planck Institute of Psychiatry, Munich, Germany
Editor
Elizabeth DuPre ORCID
Tags
biology neuroscience behavioral annotation

GitHub Events

Total
  • Issues event: 16
  • Watch event: 8
  • Issue comment event: 65
  • Push event: 223
  • Create event: 8
Last Year
  • Issues event: 16
  • Watch event: 8
  • Issue comment event: 65
  • Push event: 223
  • Create event: 8

Committers

Last synced: 7 months ago

All Time
  • Total Commits: 2,622
  • Total Committers: 3
  • Avg Commits per committer: 874.0
  • Development Distribution Score (DDS): 0.075
Past Year
  • Commits: 222
  • Committers: 3
  • Avg Commits per committer: 74.0
  • Development Distribution Score (DDS): 0.55
Top Committers
Name Email Commits
lucas_miranda l****2@g****m 2,425
Patrick S p****3@y****e 121
Patrick TS p****3@o****e 76
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 4 months ago

All Time
  • Total issues: 59
  • Total pull requests: 0
  • Average time to close issues: about 1 month
  • Average time to close pull requests: N/A
  • Total issue authors: 36
  • Total pull request authors: 0
  • Average comments per issue: 3.88
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 11
  • Pull requests: 0
  • Average time to close issues: 14 days
  • Average time to close pull requests: N/A
  • Issue authors: 7
  • Pull request authors: 0
  • Average comments per issue: 3.64
  • Average comments per pull request: 0
  • Merged pull requests: 0
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • edeno (8)
  • cellistigs (6)
  • guodong0912 (3)
  • juloder (3)
  • maxcaa (3)
  • rosalba3574 (2)
  • chemarestrepoCB (2)
  • joeribordes (2)
  • yukin0mura (2)
  • Lucas97223 (2)
  • marccanela (1)
  • neuraldino (1)
  • Xiangshougudu (1)
  • lisadiez (1)
  • rpace13 (1)
Pull Request Authors
Top Labels
Issue Labels
enhancement (10) good first issue (4)
Pull Request Labels

Packages

  • Total packages: 1
  • Total downloads:
    • pypi 282 last-month
  • Total dependent packages: 0
  • Total dependent repositories: 1
  • Total versions: 30
  • Total maintainers: 1
pypi.org: deepof
  • Versions: 30
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 282 Last month
Rankings
Dependent packages count: 10.0%
Average: 16.9%
Downloads: 19.0%
Dependent repos count: 21.8%
Maintainers (1)
Last synced: 4 months ago

Dependencies

Dockerfile docker
  • python 3.9.14 build
poetry.lock pypi
  • 232 dependencies
requirements.txt pypi
  • absl-py ==1.2.0
  • appnope ==0.1.3
  • asttokens ==2.0.8
  • astunparse ==1.6.3
  • av ==10.0.0
  • backcall ==0.2.0
  • blosc2 ==2.0.0
  • cachetools ==5.2.0
  • catboost ==1.1
  • certifi ==2022.9.24
  • cffi ==1.15.1
  • charset-normalizer ==2.1.1
  • cloudpickle ==2.2.0
  • colorama ==0.4.5
  • contourpy ==1.0.5
  • cycler ==0.11.0
  • cython ==0.29.33
  • dask ==2022.9.2
  • dask-image ==2022.9.0
  • debugpy ==1.6.3
  • decorator ==5.1.1
  • dm-tree ==0.1.7
  • entrypoints ==0.4
  • et-xmlfile ==1.1.0
  • executing ==1.1.0
  • flatbuffers ==23.1.4
  • fonttools ==4.37.4
  • fsspec ==2022.8.2
  • gast ==0.4.0
  • google-auth ==2.12.0
  • google-auth-oauthlib ==0.4.6
  • google-pasta ==0.2.0
  • graphviz ==0.20.1
  • grpcio ==1.49.1
  • h5py ==3.7.0
  • hmmlearn ==0.2.8
  • idna ==3.4
  • imageio ==2.22.1
  • imageio-ffmpeg ==0.4.7
  • imbalanced-learn ==0.9.1
  • imblearn ==0.0
  • importlib-metadata ==5.0.0
  • ipykernel ==6.16.0
  • ipython ==8.5.0
  • ipywidgets ==8.0.2
  • jedi ==0.18.1
  • joblib ==1.2.0
  • jupyter-client ==7.3.5
  • jupyter-core ==4.11.1
  • jupyterlab-widgets ==3.0.3
  • kaleido ==0.2.1
  • keras ==2.11.0
  • keras-tcn ==3.5.0
  • keras-tcn-macos ==1.0
  • keras-tuner ==1.1.3
  • kiwisolver ==1.4.4
  • kt-legacy ==1.0.4
  • libclang ==14.0.6
  • llvmlite ==0.39.1
  • locket ==1.0.0
  • lxml ==4.9.1
  • markdown ==3.4.1
  • markupsafe ==2.1.1
  • matplotlib ==3.6.0
  • matplotlib-inline ==0.1.6
  • msgpack ==1.0.4
  • nest-asyncio ==1.5.6
  • networkx ==2.8.7
  • numba ==0.56.2
  • numexpr ==2.8.4
  • numpy ==1.23.3
  • oauthlib ==3.2.1
  • opencv-python ==4.6.0.66
  • openpyxl ==3.0.10
  • opt-einsum ==3.3.0
  • packaging ==21.3
  • pandas ==1.5.0
  • parso ==0.8.3
  • partd ==1.3.0
  • patsy ==0.5.3
  • pexpect ==4.8.0
  • pickleshare ==0.7.5
  • pillow ==9.2.0
  • pims ==0.6.1
  • plotly ==5.10.0
  • pot ==0.8.2
  • prompt-toolkit ==3.0.31
  • protobuf ==3.19.6
  • psutil ==5.9.2
  • ptyprocess ==0.7.0
  • pure-eval ==0.2.2
  • py ==1.11.0
  • py-cpuinfo ==9.0.0
  • pyasn1 ==0.4.8
  • pyasn1-modules ==0.2.8
  • pycparser ==2.21
  • pygments ==2.13.0
  • pynndescent ==0.5.7
  • pyparsing ==3.0.9
  • python-dateutil ==2.8.2
  • pytz ==2022.4
  • pywin32 ==304
  • pyyaml ==6.0
  • pyzmq ==24.0.1
  • regex ==2022.9.13
  • requests ==2.28.1
  • requests-oauthlib ==1.3.1
  • rsa ==4.9
  • ruptures ==1.1.7
  • scikit-learn ==1.1.2
  • scipy ==1.9.1
  • seaborn ==0.11.2
  • seglearn ==1.2.5
  • setuptools ==59.8.0
  • setuptools-scm ==7.0.5
  • shap ==0.41.0
  • shapely ==1.8.4
  • six ==1.16.0
  • slicer ==0.0.7
  • slicerator ==1.1.0
  • stack-data ==0.5.1
  • statannotations ==0.5.0
  • statsmodels ==0.13.2
  • tables ==3.8.0
  • tenacity ==8.1.0
  • tensorboard ==2.11.2
  • tensorboard-data-server ==0.6.1
  • tensorboard-plugin-wit ==1.8.1
  • tensorflow ==2.11.0
  • tensorflow-addons ==0.18.0
  • tensorflow-estimator ==2.11.0
  • tensorflow-io-gcs-filesystem ==0.27.0
  • tensorflow-macos ==2.11.0
  • tensorflow-probability ==0.17.0
  • termcolor ==2.0.1
  • threadpoolctl ==3.1.0
  • tifffile ==2022.8.12
  • tomli ==2.0.1
  • toolz ==0.12.0
  • tornado ==6.2
  • tqdm ==4.64.1
  • traitlets ==5.4.0
  • typeguard ==2.13.3
  • typing-extensions ==4.4.0
  • umap-learn ==0.5.3
  • urllib3 ==1.26.12
  • wcwidth ==0.2.5
  • werkzeug ==2.2.2
  • wheel ==0.37.1
  • widgetsnbextension ==4.0.3
  • wrapt ==1.14.1
  • zipp ==3.8.1
setup.py pypi
  • pkg.replace *