mmengine

OpenMMLab Foundational Library for Training Deep Learning Models

https://github.com/open-mmlab/mmengine

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
    4 of 141 committers (2.8%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.0%) to scientific vocabulary

Keywords

ai computer-vision deep-learning machine-learning python pytorch

Keywords from Contributors

swin-transformer transformer realtime-segmentation pspnet medical-image-segmentation deeplabv3 retinal-vessel-segmentation self-supervised-learning semantic-segmentation vessel-segmentation
Last synced: 6 months ago

Repository

OpenMMLab Foundational Library for Training Deep Learning Models

Basic Info
Statistics
  • Stars: 1,366
  • Watchers: 24
  • Forks: 411
  • Open Issues: 249
  • Releases: 29
Topics
ai computer-vision deep-learning machine-learning python pytorch
Created about 4 years ago · Last pushed 6 months ago
Metadata Files
Readme Contributing License Code of conduct Citation

README.md

 
 
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/mmengine)](https://pypi.org/project/mmengine/) [![pytorch](https://img.shields.io/badge/pytorch-1.6~2.1-yellow)](#installation) [![PyPI](https://img.shields.io/pypi/v/mmengine)](https://pypi.org/project/mmengine) [![license](https://img.shields.io/github/license/open-mmlab/mmengine.svg)](https://github.com/open-mmlab/mmengine/blob/main/LICENSE)

[Introduction](#introduction) | [Installation](#installation) | [Get Started](#get-started) | [📘Documentation](https://mmengine.readthedocs.io/en/latest/) | [🤔Reporting Issues](https://github.com/open-mmlab/mmengine/issues/new/choose)
English | [简体中文](README_zh-CN.md)

What's New

v0.10.6 was released on 2025-01-13.

Highlights:

  • Support custom artifact_location in MLflowVisBackend #1505
  • Enable exclude_frozen_parameters for DeepSpeedEngine._zero3_consolidated_16bit_state_dict #1517

Read Changelog for more details.

Introduction

MMEngine is a foundational library for training deep learning models based on PyTorch. It serves as the training engine of all OpenMMLab codebases, which support hundreds of algorithms across various research areas. MMEngine is also general enough to be used in non-OpenMMLab projects. Its highlights are as follows:

Integrate mainstream large-scale model training frameworks

Supports a variety of training strategies

Provides a user-friendly configuration system (see the sketch below)

Covers mainstream training monitoring platforms
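
As a small illustration of the configuration system highlighted above, here is a minimal sketch; it is not taken from this README, and the file name and config keys are made up for the example (configs may also be written in JSON or YAML). It writes a tiny Python-style config, loads it with `mmengine.config.Config`, and overrides a value.

```python
from mmengine.config import Config

# Write a tiny, purely illustrative config file (keys are hypothetical).
with open('example_config.py', 'w') as f:
    f.write("optimizer = dict(type='SGD', lr=0.001, momentum=0.9)\n"
            "train_cfg = dict(by_epoch=True, max_epochs=5)\n")

cfg = Config.fromfile('example_config.py')   # parse the config file
print(cfg.optimizer.lr)                      # nested fields support attribute access
cfg.merge_from_dict({'optimizer.lr': 0.01})  # override a value programmatically
print(cfg.optimizer.lr)
```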

Installation

Supported PyTorch Versions

| MMEngine           | PyTorch      | Python         |
| ------------------ | ------------ | -------------- |
| main               | >=1.6, <=2.1 | >=3.8, <=3.11  |
| >=0.9.0, <=0.10.4  | >=1.6, <=2.1 | >=3.8, <=3.11  |

Before installing MMEngine, please ensure that PyTorch has been successfully installed following the official guide.
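
To compare your environment against the table above, a quick check can help (a minimal sketch; it only prints the installed Python and PyTorch versions):

```python
import sys

import torch

# Print the Python and PyTorch versions currently installed, to be compared
# with the supported-versions table above before installing MMEngine.
print('Python:', '.'.join(map(str, sys.version_info[:3])))
print('PyTorch:', torch.__version__)
```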

Install MMEngine

```bash
pip install -U openmim
mim install mmengine
```

Verify the installation

```bash
python -c 'from mmengine.utils.dl_utils import collect_env;print(collect_env())'
```
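
Alternatively, a quick check from Python (a minimal sketch that only confirms the package imports and reports its version):

```python
# Confirm that mmengine is importable and print the installed version.
import mmengine

print(mmengine.__version__)
```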

Get Started

Taking the training of a ResNet-50 model on the CIFAR-10 dataset as an example, we will use MMEngine to build a complete, configurable training and validation process in less than 80 lines of code.

Build Models

First, we need to define a **model** which 1) inherits from `BaseModel` and 2) accepts an additional argument `mode` in the `forward` method, in addition to those arguments related to the dataset.

  • During training, the value of `mode` is "loss", and the `forward` method should return a `dict` containing the key "loss".
  • During validation, the value of `mode` is "predict", and the `forward` method should return results containing both predictions and labels.

```python
import torch.nn.functional as F
import torchvision
from mmengine.model import BaseModel


class MMResNet50(BaseModel):

    def __init__(self):
        super().__init__()
        self.resnet = torchvision.models.resnet50()

    def forward(self, imgs, labels, mode):
        x = self.resnet(imgs)
        if mode == 'loss':
            return {'loss': F.cross_entropy(x, labels)}
        elif mode == 'predict':
            return x, labels
```
Build Datasets

Next, we need to create **Dataset**s and **DataLoader**s for training and validation. In this case, we simply use built-in datasets supported in TorchVision.

```python
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

norm_cfg = dict(mean=[0.491, 0.482, 0.447], std=[0.202, 0.199, 0.201])

train_dataloader = DataLoader(
    batch_size=32,
    shuffle=True,
    dataset=torchvision.datasets.CIFAR10(
        'data/cifar10',
        train=True,
        download=True,
        transform=transforms.Compose([
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
            transforms.Normalize(**norm_cfg)
        ])))

val_dataloader = DataLoader(
    batch_size=32,
    shuffle=False,
    dataset=torchvision.datasets.CIFAR10(
        'data/cifar10',
        train=False,
        download=True,
        transform=transforms.Compose([
            transforms.ToTensor(),
            transforms.Normalize(**norm_cfg)
        ])))
```
Build Metrics

To validate and test the model, we need to define a **Metric** called accuracy to evaluate the model. This metric needs to inherit from `BaseMetric` and implement the `process` and `compute_metrics` methods.

```python
from mmengine.evaluator import BaseMetric


class Accuracy(BaseMetric):

    def process(self, data_batch, data_samples):
        score, gt = data_samples
        # Save the results of a batch to `self.results`
        self.results.append({
            'batch_size': len(gt),
            'correct': (score.argmax(dim=1) == gt).sum().cpu(),
        })

    def compute_metrics(self, results):
        total_correct = sum(item['correct'] for item in results)
        total_size = sum(item['batch_size'] for item in results)
        # Return a dictionary with the results of the evaluated metrics,
        # where the key is the name of the metric
        return dict(accuracy=100 * total_correct / total_size)
```
Build a Runner

Finally, we can construct a **Runner** with the previously defined `Model`, `DataLoader`, and `Metrics`, together with some other configs, as shown below.

```python
from torch.optim import SGD
from mmengine.runner import Runner

runner = Runner(
    model=MMResNet50(),
    work_dir='./work_dir',
    train_dataloader=train_dataloader,
    # a wrapper to execute back propagation and gradient update, etc.
    optim_wrapper=dict(optimizer=dict(type=SGD, lr=0.001, momentum=0.9)),
    # set some training configs like epochs
    train_cfg=dict(by_epoch=True, max_epochs=5, val_interval=1),
    val_dataloader=val_dataloader,
    val_cfg=dict(),
    val_evaluator=dict(type=Accuracy),
)
```
Launch Training

```python
runner.train()
```
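
Since the `Runner` above was built with a validation dataloader and evaluator, it can also run a standalone validation pass after (or instead of) training. A minimal sketch, assuming the Runner configuration shown earlier; the returned value is expected to be the metric dict produced by the evaluator, though details may vary across versions:

```python
# Run a standalone validation pass with the dataloader and evaluator
# already attached to the Runner above.
metrics = runner.val()
print(metrics)  # e.g. a dict containing the `accuracy` metric
```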

Learn More

Tutorials
  • [Runner](https://mmengine.readthedocs.io/en/latest/tutorials/runner.html)
  • [Dataset and DataLoader](https://mmengine.readthedocs.io/en/latest/tutorials/dataset.html)
  • [Model](https://mmengine.readthedocs.io/en/latest/tutorials/model.html)
  • [Evaluation](https://mmengine.readthedocs.io/en/latest/tutorials/evaluation.html)
  • [OptimWrapper](https://mmengine.readthedocs.io/en/latest/tutorials/optim_wrapper.html)
  • [Parameter Scheduler](https://mmengine.readthedocs.io/en/latest/tutorials/param_scheduler.html)
  • [Hook](https://mmengine.readthedocs.io/en/latest/tutorials/hook.html)
Advanced tutorials
  • [Registry](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/registry.html)
  • [Config](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/config.html)
  • [BaseDataset](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/basedataset.html)
  • [Data Transform](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/data_transform.html)
  • [Weight Initialization](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/initialize.html)
  • [Visualization](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/visualization.html)
  • [Abstract Data Element](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/data_element.html)
  • [Distribution Communication](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/distributed.html)
  • [Logging](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/logging.html)
  • [File IO](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/fileio.html)
  • [Global manager (ManagerMixin)](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/manager_mixin.html)
  • [Use modules from other libraries](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/cross_library.html)
  • [Test Time Augmentation](https://mmengine.readthedocs.io/en/latest/advanced_tutorials/test_time_augmentation.html)
Examples
  • [Train a GAN](https://mmengine.readthedocs.io/en/latest/examples/train_a_gan.html)
Common Usage
  • [Resume Training](https://mmengine.readthedocs.io/en/latest/common_usage/resume_training.html)
  • [Speed up Training](https://mmengine.readthedocs.io/en/latest/common_usage/speed_up_training.html)
  • [Save Memory on GPU](https://mmengine.readthedocs.io/en/latest/common_usage/save_gpu_memory.html)
Design
  • [Hook](https://mmengine.readthedocs.io/en/latest/design/hook.html)
  • [Runner](https://mmengine.readthedocs.io/en/latest/design/runner.html)
  • [Evaluation](https://mmengine.readthedocs.io/en/latest/design/evaluation.html)
  • [Visualization](https://mmengine.readthedocs.io/en/latest/design/visualization.html)
  • [Logging](https://mmengine.readthedocs.io/en/latest/design/logging.html)
  • [Infer](https://mmengine.readthedocs.io/en/latest/design/infer.html)
Migration guide
  • [Migrate Runner from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/runner.html)
  • [Migrate Hook from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/hook.html)
  • [Migrate Model from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/model.html)
  • [Migrate Parameter Scheduler from MMCV to MMEngine](https://mmengine.readthedocs.io/en/latest/migration/param_scheduler.html)
  • [Migrate Data Transform to OpenMMLab 2.0](https://mmengine.readthedocs.io/en/latest/migration/transform.html)

Contributing

We appreciate all contributions to improve MMEngine. Please refer to CONTRIBUTING.md for the contributing guidelines.

Citation

If you find this project useful in your research, please consider citing it:

@article{mmengine2022,
  title        = {{MMEngine}: OpenMMLab Foundational Library for Training Deep Learning Models},
  author       = {MMEngine Contributors},
  howpublished = {\url{https://github.com/open-mmlab/mmengine}},
  year         = {2022}
}

License

This project is released under the Apache 2.0 license.

Ecosystem

Projects in OpenMMLab

  • MIM: MIM installs OpenMMLab packages.
  • MMCV: OpenMMLab foundational library for computer vision.
  • MMEval: A unified evaluation library for multiple machine learning libraries.
  • MMPreTrain: OpenMMLab pre-training toolbox and benchmark.
  • MMagic: OpenMMLab Advanced, Generative and Intelligent Creation toolbox.
  • MMDetection: OpenMMLab detection toolbox and benchmark.
  • MMYOLO: OpenMMLab YOLO series toolbox and benchmark.
  • MMDetection3D: OpenMMLab's next-generation platform for general 3D object detection.
  • MMRotate: OpenMMLab rotated object detection toolbox and benchmark.
  • MMTracking: OpenMMLab video perception toolbox and benchmark.
  • MMPose: OpenMMLab pose estimation toolbox and benchmark.
  • MMSegmentation: OpenMMLab semantic segmentation toolbox and benchmark.
  • MMOCR: OpenMMLab text detection, recognition, and understanding toolbox.
  • MMHuman3D: OpenMMLab 3D human parametric model toolbox and benchmark.
  • MMSelfSup: OpenMMLab self-supervised learning toolbox and benchmark.
  • MMFewShot: OpenMMLab fewshot learning toolbox and benchmark.
  • MMAction2: OpenMMLab's next-generation action understanding toolbox and benchmark.
  • MMFlow: OpenMMLab optical flow toolbox and benchmark.
  • MMDeploy: OpenMMLab model deployment framework.
  • MMRazor: OpenMMLab model compression toolbox and benchmark.
  • Playground: A central hub for gathering and showcasing amazing projects built upon OpenMMLab.

Owner

  • Name: OpenMMLab
  • Login: open-mmlab
  • Kind: organization
  • Location: China

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - name: "MMEngine Contributors"
title: "OpenMMLab Foundational Library for Training Deep Learning Models"
date-released: 2022-09-01
url: "https://github.com/open-mmlab/mmengine"
license: Apache-2.0

GitHub Events

Total
  • Create event: 6
  • Release event: 1
  • Issues event: 37
  • Watch event: 184
  • Member event: 1
  • Issue comment event: 62
  • Push event: 23
  • Pull request review comment event: 4
  • Pull request review event: 7
  • Pull request event: 54
  • Fork event: 68
Last Year
  • Create event: 6
  • Release event: 1
  • Issues event: 37
  • Watch event: 184
  • Member event: 1
  • Issue comment event: 62
  • Push event: 23
  • Pull request review comment event: 4
  • Pull request review event: 7
  • Pull request event: 54
  • Fork event: 68

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 896
  • Total Committers: 141
  • Avg Commits per committer: 6.355
  • Development Distribution Score (DDS): 0.68
Past Year
  • Commits: 17
  • Committers: 6
  • Avg Commits per committer: 2.833
  • Development Distribution Score (DDS): 0.471
Top Committers
Name Email Commits
Mashiro 5****E 287
Zaida Zhou 5****a 160
RangiLyu l****i@g****m 63
Qian Zhao 1****9 27
fanqiNO1 7****1 18
liukuikun 2****k 16
Haian Huang(深度眸) 1****9@q****m 16
Yuan Liu 3****u 12
Ma Zerun m****6@1****m 12
Wenwei Zhang 4****e 11
Jiazhen Wang 4****1 11
Zhihao Lin 3****a 10
Yining Li l****2@g****m 10
Tao Gong g****3@g****m 10
Alex Yang 5****r 9
Xiangxu-0103 x****3@g****m 9
Xin Li 7****7 8
takuoko t****0@g****m 7
Sanbu 9****y 6
Yifei Yang 2****5@q****m 6
jbwang1997 j****7@g****m 6
vansin m****e@1****m 6
Infinity_lee l****5@g****m 4
yancong 3****g 4
Epiphany 9****y 4
Range King R****Z@g****m 4
jason_w w****g@1****m 4
whcao 4****h 3
sjiang95 5****5 3
shenmishajing s****g@G****m 3
and 111 more...
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 209
  • Total pull requests: 359
  • Average time to close issues: about 2 months
  • Average time to close pull requests: 22 days
  • Total issue authors: 152
  • Total pull request authors: 121
  • Average comments per issue: 1.58
  • Average comments per pull request: 1.47
  • Merged pull requests: 225
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 36
  • Pull requests: 59
  • Average time to close issues: 4 days
  • Average time to close pull requests: 6 days
  • Issue authors: 32
  • Pull request authors: 27
  • Average comments per issue: 0.67
  • Average comments per pull request: 0.56
  • Merged pull requests: 23
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • HAOCHENYE (12)
  • zhouzaida (9)
  • anzitong (4)
  • KyanChen (4)
  • apachemycat (3)
  • whlook (3)
  • LinhanXu3928 (3)
  • mypydl (3)
  • BayMaxBHL (3)
  • RangiLyu (3)
  • Tau-J (3)
  • YinAoXiong (2)
  • AkideLiu (2)
  • collinmccarthy (2)
  • Li-Qingyun (2)
Pull Request Authors
  • HAOCHENYE (84)
  • zhouzaida (44)
  • fanqiNO1 (20)
  • LZHgrla (17)
  • tenacioustommy (8)
  • tibor-reiss (7)
  • C1rN09 (5)
  • shufanwu (4)
  • okotaku (4)
  • HIT-cwh (4)
  • lauriebax (4)
  • huaibovip (4)
  • mzr1996 (3)
  • wanghao9610 (3)
  • MGAMZ (3)
Top Labels
Issue Labels
bug (86) good first issue (3) community collaboration (2)
Pull Request Labels
enhancement (5) refactoring (4) awating-CI-pass (2) community discussion (1) NPU (1) Bug:P0 (1) pick2flexiblerunner (1) Bug:P1 (1) ready (1) P1 (1) need-design (1)

Packages

  • Total packages: 6
  • Total downloads:
    • pypi 376,243 last-month
  • Total docker downloads: 5,349
  • Total dependent packages: 42
    (may contain duplicates)
  • Total dependent repositories: 454
    (may contain duplicates)
  • Total versions: 70
  • Total maintainers: 3
pypi.org: mmengine

Engine of OpenMMLab projects

  • Versions: 30
  • Dependent Packages: 40
  • Dependent Repositories: 454
  • Downloads: 347,507 Last month
  • Docker Downloads: 5,349
Rankings
Dependent packages count: 0.5%
Dependent repos count: 0.7%
Downloads: 0.8%
Average: 1.7%
Stargazers count: 2.3%
Docker downloads count: 2.8%
Forks count: 3.3%
Maintainers (1)
Last synced: 6 months ago
proxy.golang.org: github.com/open-mmlab/mmengine
  • Versions: 29
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent packages count: 6.5%
Average: 6.7%
Dependent repos count: 6.9%
Last synced: 6 months ago
pypi.org: mmstat

OpenMMLab Stats Engine

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 16 Last month
Rankings
Stargazers count: 2.8%
Forks count: 4.0%
Dependent packages count: 6.6%
Average: 15.4%
Dependent repos count: 30.6%
Downloads: 33.1%
Maintainers (1)
Last synced: 6 months ago
pypi.org: mmbi

OpenMMLab Stats Engine

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 17 Last month
Rankings
Stargazers count: 2.8%
Forks count: 4.0%
Dependent packages count: 6.6%
Average: 15.4%
Dependent repos count: 30.6%
Downloads: 33.1%
Maintainers (1)
Last synced: 6 months ago
pypi.org: mmengine-open

Engine of OpenMMLab projects

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 52 Last month
Rankings
Stargazers count: 2.7%
Forks count: 3.8%
Dependent packages count: 10.6%
Average: 19.2%
Dependent repos count: 59.7%
Maintainers (1)
Last synced: 6 months ago
pypi.org: mmengine-lite

Engine of OpenMMLab projects

  • Versions: 8
  • Dependent Packages: 2
  • Dependent Repositories: 0
  • Downloads: 28,651 Last month
Rankings
Dependent packages count: 10.0%
Average: 38.8%
Dependent repos count: 67.6%
Maintainers (1)
Last synced: 6 months ago

Dependencies

requirements/docs.txt pypi
  • docutils ==0.16.0
  • myst-parser *
  • opencv-python *
  • sphinx ==4.0.2
  • sphinx-copybutton *
  • sphinx_markdown_tables *
  • torch *
  • torchvision *
requirements/runtime.txt pypi
  • addict *
  • matplotlib *
  • numpy *
  • pyyaml *
  • regex *
  • termcolor *
  • yapf *
requirements/tests.txt pypi
  • coverage * test
  • lmdb * test
  • pytest * test
.github/workflows/deploy.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/lint.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
.github/workflows/merge_stage_test.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
  • codecov/codecov-action v3 composite
.github/workflows/pr_stage_test.yml actions
  • actions/checkout v3 composite
  • actions/setup-python v4 composite
  • codecov/codecov-action v3 composite
.circleci/docker/Dockerfile docker
  • pytorch/pytorch ${PYTORCH}-cuda${CUDA}-cudnn${CUDNN}-devel build
docker/dev/Dockerfile docker
  • pytorch/pytorch ${PYTORCH}-cuda${CUDA}-cudnn${CUDNN}-devel build
docker/release/Dockerfile docker
  • pytorch/pytorch ${PYTORCH}-cuda${CUDA}-cudnn${CUDNN}-devel build
requirements.txt pypi
setup.py pypi
requirements/docs_extra.txt pypi
  • deepspeed *
requirements/runtime_lite.txt pypi
  • addict *
  • numpy *
  • pyyaml *
  • regex *
  • rich *
  • termcolor *
  • yapf *
requirements/tests_lite.txt pypi
  • lmdb * test
  • parameterized * test
  • pytest * test