
OpenMixup Release V0.2.0

@Lupin1998 released this 08 Apr 16:22
  · 120 commits to main since this release

Highlights

  • Support various popular backbones (ConvNets and ViTs), image datasets, mixup methods, and benchmarks for supervised learning. Reorganized config files are available.
  • Support popular self-supervised methods (e.g., BYOL, MoCo V3, MAE, SimMIM) on both large-scale and small-scale datasets, along with self-supervised benchmarks (merged from MMSelfSup). Reorganized config files are available.
  • Support analysis tools for self-supervised learning (kNN/SVM/linear metrics and t-SNE/UMAP visualization).
  • Convenient usage of configs: fast config generation via 'auto_train.py' and config inheritance (MMCV).
  • Support mixed-precision training (NVIDIA Apex or MMCV Apex) for all methods.
  • Model Zoos and lists of Awesome Mixups have been released.
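The config-inheritance workflow mentioned above follows the MMCV convention, where a child config declares a `_base_` file and overrides individual fields. A minimal sketch, with a hypothetical file path and field names chosen for illustration (not the repository's actual layout):

```python
# hypothetical child config, e.g. 'configs/my_mixup_experiment.py';
# '_base_' pulls in all settings from the parent config file
_base_ = '../_base_/datasets/imagenet.py'

# fields defined here override or extend the inherited ones
model = dict(type='Classification', alpha=0.2)  # illustrative values
```

Loading such a file with MMCV's `Config.fromfile` merges the parent and child dictionaries, so shared dataset or schedule settings only need to be written once.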
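For readers unfamiliar with the mixup methods these benchmarks cover, the core idea of the original mixup (Zhang et al.) is a convex combination of two samples and their labels with a Beta-sampled ratio. A minimal NumPy sketch, not OpenMixup's actual implementation:

```python
import numpy as np

def mixup(x1, x2, y1, y2, alpha=1.0, rng=None):
    """Blend two samples and their one-hot labels with a Beta(alpha, alpha) ratio."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)          # mixing ratio in [0, 1]
    x = lam * x1 + (1.0 - lam) * x2       # interpolated input
    y = lam * y1 + (1.0 - lam) * y2       # interpolated (soft) label
    return x, y, lam
```

OpenMixup's supported methods extend this basic scheme (e.g., cutting-based or learned mixing policies), but all produce a mixed input paired with a correspondingly mixed training target.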

Bug Fixes

  • Completed code refactoring following MMSelfSup and MMClassification (#3).
  • Fix mixed-precision training overflow (NaN & Inf losses in supervised mixup methods).
  • Fix fine-tuning settings for ViT and Swin Transformer, following MMSelfSup.
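The overflow fix above stems from fp16's narrow dynamic range: the largest finite float16 value is 65504, so loss or gradient values beyond that become Inf (and subsequent arithmetic produces NaN). A tiny NumPy demonstration of the failure mode and the loss-scaling idea that Apex-style mixed precision uses (illustrative only, not the actual fix in this release):

```python
import numpy as np

# float16 overflows past its max finite value (65504)
loss_fp32 = np.float32(7.0e4)
loss_fp16 = np.float16(loss_fp32)            # too large: becomes inf

# scaling the value down before casting keeps it representable,
# which is the principle behind loss scaling in fp16 training
scaled_fp16 = np.float16(loss_fp32 * 2.0 ** -4)
```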