Knowledge Distillation for Semantic Segmentation: A Label Space Unification Approach

This is the companion repository to our paper "Knowledge Distillation for Semantic Segmentation: A Label Space Unification Approach".

(Figure: method overview)

Installation

Requirements

  • python >= 3.10
  • pytorch >= 2.3
  • accelerate >= 1.2
  • transformers >= 4.45.0
$ pip install -r requirements.txt

Hugging Face accelerate is a wrapper used mainly for multi-GPU and half-precision training. You can adjust its settings before training (recommended for faster training) or skip this step:

$ accelerate config
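
For context, accelerate wraps a standard PyTorch loop using the pattern sketched below. This is an illustrative sketch only, not the repository's train.py; the model, data, and loss are placeholders.

import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()  # reads the settings saved by `accelerate config`

# Placeholder model, data, and optimizer for the sketch.
model = torch.nn.Linear(10, 2)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
loader = DataLoader(dataset, batch_size=8)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# accelerate moves everything to the right device(s) and precision.
model, optimizer, loader = accelerator.prepare(model, optimizer, loader)

criterion = torch.nn.CrossEntropyLoss()
for inputs, targets in loader:
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    accelerator.backward(loss)  # replaces loss.backward()
    optimizer.step()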

Download Weights

Model   Taxonomy     IoU
M2FB    GOOSE        64.4
M2FL    GOOSE        67.9
M2FB    Cityscapes   75.5
M2FL    Cityscapes   78.3
M2FL    Mapillary    52.7

Training Steps

1. Pretrain on source taxonomy

Train a standard Mask2Former on a source dataset.

$ accelerate launch train.py --config config.yaml --exper_name <experiment_name>

Track training progress in TensorBoard:

$ tensorboard --logdir experiments/<experiment_name>/logs

2. Ontology mapping

Before you can generate pseudo-labels with priors, you need to define an ontology mapping between the target and extra datasets. You can find examples in datasets/<dataset_name>/lists/master_labels_<source_dataset>.csv.
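
The exact CSV schema is defined by the files in this repository. As a rough illustration, assuming each row pairs an extra-dataset class id with a master-taxonomy class id (the column names and the concrete path below are hypothetical), applying such a mapping to a label mask could look like this:

import csv
import numpy as np

def load_mapping(csv_path: str, num_classes: int) -> np.ndarray:
    """Build a lookup table: extra-dataset class id -> master-taxonomy class id."""
    lut = np.full(num_classes, 255, dtype=np.int64)  # 255 = ignore label
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # "dataset_id" and "master_id" are assumed column names.
            lut[int(row["dataset_id"])] = int(row["master_id"])
    return lut

lut = load_mapping("datasets/goose/lists/master_labels_cityscapes.csv", 64)
label_mask = np.random.randint(0, 64, (512, 512))  # placeholder label image
master_mask = lut[label_mask]  # remap every pixel via the lookup table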

3. Generate pseudo-labels

Generate pseudo-labels using dataset priors. Use config_inference.yaml to set your labeling parameters.

$ accelerate launch inference.py
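
Conceptually, the priors restrict each extra-dataset image to the classes its ontology mapping allows. One way to picture this (a sketch of the idea, not the actual inference.py implementation) is to suppress the logits of disallowed classes before taking the per-pixel argmax:

import torch

def pseudo_label(logits: torch.Tensor, allowed: list[int]) -> torch.Tensor:
    """logits: (C, H, W) per-pixel class scores; allowed: permitted class ids."""
    mask = torch.full_like(logits, float("-inf"))
    mask[allowed] = 0.0  # keep scores only for the allowed classes
    return (logits + mask).argmax(dim=0)  # (H, W) pseudo-label map

logits = torch.randn(19, 256, 256)  # placeholder network output
labels = pseudo_label(logits, allowed=[0, 1, 5, 8])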

4. Train on the compound dataset

Now that you have generated pseudo-labels with priors, train on the compound dataset using the same training script:

$ accelerate launch train.py --config config.yaml --exper_name <experiment_name>
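
If you assemble the compound dataset manually, concatenating the source dataset with the pseudo-labeled extra data is one straightforward option. This is an illustrative sketch; the repository's actual data pipeline is driven by config.yaml and may differ.

import torch
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset

# Placeholder datasets standing in for (image, label-mask) pairs.
source_ds = TensorDataset(torch.randn(100, 3, 64, 64),
                          torch.zeros(100, 64, 64, dtype=torch.long))
pseudo_ds = TensorDataset(torch.randn(200, 3, 64, 64),
                          torch.zeros(200, 64, 64, dtype=torch.long))

# One compound dataset: shuffling mixes ground-truth and pseudo-labeled samples.
compound = ConcatDataset([source_ds, pseudo_ds])
loader = DataLoader(compound, batch_size=8, shuffle=True)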

Inference

Inference uses the same script as pseudo-label generation. Adjust config_inference.yaml to save or display images with or without priors.

$ accelerate launch inference.py

TODOs

  • Publish weights
  • Simpler config files (too many legacy parameters)
  • Code cleanup, remove legacy code
