Model-heterogeneous federated learning using knowledge distillation, evaluated on VGG-{11, 13, 16, 19} and CharCNN.

mjyoussef/HFedKD

Overview

HFedKD is a meta-algorithm for model-heterogeneous federated learning (FL) using knowledge distillation. It is designed to handle two key constraints while incurring minimal communication cost:

  1. The distribution of training samples across clients is non-IID and imbalanced (i.e., some clients may have more samples than others).
  2. Each client may have a different variant of a base model. For example, different clients may be training different versions of the VGG architecture.
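To make the distillation idea concrete, the sketch below shows a standard knowledge-distillation loss of the kind such schemes build on: a weighted combination of cross-entropy on the true labels and a temperature-scaled KL divergence toward a teacher's logits. The temperature `T` and weight `alpha` are illustrative hyperparameters, not values taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend cross-entropy on hard labels with KL divergence to a teacher."""
    # Soft targets: KL between temperature-scaled distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary supervised cross-entropy.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

Because only logits (or a small proxy model) cross the network rather than full parameter sets, this style of objective is what lets clients with different architectures learn from one another cheaply.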

We also provide implementations of isolated and clustered federated learning as baselines for comparison. All three algorithms are evaluated on the AG News and CIFAR-10 datasets using the CharCNN and VGG family architectures, respectively.
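A common way to simulate the non-IID, imbalanced client distributions described above is a Dirichlet split over class labels. The helper below is an illustrative sketch (not code from this repository); smaller `alpha` values produce more skewed per-client class mixtures.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha=0.5, seed=0):
    """Assign sample indices to clients with per-class proportions ~ Dir(alpha)."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        idx = rng.permutation(np.where(labels == cls)[0])
        # Draw how this class's samples are shared out across clients.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        splits = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in enumerate(np.split(idx, splits)):
            client_indices[client].extend(part.tolist())
    return client_indices
```

For example, partitioning CIFAR-10-style labels with `alpha=0.1` yields clients that each see only a few dominant classes, which is the regime where federated baselines degrade and distillation-based sharing is most useful.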
