FedXplore - Framework for Federated Learning Attacks, Defences, Client Selection and Personalization

Table of contents

  1. Quickstart -- Follow the instructions and get the result!
  2. Attacks and Defences -- Deep dive into Byzantine-Robust Federated Learning
  3. Personalization -- Deep dive into Personalized Federated Learning
  4. Client Selection -- Deep dive into Client Selection Strategies
  5. Byzantine Robustness and Client Selection -- See how the framework's modules combine flexibly
  6. C4 Notation -- Context, Container, Component, Code diagrams
  7. Federated Methods Explained -- Learn the basics and write your own method
  8. Attacks -- Learn the basics and write a custom attack

🚀 Quickstart Guide

📋 Prerequisites

python -m venv venv
source venv/bin/activate
pip install -e .

⚙️ Experiment Setups

See the allowed configuration options in config.md

python src/train.py \
  training_params.batch_size=32 \
  federated_params.print_client_metrics=False \
  training_params.device_ids=[0] \
  > fedavg_cifar.txt

On the first run, downloading CIFAR-10 takes some time.

device_ids selects which GPU(s) to use when the machine has several. You can specify multiple ids; training is then distributed evenly across the listed devices.

Additionally, manager.batch_size client processes are created. To forcefully terminate training, kill any one of these processes.
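The run above uses the default federated method (FedAvg, as the output filename suggests). For intuition about what the server does each round, here is a minimal NumPy sketch of FedAvg's weighted averaging; the function and variable names are hypothetical illustrations, not FedXplore's API.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """Weighted average of client parameters (FedAvg sketch).

    client_weights: list of dicts {param_name: np.ndarray}
    client_sizes:   number of local samples per client
    """
    total = sum(client_sizes)
    agg = {}
    for name in client_weights[0]:
        # each client contributes proportionally to its local dataset size
        agg[name] = sum(
            w[name] * (n / total) for w, n in zip(client_weights, client_sizes)
        )
    return agg

# two toy clients sharing a single 2-parameter layer
clients = [{"w": np.array([1.0, 2.0])}, {"w": np.array([3.0, 4.0])}]
print(fedavg_aggregate(clients, [1, 3])["w"])  # -> [2.5 3.5]
```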

🌪️ Dirichlet Partition with $\alpha=0.1$ (strong heterogeneity) and FedCor client strategy

python src/train.py \
  training_params.batch_size=32 \
  federated_params.print_client_metrics=False \
  distribution.alpha=0.1 \
  federated_params.amount_of_clients=100 \
  client_selector=fedcor \
  > fedavg_fedcor_cifar10_dirichlet_alpha0.1.txt
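For intuition about what distribution.alpha controls, below is a self-contained sketch of Dirichlet label partitioning (illustrative only, not FedXplore's implementation): each class's samples are split across clients with proportions drawn from a Dirichlet distribution, so a small alpha such as 0.1 produces strongly skewed per-client label distributions.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha, seed=0):
    """Split sample indices across clients with per-class Dirichlet proportions."""
    rng = np.random.default_rng(seed)
    clients = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # proportions of class c assigned to each client
        props = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for client, part in zip(clients, np.split(idx, cuts)):
            client.extend(part.tolist())
    return clients

labels = np.array([0] * 50 + [1] * 50)   # toy 2-class dataset
parts = dirichlet_partition(labels, num_clients=5, alpha=0.1)
assert sum(len(p) for p in parts) == len(labels)  # every sample assigned once
```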

🦠 FLTrust with Label Flipping Attack on PTB-XL dataset

python src/train.py \
  federated_method=fltrust \
  dataset@train_dataset=ptbxl \
  dataset@test_dataset=ptbxl \
  dataset@trust_dataset=ptbxl \
  model_trainer=ptbxl \
  distribution=uniform \
  model=resnet1d18 \
  training_params.batch_size=32 \
  federated_params.print_client_metrics=False \
  federated_params.clients_attack_types=label_flip \
  federated_params.prop_attack_clients=0.5 \
  federated_params.attack_scheme=constant \
  federated_params.prop_attack_rounds=1.0 \
  > fltrust_ptbxl_label_flip_half_byzantines.txt

On the first run, downloading PTB-XL takes some time.
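For background on the defence: FLTrust scores each client update by its ReLU-clipped cosine similarity to a reference update the server computes on the trust dataset, rescales client updates to the reference norm, and averages them with those trust scores. A minimal NumPy sketch with flattened update vectors (illustrative only; names are not FedXplore's API):

```python
import numpy as np

def fltrust_aggregate(server_update, client_updates):
    """FLTrust-style robust aggregation (sketch)."""
    s = server_update
    s_norm = np.linalg.norm(s)
    scores, rescaled = [], []
    for g in client_updates:
        cos = s @ g / (s_norm * np.linalg.norm(g) + 1e-12)
        scores.append(max(cos, 0.0))  # ReLU: opposing updates get zero trust
        # rescale each client update to the server update's norm
        rescaled.append(g * (s_norm / (np.linalg.norm(g) + 1e-12)))
    total = sum(scores) + 1e-12
    return sum(sc * g for sc, g in zip(scores, rescaled)) / total

honest = np.array([1.0, 0.0])
byzantine = np.array([-5.0, 0.0])  # e.g. the result of label flipping
agg = fltrust_aggregate(np.array([1.0, 0.0]), [honest, byzantine])
# the byzantine update has negative cosine similarity, so it is excluded
```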

🧑‍🤝‍🧑 FedAMP with 10 clusters on CIFAR-10 dataset

python src/train.py \
  federated_method=fedamp \
  federated_method.strategy=sharded \
  federated_method.cluster_params=[10,0.5] \
  federated_params.amount_of_clients=100 \
  federated_params.client_subset_size=100 \
  training_params.batch_size=32 \
  > fedamp_10_clusters_cifar10.txt
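For intuition about the personalization step: FedAMP performs attentive message passing, so each client's personalized cloud model is a similarity-weighted combination of all client models, and similar clients pull toward each other. Below is a heavily simplified NumPy sketch over flattened parameter vectors; the attention function and the hyperparameters alpha and sigma are illustrative assumptions, not FedXplore's configuration.

```python
import numpy as np

def fedamp_personalized_models(client_models, alpha=0.1, sigma=1.0):
    """Attentive message passing (FedAMP sketch).

    Attention weight between clients i and j decays with the squared
    distance between their models: alpha * exp(-d / sigma) / sigma.
    """
    W = np.stack(client_models)               # (num_clients, dim)
    n = len(W)
    xi = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                d = np.sum((W[i] - W[j]) ** 2)
                xi[i, j] = alpha * np.exp(-d / sigma) / sigma
        xi[i, i] = 1.0 - xi[i].sum()          # self-weight keeps each row summing to 1
    return xi @ W                             # one personalized model per client

models = [np.array([0.0, 0.0]), np.array([0.1, 0.0]), np.array([5.0, 5.0])]
U = fedamp_personalized_models(models)
# similar clients (0 and 1) are pulled together; the outlier stays apart
```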
