FedXplore - Framework for Federated Learning Attacks, Defences, Client Selection and Personalization
- Quickstart -- Follow the instructions and get your first results!
- Attacks and Defences -- Deep dive into Byzantine-Robust Federated Learning
- Personalization -- Deep dive into Personalized Federated Learning
- Client Selection -- Deep dive into Client Selection Strategies
- Byzantine Robustness and Client Selection -- See the framework's flexibility through modular interaction
- C4 notation -- Context, Container, Component, Code scheme
- Federated Methods Explained -- Learn the basics and write your own method
- Attacks -- Learn the basics and write a custom attack
```bash
python -m venv venv
source venv/bin/activate
pip install -e .
```

See the allowed configuration options in `config.md`.
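Any of these options can be overridden directly on the command line with `key=value` syntax (the `dataset@train_dataset=...` form used later suggests Hydra-style overrides). A minimal sketch, reusing `training_params.batch_size` since it appears in the examples in this document:

```bash
# Sketch: override a single config option from the command line.
# training_params.batch_size is taken from the examples below;
# consult config.md for the full list of allowed options.
python src/train.py training_params.batch_size=16
```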
Federated Averaging on CIFAR-10
```bash
python src/train.py \
    training_params.batch_size=32 \
    federated_params.print_client_metrics=False \
    training_params.device_ids=[0] \
    > fedavg_cifar.txt
```

At the first run, downloading CIFAR-10 takes some time.
`device_ids` selects which GPU to use (if there are several GPUs on the machine). You can specify multiple ids; training will then be distributed evenly across the specified devices.
Additionally, `manager.batch_size` client processes will be created. To forcefully terminate the training, kill any one of these processes.
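For example, a multi-GPU run might look like the following. This is a sketch that assumes the machine exposes GPUs 0 and 1; the other flags mirror the single-GPU command above, and the log file name is arbitrary:

```bash
# Hypothetical two-GPU run: client training is distributed evenly
# across devices 0 and 1 (assuming both GPUs exist on this machine).
python src/train.py \
    training_params.batch_size=32 \
    federated_params.print_client_metrics=False \
    training_params.device_ids=[0,1] \
    > fedavg_cifar_2gpu.txt
```

If the run must be stopped early, killing any one of the spawned client processes (e.g. with `kill <pid>`) terminates the whole training.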
Dirichlet partition with $\alpha=0.1$ (strong heterogeneity) and the FedCor client selection strategy
```bash
python src/train.py \
    training_params.batch_size=32 \
    federated_params.print_client_metrics=False \
    distribution.alpha=0.1 \
    federated_params.amount_of_clients=100 \
    client_selector=fedcor \
    > fedavg_fedcor_cifar10_dirichlet_alpha0.1.txt
```

FLTrust on PTB-XL under a label-flip attack (half of the clients are Byzantine)

```bash
python src/train.py \
    federated_method=fltrust \
    dataset@train_dataset=ptbxl \
    dataset@test_dataset=ptbxl \
    dataset@trust_dataset=ptbxl \
    model_trainer=ptbxl \
    distribution=uniform \
    model=resnet1d18 \
    training_params.batch_size=32 \
    federated_params.print_client_metrics=False \
    federated_params.clients_attack_types=label_flip \
    federated_params.prop_attack_clients=0.5 \
    federated_params.attack_scheme=constant \
    federated_params.prop_attack_rounds=1.0 \
    > fltrust_ptbxl_label_flip_half_byzantines.txt
```

At the first run, downloading PTB-XL takes some time.
FedAMP with 10 sharded clusters on CIFAR-10

```bash
python src/train.py \
    federated_method=fedamp \
    federated_method.strategy=sharded \
    federated_method.cluster_params=[10,0.5] \
    federated_params.amount_of_clients=100 \
    federated_params.client_subset_size=100 \
    training_params.batch_size=32 \
    > fedamp_10_clusters_cifar10.txt
```
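Judging by the log file name, the first element of `cluster_params` appears to set the number of clusters. A sketch of a 5-cluster variant, under the assumption that the second element keeps its meaning from the run above:

```bash
# Hypothetical 5-cluster FedAMP run; [5,0.5] assumes cluster_params is
# [num_clusters, <second parameter>] with the second value unchanged.
python src/train.py \
    federated_method=fedamp \
    federated_method.strategy=sharded \
    federated_method.cluster_params=[5,0.5] \
    federated_params.amount_of_clients=100 \
    federated_params.client_subset_size=100 \
    training_params.batch_size=32 \
    > fedamp_5_clusters_cifar10.txt
```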