FedGAC is designed to tackle the challenges of statistical heterogeneity in federated learning. It introduces a personalized FL method that reduces computational and communication overhead by focusing on Critical Learning Periods (CLP) for client participation. The Adaptive Initialization of Local Models (AILM) module enhances global model generalization, while dynamic training parameter adjustment ensures efficiency. Additionally, FedGAC employs a compression method to minimize communication costs. Extensive experiments demonstrate that FedGAC achieves superior accuracy and communication efficiency compared to state-of-the-art methods.
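One round of the workflow described above can be sketched at a high level as follows. This is an illustrative sketch only: the AILM interpolation rule, the coefficient `alpha`, and the FedAvg-style weighted aggregation are assumptions for exposition, not the exact method implemented in this repository.

```python
def ailm_init(global_w, local_w, alpha=0.5):
    """AILM-style local-model initialization, sketched here as a convex
    interpolation between the global model and the client's previous local
    model (the interpolation rule and alpha are illustrative assumptions)."""
    return {k: alpha * global_w[k] + (1 - alpha) * local_w[k] for k in global_w}

def local_train(w, grad_fn, lr=0.1, steps=50):
    """Plain SGD on the client's local objective (stand-in for real training)."""
    for _ in range(steps):
        g = grad_fn(w)
        w = {k: w[k] - lr * g[k] for k in w}
    return w

def aggregate(updates, sizes):
    """FedAvg-style aggregation: average client models weighted by data size."""
    total = sum(sizes)
    return {k: sum(s * u[k] for u, s in zip(updates, sizes)) / total
            for k in updates[0]}

# Toy example: two clients with quadratic objectives (w - t)^2, so the
# aggregated model should land near the weighted mean of the two targets.
global_w = {"w": 0.0}
clients = [({"w": 0.0}, 1.0, 10), ({"w": 0.0}, 3.0, 10)]
updates, sizes = [], []
for local_w, target, size in clients:
    w0 = ailm_init(global_w, local_w)
    grad_fn = lambda w, t=target: {"w": 2.0 * (w["w"] - t)}
    updates.append(local_train(w0, grad_fn))
    sizes.append(size)
global_w = aggregate(updates, sizes)
```

With equal client weights, the aggregated parameter settles near 2.0, the mean of the two client optima.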
- Requirements: Ubuntu 20.04, Python 3.5+, PyTorch, and a CUDA environment
- "./FedGAC_main.py" contains the configurations and the basic federated learning framework
- "./Sims.py" implements the simulators for the clients and the central server
- "./Utils.py" contains helper functions, including those for loading the training and testing data
- "./Settings.py" declares the required packages and settings
- "./AILM.py" implements the AILM algorithm
- Folder "./data folder" contains the data for experiments
- Folder "./Models" includes the code for AlexNet, VGG-11, ResNet-18, LSTM, and CNN
- Folder "./Optim" includes implementations of FedProx, VRL-SGD, and FedNova
- Folder "./Comp_FIM" is the library used to compute the Fisher Information Matrix (FIM)
- Run "./FedGAC_main.py" to reproduce the results, or "./run_FedGAC.sh" to run the experiments
- The results will be saved in "./log" folder
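Among the baselines in "./Optim", FedProx is defined by adding a proximal term (mu/2) * ||w - w_global||^2 to each client's local objective, which pulls local updates toward the global model. The sketch below illustrates that idea only; the toy scalar objective and hyperparameters are assumptions, not this repository's code.

```python
def fedprox_grad(w, w_global, base_grad, mu):
    """Gradient of the FedProx local objective
    f_k(w) + (mu/2) * ||w - w_global||^2,
    i.e. the base gradient plus mu * (w - w_global)."""
    return {k: base_grad[k] + mu * (w[k] - w_global[k]) for k in w}

# Toy check: local objective (w - 3)^2 with w_global = 0 and mu = 2.
# The stationary point solves 2(w - 3) + 2w = 0, i.e. w = 1.5, so the
# proximal term pulls the local solution toward the global model.
w, w_global = {"w": 0.0}, {"w": 0.0}
for _ in range(100):
    base = {"w": 2.0 * (w["w"] - 3.0)}
    g = fedprox_grad(w, w_global, base, mu=2.0)
    w = {k: w[k] - 0.1 * g[k] for k in w}
```

Without the proximal term the local solution would drift to 3.0; with mu = 2 it converges to 1.5 instead.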
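Regarding the "./Comp_FIM" entry: a common diagonal approximation of the empirical Fisher Information Matrix averages the element-wise squared per-sample gradients of the log-likelihood. The sketch below shows that estimator in general; the actual Comp_FIM API may compute the FIM differently.

```python
def empirical_fisher_diag(per_sample_grads):
    """Diagonal of the empirical FIM: the mean of the element-wise squared
    per-sample log-likelihood gradients."""
    n = len(per_sample_grads)
    return {k: sum(g[k] ** 2 for g in per_sample_grads) / n
            for k in per_sample_grads[0]}

# Example: gradients 1.0 and 3.0 for parameter "w" give (1 + 9) / 2 = 5.0.
fim = empirical_fisher_diag([{"w": 1.0}, {"w": 3.0}])
```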