sidb70/DFL-Secure-Aggregation
DFL-Secure-Aggregation

Federated learning (FL) enables a collaborative environment for training machine learning models without sharing training data between users, typically by aggregating model gradients on a central server. Decentralized federated learning is a rising paradigm in which users collaboratively train machine learning models in a peer-to-peer manner, without a central aggregation server. Before decentralized FL can be applied in real-world use cases, however, nodes that deviate from the FL process (Byzantine nodes) must be taken into account. Recent research has focused on Byzantine robustness for client-server or fully connected network topologies, ignoring the more complex network configurations possible with decentralized FL. Empirical evidence of Byzantine robustness in decentralized FL networks is therefore needed.

We investigate the effects of state-of-the-art Byzantine-robust aggregation methods in complex, large-scale network structures. Our findings show that these aggregation strategies are not resilient to Byzantine agents embedded within large networks that are not fully connected.
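To illustrate the kind of aggregation rule under study (this is a generic sketch, not this repository's implementation), the following compares plain gradient averaging against coordinate-wise median, one of the standard Byzantine-robust aggregation rules. A single Byzantine update can shift the mean arbitrarily, while the median is bounded by the honest updates:

```python
import numpy as np

def mean_aggregate(updates: np.ndarray) -> np.ndarray:
    # Plain averaging: one Byzantine update can move the result arbitrarily far.
    return np.mean(updates, axis=0)

def median_aggregate(updates: np.ndarray) -> np.ndarray:
    # Coordinate-wise median: a common Byzantine-robust aggregation rule.
    return np.median(updates, axis=0)

# Hypothetical example: three honest gradient updates near [1, 1] and
# one Byzantine update trying to poison the aggregate.
honest = [np.array([1.0, 1.0]), np.array([1.1, 0.9]), np.array([0.9, 1.1])]
byzantine = [np.array([100.0, -100.0])]
updates = np.stack(honest + byzantine)

print("mean:  ", mean_aggregate(updates))    # dragged far from the honest updates
print("median:", median_aggregate(updates))  # stays close to the honest updates
```

The repository's experiments concern how such rules behave when the Byzantine node sits inside a large, non-fully-connected topology, where each honest node only sees its neighbors' updates rather than the full set.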
