Website | Blog | Docs | Conference | Slack
Flower (flwr) is a framework for building federated learning systems. The design of Flower is based on a few guiding principles:
- Customizable: Federated learning systems vary wildly from one use case to another. Flower allows for a wide range of different configurations depending on the needs of each individual use case.
- Extendable: Flower originated from a research project at the University of Oxford, so it was built with AI research in mind. Many components can be extended and overridden to build new state-of-the-art systems.
- Framework-agnostic: Different machine learning frameworks have different strengths. Flower can be used with any machine learning framework, for example, PyTorch, TensorFlow, Hugging Face Transformers, PyTorch Lightning, MXNet, scikit-learn, JAX, TFLite, or even raw NumPy for users who enjoy computing gradients by hand.
- Understandable: Flower is written with maintainability in mind. The community is encouraged to both read and contribute to the codebase.
Meet the Flower community on flower.dev!
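To give a flavour of the API, here is a minimal sketch of a Flower client written against raw NumPy, following the pattern used in the quickstarts. The class name, the toy parameter vector, and the placeholder training and evaluation steps are illustrative only, and exact method signatures may differ slightly between Flower releases:

```python
import flwr as fl
import numpy as np


# Illustrative client: the "model" is a single NumPy vector, and the
# training/evaluation steps are placeholders for real ML framework code.
class NumPyExampleClient(fl.client.NumPyClient):
    def __init__(self):
        self.weights = np.zeros(10)

    def get_parameters(self):
        # Return the current local parameters as a list of NumPy ndarrays
        return [self.weights]

    def fit(self, parameters, config):
        # Receive the global parameters, run local training, and return the
        # updated parameters, the number of local examples, and a metrics dict
        self.weights = parameters[0]  # placeholder for a real training step
        return [self.weights], 1, {}

    def evaluate(self, parameters, config):
        # Evaluate the received global parameters on local data
        loss = float(np.sum(parameters[0] ** 2))  # placeholder loss
        return loss, 1, {}


# Connect this client to a running Flower server
fl.client.start_numpy_client(server_address="[::]:8080", client=NumPyExampleClient())
```

On the server side, federated averaging across connected clients is started with `fl.server.start_server(...)`; a sketch of configuring the server-side strategy appears after the tutorials list below.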
Flower's goal is to make federated learning accessible to everyone. This series of tutorials introduces the fundamentals of federated learning and how to implement them in Flower.
- An Introduction to Federated Learning (or open the Jupyter Notebook)
- Using Strategies in Federated Learning (or open the Jupyter Notebook)
Stay tuned: more tutorials are coming soon. Topics include Building Strategies for Federated Learning, Privacy and Security in Federated Learning, and Scaling Federated Learning.
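As a companion to the strategies tutorial above, the following sketch shows how a server-side strategy such as FedAvg can be configured and passed to the Flower server. The parameter names follow the quickstart documentation, but argument names and the config format may vary between Flower releases:

```python
import flwr as fl

# Configure federated averaging: sample a fraction of clients each round
# and wait until enough clients are connected before starting
strategy = fl.server.strategy.FedAvg(
    fraction_fit=0.1,          # sample 10% of available clients for training
    min_fit_clients=10,        # never train on fewer than 10 clients
    min_available_clients=80,  # wait until at least 80 clients are connected
)

# Start the server and run three federated learning rounds
fl.server.start_server(strategy=strategy, config={"num_rounds": 3})
```

Swapping in a different strategy (for example, one implementing adaptive server-side optimization) only changes the object passed as `strategy`; the clients stay unchanged.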
- Installation
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face [code example])
- Quickstart (PyTorch Lightning [code example])
- Quickstart (MXNet)
- Quickstart (JAX)
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android [code example])
Flower Baselines is a collection of community-contributed experiments that reproduce the experiments performed in popular federated learning publications. Researchers can build on Flower Baselines to quickly evaluate new ideas:
- FedBN: Federated Learning on Non-IID Features via Local Batch Normalization
- Adaptive Federated Optimization
Check the Flower documentation to learn more: Using Baselines
The Flower community loves contributions! Make your work more visible and enable others to build on it by contributing it as a baseline: Contributing Baselines
Several code examples show different usage scenarios of Flower (in combination with popular machine learning frameworks such as PyTorch or TensorFlow).
Quickstart examples:
- Quickstart (TensorFlow)
- Quickstart (PyTorch)
- Quickstart (Hugging Face)
- Quickstart (PyTorch Lightning)
- Quickstart (MXNet)
- Quickstart (JAX)
- Quickstart (scikit-learn)
- Quickstart (TFLite on Android)
Other examples:
- Raspberry Pi & Nvidia Jetson Tutorial
- Android & TFLite
- PyTorch: From Centralized to Federated
- MXNet: From Centralized to Federated
- Advanced Flower with TensorFlow/Keras
- Advanced Flower with PyTorch
- Single-Machine Simulation of Federated Learning Systems (PyTorch) (TensorFlow)
Flower is built by a wonderful community of researchers and engineers. Join Slack to meet them; contributions are welcome.
If you publish work that uses Flower, please cite Flower as follows:
@article{beutel2020flower,
  title={Flower: A Friendly Federated Learning Research Framework},
  author={Beutel, Daniel J and Topal, Taner and Mathur, Akhil and Qiu, Xinchi and Parcollet, Titouan and Lane, Nicholas D},
  journal={arXiv preprint arXiv:2007.14390},
  year={2020}
}
Please also consider adding your publication to the list of Flower-based publications in the docs; to do so, just open a Pull Request.
We welcome contributions. Please see CONTRIBUTING.md to get started!