BlossomTune is an open-source orchestrator designed to simplify and democratize federated learning.
It provides an intuitive web interface for managing participants, launching experiments, and securely distributing credentials, making complex federated learning setups accessible to everyone.
Why Support BlossomTune?
- Federated learning is powerful, but setting up a secure, multi-party collaboration is notoriously difficult and error-prone.
- Participants often struggle with complex command-line tools, certificate management, and environment setup.
- BlossomTune solves this by providing a complete, user-friendly solution right out of the box.
Our project features:
- Intuitive Gradio UI: An easy-to-use web dashboard for administrators to approve participants and manage federated runs.
- Seamless Onboarding: A simple "Join Federation" portal for participants to request access and receive their credentials.
- Enhanced Security: Built-in support for TLS encryption and participant authentication to ensure a secure federation.
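BlossomTune handles certificate generation and distribution for you. For reference, here is a minimal sketch of the kind of TLS-enabled server setup it automates, using Flower's legacy `start_server` API; the certificate paths are hypothetical:

```python
# Minimal sketch of a TLS-enabled Flower server (legacy flwr API).
# Certificate paths are hypothetical; BlossomTune generates and
# distributes these credentials per participant.
from pathlib import Path

import flwr as fl

# (CA certificate, server certificate, server private key) as bytes.
certificates = (
    Path("certificates/ca.crt").read_bytes(),
    Path("certificates/server.pem").read_bytes(),
    Path("certificates/server.key").read_bytes(),
)

fl.server.start_server(
    server_address="0.0.0.0:8443",
    config=fl.server.ServerConfig(num_rounds=3),
    strategy=fl.server.strategy.FedAvg(),
    certificates=certificates,  # enables TLS for all client connections
)
```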
By supporting BlossomTune, you are investing in the open-source infrastructure needed to make federated learning practical for a wider audience of developers, researchers, and organizations.
Your contribution will directly fund new features, better documentation, and the growth of a vibrant community around accessible and secure AI/ML.
The BlossomTune Family
BlossomTune is an ecosystem of open-source tools designed to advance and simplify federated learning.
The project is divided into the core orchestrator and specialized Flower Apps for cutting-edge research.
Orchestrator
- BlossomTune-Orchestrator: This is the core project, providing a complete orchestration and management server for federated learning networks. It features a Gradio UI for participant onboarding, an admin panel, and the backend for running the Flower SuperLink. https://github.com/ethicalabs-ai/BlossomTune-Orchestrator
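To give a flavor of the admin workflow, here is a stripped-down, hypothetical Gradio sketch of a participant-approval panel; the data store and function names are invented for illustration, and the real UI lives in the repository above:

```python
# Hypothetical sketch of a participant-approval panel in Gradio.
# The in-memory dict stands in for the orchestrator's real database.
import gradio as gr

pending = {"alice@example.org": "requested", "bob@example.org": "requested"}

def list_pending():
    return [[email, status] for email, status in pending.items()]

def approve(email: str):
    # Mark a pending participant as approved and refresh the table.
    if email in pending:
        pending[email] = "approved"
    return list_pending()

with gr.Blocks(title="Federation Admin (sketch)") as demo:
    table = gr.Dataframe(headers=["participant", "status"], value=list_pending())
    email_box = gr.Textbox(label="Participant email")
    approve_btn = gr.Button("Approve")
    approve_btn.click(approve, inputs=email_box, outputs=table)

demo.launch()
```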
Flower Apps
These applications are built upon the FlowerTune LLM templates from the paper "FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models" (https://arxiv.org/abs/2506.02961), presented at the NeurIPS 2025 conference. They showcase federated fine-tuning of large language models on specialized tasks; a minimal client sketch follows the list below.
- BlossomTuneLLM: A Flower App for the federated fine-tuning of transformers-based Large Language Models. https://github.com/ethicalabs-ai/BlossomTuneLLM
- BlossomTuneLLM-MLX: A specialized Flower App for the federated fine-tuning of Apple's MLX-LM models. https://github.com/ethicalabs-ai/BlossomTuneLLM-MLX
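As promised above, here is a minimal skeleton of a Flower App client, assuming Flower's app API (roughly 1.8+). The parameters are placeholder NumPy arrays rather than a real model; the apps above plug actual LLM fine-tuning into the same fit/evaluate structure:

```python
# Minimal Flower App client skeleton (Flower >= 1.8 API assumed).
# Parameters are placeholder NumPy arrays, not a real LLM.
import numpy as np
from flwr.client import ClientApp, NumPyClient
from flwr.common import Context

class SketchClient(NumPyClient):
    def fit(self, parameters, config):
        # Local "training": a real app would fine-tune the LLM here.
        updated = [p + 0.1 for p in parameters]
        num_examples = 1  # placeholder size of the local dataset
        return updated, num_examples, {}

    def evaluate(self, parameters, config):
        # Local evaluation: return (loss, num_examples, metrics).
        loss = float(np.sum(np.abs(parameters[0])))
        return loss, 1, {}

def client_fn(context: Context):
    # One client instance per participant node.
    return SketchClient().to_client()

app = ClientApp(client_fn=client_fn)
```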
Citations
@misc{gao-2025,
  author = {Gao, Yan and Scamarcia, Massimo Roberto and Fernandez-Marques, Javier and Naseri, Mohammad and Ng, Chong Shen and Stripelis, Dimitris and Li, Zexi and Shen, Tao and Bai, Jiamu and Chen, Daoyuan and Zhang, Zikai and Hu, Rui and Song, InSeo and KangYoon, Lee and Jia, Hong and Dang, Ting and Wang, Junyan and Liu, Zheyuan and Beutel, Daniel Janes and Lyu, Lingjuan and Lane, Nicholas D.},
  title  = {{FlowerTune: A Cross-Domain Benchmark for Federated Fine-Tuning of Large Language Models}},
  year   = {2025},
  month  = {6},
  url    = {https://arxiv.org/abs/2506.02961},
}