Most current machine learning methods rely on centralized model training, where training data is collected and processed in data centers. In many practical scenarios, however, centralizing data is difficult due to security and privacy concerns, which has driven the adoption of federated learning (FL). FL enables distributed model training without data leaving devices or data silos, addressing these privacy issues, but it inherently introduces inefficiencies. These inefficiencies result in significantly higher computational requirements and energy consumption than centralized training, often leading to increased CO2 emissions, and motivate research into sustainable FL.
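As a rough illustration of where these extra costs come from, the sketch below shows a minimal federated averaging loop: in every communication round, each client repeats local training on its own data and the server only aggregates the resulting model weights. The client datasets, model, and hyperparameters here are hypothetical placeholders, not a specific system used in this project.

```python
# Minimal federated averaging sketch (hypothetical setup): each round, every
# client trains locally on its own data, and the server averages the resulting
# weights. The repeated per-client training is a main source of FL's extra compute.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical private client datasets (features X, targets y) that never leave the clients.
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(5)]

def local_training(weights, X, y, lr=0.01, epochs=5):
    """Plain gradient descent on a linear model, run on one client's data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

global_weights = np.zeros(3)
for _ in range(10):                               # communication rounds
    local_models = [
        local_training(global_weights, X, y)      # training repeated on every client
        for X, y in clients
    ]
    # Server aggregates by averaging; only model weights are exchanged, never data.
    global_weights = np.mean(local_models, axis=0)

print("Aggregated global weights:", global_weights)
```

Standard FedAvg weights the average by each client's dataset size; the unweighted mean above is used only to keep the sketch short.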
