Name:
PowerGossip
Description:
Practical low-rank communication compression in decentralized deep learning
Professor — Lab:
Martin Jaggi — Machine Learning and Optimization Laboratory

Technical description:
Inspired by the PowerSGD algorithm for centralized deep learning, PowerGossip uses power iteration steps to compress the model differences exchanged between neighboring workers, maximizing the information transferred per bit. The method requires no additional hyperparameters, provably converges faster than prior methods, and its convergence is asymptotically independent of both the network topology and the compression.
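To illustrate the core idea, here is a minimal sketch of PowerSGD-style low-rank compression via a single power iteration step, applied to the difference between two workers' parameter matrices. The function name, shapes, and rank are illustrative assumptions, not the project's actual API:

```python
import numpy as np

def power_iteration_compress(M, rank=4, q=None, rng=None):
    """Approximate matrix M by a low-rank product p @ q.T using one
    power iteration step (PowerSGD-style sketch, not the project's API).
    Only p and q -- (m + n) * rank numbers in total -- need to be
    communicated instead of the full m * n matrix."""
    rng = np.random.default_rng(0) if rng is None else rng
    m, n = M.shape
    # Right factor q, warm-started from a previous round or random
    if q is None:
        q = rng.standard_normal((n, rank))
    # One power iteration step: p = M q, orthonormalize p, then q = M^T p
    p = M @ q
    p, _ = np.linalg.qr(p)   # orthonormalize columns of p
    q = M.T @ p              # updated right factor
    return p, q              # low-rank message: M is approximated by p @ q.T

# Toy usage: compress the difference between two neighbors' parameters
rng = np.random.default_rng(42)
x_a = rng.standard_normal((64, 32))
x_b = rng.standard_normal((64, 32))
diff = x_a - x_b
p, q = power_iteration_compress(diff, rank=4)
approx = p @ q.T
print(approx.shape)  # (64, 32)
```

Because p is orthonormalized, the reconstruction p @ q.T is the projection of the difference onto the subspace spanned by p, so its error never exceeds the norm of the original matrix; warm-starting q across rounds lets repeated single steps track the dominant directions cheaply.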
Papers:
Project status:
inactive — entered showcase: 2021-03-05 — entry updated: 2024-04-09

Source code:
Lab GitHub — last commit: 2020-08-04
Code quality:
This project has not yet been evaluated by the C4DT Factory team. We will be happy to evaluate it upon request.
Project type:
Library
Programming language:
Python
License:
MIT