Gradient Compression

Gradient compression is a family of techniques used in distributed machine learning to reduce the communication bandwidth required to exchange gradients between workers during large-scale training.
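One common approach is top-k sparsification: each worker transmits only the k largest-magnitude gradient entries (as index/value pairs) instead of the full dense tensor. The sketch below, using NumPy, is an illustrative example with hypothetical function names, not a reference implementation from any particular library.

```python
import numpy as np

def compress_topk(grad, k):
    """Keep only the k largest-magnitude entries of a gradient tensor.

    Returns the flat indices and values to transmit; everything else
    is implicitly zero on the receiving side.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # flat indices of top-k magnitudes
    return idx, flat[idx]

def decompress_topk(idx, values, shape):
    """Rebuild a dense gradient: zeros everywhere except the top-k slots."""
    flat = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    flat[idx] = values
    return flat.reshape(shape)

# Toy example: transmit 2 of 6 entries (a 3x bandwidth reduction,
# ignoring index overhead).
grad = np.array([[0.1, -2.0, 0.03],
                 [1.5, -0.02, 0.4]])
idx, vals = compress_topk(grad, k=2)
restored = decompress_topk(idx, vals, grad.shape)
```

In practice, top-k methods are usually paired with error feedback (accumulating the dropped residual into the next step's gradient) so that the discarded information is not lost permanently.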