King Abdullah University of Science and Technology
Scientific Computing and Machine Learning (SCML)

On the resolution of a theoretical question related to the nature of local training in federated learning

Peter Richtárik, Professor, Computer Science
Sep 13, 15:30 - 17:00

Building 1, Level 3, Room 3119 (B1 L3 R3119)

Tags: machine learning, mathematical optimization, communications, algorithms

We study distributed optimization methods based on the local training (LT) paradigm, which achieves improved communication efficiency by performing richer local gradient-based training on the clients before parameter averaging, and which is of key importance in federated learning. Looking back at the progress of the field over the last decade, we identify five generations of LT methods: 1) heuristic, 2) homogeneous, 3) sublinear, 4) linear, and 5) accelerated. The fifth generation, initiated by the ProxSkip method of Mishchenko et al. (2022) and its analysis, is characterized by the first theoretical confirmation that LT is a communication acceleration mechanism. In this talk, I will explain the problem, its solution, and some subsequent work generalizing, extending, and improving the ProxSkip method in various ways.
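To make the LT paradigm concrete, below is a minimal sketch of ProxSkip in its federated (consensus) form, the instantiation the paper calls Scaffnew: each client takes control-variate-corrected local gradient steps, and parameter averaging (communication) happens only with probability p per iteration. The quadratic objectives, dimensions, stepsize, and probability below are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (hypothetical, for illustration only): n clients, each
# holding a local least-squares objective f_i(x) = 0.5 * ||A_i x - b_i||^2.
n_clients, dim = 10, 5
A = [rng.standard_normal((20, dim)) for _ in range(n_clients)]
b = [rng.standard_normal(20) for _ in range(n_clients)]

def grad(i, x):
    # Gradient of client i's local objective at x.
    return A[i].T @ (A[i] @ x - b[i])

# ProxSkip / Scaffnew: each client keeps a model copy x_i and a control
# variate h_i. Communication happens only with probability p per step.
gamma, p, n_steps = 0.01, 0.2, 2000
x = [np.zeros(dim) for _ in range(n_clients)]
h = [np.zeros(dim) for _ in range(n_clients)]

for _ in range(n_steps):
    # Local gradient step, shifted by the control variate h_i.
    x_hat = [x[i] - gamma * (grad(i, x[i]) - h[i]) for i in range(n_clients)]
    if rng.random() < p:
        # Communication round. With h initialized to zero, sum_i h_i stays
        # zero, so the consensus prox reduces to plain model averaging.
        avg = sum(x_hat) / n_clients
        x_next = [avg.copy() for _ in range(n_clients)]
    else:
        # Skip communication: keep purely local iterates.
        x_next = x_hat
    # Control-variate update: zero on skipped rounds; on communication
    # rounds it shifts h_i toward client i's gradient at the consensus point.
    h = [h[i] + (p / gamma) * (x_next[i] - x_hat[i]) for i in range(n_clients)]
    x = x_next

# The averaged model approaches the minimizer of (1/n) * sum_i f_i.
print(sum(x) / n_clients)
```

With h_i fixed at zero this would reduce to FedAvg-style local training; the control variates are what eliminate client drift, which is how the fifth generation obtains the communication acceleration the abstract refers to.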

