"Infinite Future" Academic Forum | Accelerating Federated Learning by Sparsifying Transmitted Model Updates

Posted by He Wanyuan on 2024-11-17

Date: 2024-11-28, 14:00-15:00

Venue: Purple Mountain Laboratories, Building B1, Room 722

Abstract:

Federated learning enables geographically dispersed clients to collaboratively train a machine learning model by exchanging model updates with a central server over the Internet. However, transmitting these updates between the server and numerous decentralized clients consumes considerable bandwidth. This talk presents our contributions to improving communication efficiency in federated learning, focusing on sparsifying the model updates that clients transmit to the parameter server. By carefully weighing the learning value of each individual model update against its communication cost, we suppress the transmission of low-value updates and thereby reduce communication costs. Extensive experiments on real datasets demonstrate that our algorithms significantly reduce communication costs compared with state-of-the-art federated learning baselines.
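To make the value-versus-cost idea concrete, here is a minimal sketch of how a client might sparsify its update before transmission. This is an illustration under stated assumptions, not the speaker's algorithm: it uses the absolute magnitude of each update entry as a stand-in for its learning value and a fixed per-entry threshold (`cost_per_entry`) as a stand-in for communication cost, both of which are hypothetical choices made here for clarity.

```python
# A minimal sketch of value-aware update sparsification in federated learning.
# NOT the talk's actual method: the magnitude-based value proxy and the fixed
# per-entry cost threshold are simplifying assumptions for illustration only.
import numpy as np


def sparsify_update(update: np.ndarray, cost_per_entry: float) -> np.ndarray:
    """Zero out entries whose estimated learning value does not justify
    their transmission cost; in practice only the surviving (index, value)
    pairs would be sent to the parameter server."""
    value = np.abs(update)            # proxy for each entry's learning value
    mask = value > cost_per_entry     # keep only entries worth their cost
    return np.where(mask, update, 0.0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    update = rng.normal(scale=0.05, size=10_000)  # one client's model update
    sparse = sparsify_update(update, cost_per_entry=0.08)
    kept = np.count_nonzero(sparse)
    print(f"transmitting {kept}/{update.size} entries "
          f"({100 * kept / update.size:.1f}% of the dense update)")
```

In a real system the kept entries would be encoded as sparse index-value pairs, so the bandwidth saved scales directly with the fraction of entries suppressed.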


Biography:

Dr. Yipeng Zhou is a senior lecturer with the School of Computing, Faculty of Science and Engineering, Macquarie University, Australia. Before joining Macquarie University, he was a research fellow with the University of South Australia and a lecturer with Shenzhen University. He received his Ph.D. and M.Phil. degrees from The Chinese University of Hong Kong and his B.S. degree from the University of Science and Technology of China. He received the 2023 Macquarie University Vice-Chancellor's Research Excellence Award for Innovative Technology and the 2023 IEEE Open Journal of the Communications Society Best Editor Award, and he was the recipient of a 2018 Australian Research Council Discovery Early Career Researcher Award (DECRA). His research interests include federated learning, data privacy preservation, and networking. He has published 100+ papers in top venues such as IEEE INFOCOM, ICML, IJCAI, IEEE ToN, JSAC, TPDS, and TMC.