Abstract: Federated learning allows edge devices to collaboratively train a global model by synchronizing their local updates without sharing private data. Yet, with limited network bandwidth at the edge, communication often becomes a sever...
| Authors | Chen Chen, Hong Xu, Wei Wang, Baochun Li, Bo Li, Li Chen, Gong Zhang |
|---|---|
| Affiliations | |
| Proceedings | 2021 IEEE 41st International Conference on Distributed Computing Systems |
| Publication year | 2021 |
| Conference | IEEE International Conference on Distributed Computing Systems |
| Pages | 1-11 (start page 1, 11 pages) |
| Location | Washington, US |
| Conference year / edition | 2021 / 41st |
| Keywords | Training; Perturbation methods; Conferences; Bandwidth; Collaborative work; Data transfer; Data models |
| Accession number | IEL32806 (9546401) |
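The abstract describes clients synchronizing local updates into a shared global model. A minimal federated-averaging (FedAvg-style) sketch of that training loop is below; it is purely illustrative (simple linear regression, synthetic data, all names invented here), not the method proposed in the paper, which targets the communication cost of exactly this synchronization step:

```python
# Illustrative FedAvg-style loop: each "edge device" trains locally on its
# private data, and only model weights (never data) are sent for averaging.
import numpy as np

def local_update(weights, data, targets, lr=0.1, steps=5):
    """One client's local training: a few gradient steps on linear regression."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * data.T @ (data @ w - targets) / len(targets)
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """Each client trains locally; the server averages the returned weights."""
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    return np.mean(local_ws, axis=0)  # synchronization = weight averaging

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):  # four edge devices, each holding private local data
    X = rng.normal(size=(20, 2))
    y = X @ true_w
    clients.append((X, y))

w = np.zeros(2)
for _ in range(30):  # communication rounds between devices and server
    w = federated_round(w, clients)
```

Each round costs one upload and one download of the full model per client, which is why, with limited edge bandwidth, communication becomes the bottleneck the abstract refers to.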