- Xiangchi Song ORCID: orcid.org/0009-0004-7212-0842,
- Zhaoyan Wang ORCID: orcid.org/0009-0006-3229-5060,
- KyeongDeok Baek ORCID: orcid.org/0000-0002-5887-5948 &
- In-Young Ko ORCID: orcid.org/0000-0002-3843-263X
Part of the book series: Communications in Computer and Information Science (CCIS, volume 2188)
Included in the following conference series: ICWE: International Conference on Web Engineering
Abstract
As the number of IoT devices and the volume of data grow, distributed computing systems have become the primary deployment solution for large-scale Internet of Things (IoT) environments. Federated Learning (FL) is a collaborative machine learning framework that allows models to be trained on data from all participants while protecting their privacy. However, traditional FL suffers from low computational and communication efficiency in large-scale hierarchical cloud-edge collaborative IoT systems. Moreover, owing to heterogeneity, not all IoT devices necessarily benefit from the global model of traditional FL; instead, they require a degree of personalization to be maintained during global training. We therefore extend FL into a horizontal peer-to-peer (P2P) structure and introduce our P2PFL framework: Efficient Peer-to-Peer Federated Learning for Users (EPFLU). EPFLU shifts the paradigm from vertical FL to a horizontal P2P structure from the user perspective and incorporates personalization enhancement techniques that use private information. Through horizontal consensus information aggregation and private information supplementation, EPFLU addresses the weakness of traditional FL, in which aggregation dilutes the characteristics of individual client data and leads to model deviation. This structural transformation also significantly alleviates the original communication issues. In addition, EPFLU provides a customized simulation and evaluation framework that makes it better suited to real-world large-scale IoT. Within this framework, we design extreme data distribution scenarios and conduct detailed experiments with EPFLU and selected baselines on the MNIST and CIFAR-10 datasets. The results demonstrate that the robust and adaptive EPFLU framework consistently converges to optimal performance even under extreme data distribution scenarios. Compared with the selected vertical aggregation and horizontal transmission cumulative aggregation methods, EPFLU achieves communication improvements of 21% and 57%, respectively.
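The abstract's combination of horizontal consensus aggregation with private-information supplementation can be pictured with a short sketch. The Python snippet below is a minimal illustration under assumed simplifications, not the paper's actual algorithm: the function names `p2p_round` and `local_train`, the mixing weight `alpha`, and the placeholder training step are all hypothetical. It shows one client averaging the models received from its neighbors (horizontal consensus) and blending that consensus with its own locally refined model to retain a personalized component.

```python
# Illustrative sketch (NOT the paper's algorithm): one peer-to-peer FL round seen
# from a single client. The mixing weight `alpha` and the neighbor-selection policy
# are assumptions made for this example only.
import numpy as np

def local_train(model, data, lr=0.01, epochs=1):
    """Placeholder for local SGD on the client's private data."""
    # A real system would run gradient steps here; this stub returns the model unchanged.
    return model

def p2p_round(own_model, neighbor_models, private_data, alpha=0.5):
    """One hypothetical EPFLU-style round from one client's perspective."""
    # 1. Horizontal consensus: average the parameters shared by neighboring peers.
    consensus = {
        name: np.mean([m[name] for m in neighbor_models], axis=0)
        for name in own_model
    }
    # 2. Private supplementation: refine the local model on private data, then mix
    #    it with the consensus so a personalized component is preserved.
    personalized = local_train(own_model, private_data)
    return {
        name: alpha * consensus[name] + (1.0 - alpha) * personalized[name]
        for name in own_model
    }

# Tiny usage example with random two-parameter "models".
rng = np.random.default_rng(0)
me = {"w": rng.normal(size=(4,)), "b": rng.normal(size=(1,))}
peers = [{"w": rng.normal(size=(4,)), "b": rng.normal(size=(1,))} for _ in range(3)]
updated = p2p_round(me, peers, private_data=None, alpha=0.5)
print({k: v.round(3) for k, v in updated.items()})
```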
Notes
- 1. For detailed model and parameter settings, data distribution operations, and complete experimental records, please visit https://github.com/XiangchiSong/EPFLU_P2PFL.
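The footnote above points to the repository for the exact data-distribution operations. As a rough illustration of what an extreme non-IID split over MNIST/CIFAR-10-style labels can look like, the sketch below uses the common shard-based label-skew construction; the client count, shards-per-client value, and shard procedure are assumptions for illustration, not the settings recorded in the repository.

```python
# Illustrative label-skew partition (an assumption, not the paper's exact procedure):
# sort samples by label, cut them into shards, and hand each client a few shards so
# that every client sees only a small number of classes -- an "extreme" non-IID split.
import numpy as np

def label_skew_partition(labels, num_clients=100, shards_per_client=2, seed=0):
    rng = np.random.default_rng(seed)
    order = np.argsort(labels)                       # group sample indices by class
    shards = np.array_split(order, num_clients * shards_per_client)
    shard_ids = rng.permutation(len(shards))         # deal shards out at random
    return [
        np.concatenate([shards[s] for s in shard_ids[i::num_clients]])
        for i in range(num_clients)
    ]

# Usage with synthetic 10-class labels standing in for MNIST/CIFAR-10 targets.
labels = np.random.default_rng(1).integers(0, 10, size=60_000)
parts = label_skew_partition(labels, num_clients=100, shards_per_client=2)
print(len(parts), "clients;", "client 0 classes:", sorted(set(labels[parts[0]])))
```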
Acknowledgements
This research was partly supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2024-2020-0-01795) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation), and by an IITP grant funded by the Korea government (MSIT) (No. RS-2024-00406245, Development of Software-Defined Infrastructure Technologies for Future Mobility).
Author information
Authors and Affiliations
School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
Xiangchi Song, Zhaoyan Wang, KyeongDeok Baek & In-Young Ko
Corresponding authors
Correspondence to Xiangchi Song or In-Young Ko.
Editor information
Editors and Affiliations
University of Lugano, Lugano, Ticino, Switzerland
Cesare Pautasso
University of Orléans, Orléans, France
Patrick Marcel
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Song, X., Wang, Z., Baek, K., Ko, I.-Y. (2025). EPFLU: Efficient Peer-to-Peer Federated Learning for Personalized User Models in Edge-Cloud Environments. In: Pautasso, C., Marcel, P. (eds.) Current Trends in Web Engineering. ICWE 2024. Communications in Computer and Information Science, vol 2188. Springer, Cham. https://doi.org/10.1007/978-3-031-75110-3_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-75109-7
Online ISBN: 978-3-031-75110-3
eBook Packages: Computer Science, Computer Science (R0)