
EPFLU: Efficient Peer-to-Peer Federated Learning for Personalized User Models in Edge-Cloud Environments

  • Conference paper

Abstract

As the number of IoT devices and the volume of data they generate increase, distributed computing systems have become the primary deployment solution for large-scale Internet of Things (IoT) environments. Federated Learning (FL) is a collaborative machine learning framework that allows models to be trained on data from all participants while protecting their privacy. However, traditional FL suffers from low computational and communication efficiency in large-scale hierarchical cloud-edge collaborative IoT systems. Moreover, due to heterogeneity, not all IoT devices necessarily benefit from the global model of traditional FL; instead, they require a degree of personalization to be maintained throughout global training. We therefore extend FL into a horizontal peer-to-peer (P2P) structure and introduce our P2P FL framework: Efficient Peer-to-peer Federated Learning for Users (EPFLU). EPFLU shifts the paradigm from vertical FL to a horizontal P2P structure from the user perspective and incorporates personalized enhancement techniques that use private information. Through horizontal consensus information aggregation and private information supplementation, EPFLU addresses a key weakness of traditional FL: global aggregation dilutes the characteristics of individual client data and causes the model to deviate. This structural transformation also significantly alleviates the original communication issues. In addition, EPFLU includes a customized simulation evaluation framework that makes it better suited to real-world large-scale IoT. Within this framework, we design extreme data distribution scenarios and conduct detailed experiments on EPFLU and selected baselines using the MNIST and CIFAR-10 datasets. The results demonstrate that the robust and adaptive EPFLU framework consistently converges to optimal performance even under extreme data distribution scenarios.
Compared with the selected vertical aggregation and horizontal transmission cumulative aggregation methods, EPFLU achieves communication improvements of 21% and 57%, respectively.
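The horizontal consensus aggregation with private information supplementation described in the abstract can be sketched in a few lines. This is an illustrative reading of the idea, not the authors' exact EPFLU algorithm: the ring topology, the element-wise averaging, and the mixing coefficient `alpha` (how much weight each client keeps on its own private model) are all assumptions made for this example.

```python
def average(models):
    """Element-wise average of a list of parameter vectors."""
    n = len(models)
    return [sum(vals) / n for vals in zip(*models)]

def p2p_round(local_models, neighbors, alpha=0.5):
    """One horizontal peer-to-peer consensus round.

    local_models: list of parameter vectors, one per client.
    neighbors:    neighbors[i] = indices of peers that client i hears from.
    alpha:        weight kept on the client's own (private) model, so each
                  client preserves personalized information instead of being
                  fully diluted by the consensus average.
    """
    updated = []
    for i, own in enumerate(local_models):
        # Consensus: average the peers' models together with the own model.
        consensus = average([local_models[j] for j in neighbors[i]] + [own])
        # Personalization: blend the consensus with the private local model.
        updated.append([alpha * o + (1 - alpha) * c
                        for o, c in zip(own, consensus)])
    return updated

# Three clients on a fully connected topology, each with a 2-parameter model.
models = [[0.0, 0.0], [3.0, 3.0], [6.0, 6.0]]
peers = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
new_models = p2p_round(models, peers, alpha=0.5)
```

With `alpha = 0` this degenerates to plain decentralized averaging (every client converges to the same model); with `alpha` between 0 and 1 each client stays biased toward its own data, which is the personalization property the abstract argues traditional vertical FL loses.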


Notes

  1. For detailed model and parameter settings, data distribution operations, and complete experimental records, please visit https://github.com/XiangchiSong/EPFLU_P2PFL.


Acknowledgements

This research was partly supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2024-2020-0-01795) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation) and IITP grant funded by the Korea government (MSIT) (No. RS-2024-00406245, Development of Software-Defined Infrastructure Technologies for Future Mobility).

Author information

Authors and Affiliations

  1. School of Computing, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea

    Xiangchi Song, Zhaoyan Wang, KyeongDeok Baek & In-Young Ko

Authors
  1. Xiangchi Song
  2. Zhaoyan Wang
  3. KyeongDeok Baek
  4. In-Young Ko

Corresponding authors

Correspondence to Xiangchi Song or In-Young Ko.

Editor information

Editors and Affiliations

  1. University of Lugano, Lugano, Ticino, Switzerland

    Cesare Pautasso

  2. University of Orléans, Orléans, France

    Patrick Marcel

Rights and permissions

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Song, X., Wang, Z., Baek, K., Ko, IY. (2025). EPFLU: Efficient Peer-to-Peer Federated Learning for Personalized User Models in Edge-Cloud Environments. In: Pautasso, C., Marcel, P. (eds) Current Trends in Web Engineering. ICWE 2024. Communications in Computer and Information Science, vol 2188. Springer, Cham. https://doi.org/10.1007/978-3-031-75110-3_1

