pith. machine review for the scientific record.

arxiv: 2605.00970 · v1 · submitted 2026-05-01 · 💻 cs.IT · math.IT

Recognition: unknown

Split and Aggregation Learning for Foundation Models Over Mobile Embodied AI Network (MEAN): A Comprehensive Survey

Dusit Niyato, Jiawen Kang, Minrui Xu, Qianzhou Chen, Sijie Ji, Siqi Sun, Yijie Mao, Zhaohui Yang, Zhouxiang Zhao

Authors on Pith: no claims yet

Pith reviewed 2026-05-09 18:38 UTC · model grok-4.3

classification 💻 cs.IT · math.IT
keywords split learning · aggregation learning · foundation models · 6G communication · mobile embodied AI · distributed machine learning · privacy preservation · wireless networks

The pith

Split learning and aggregation learning enable privacy-preserving distributed training of foundation models in 6G mobile embodied AI networks.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This survey examines split learning and aggregation learning as methods to collaboratively train large foundation models across entities in mobile embodied AI networks over 6G systems. Split learning partitions neural networks so that raw data stays local, while aggregation learning combines intermediate results or updates to boost robustness and cut communication costs. The analysis covers their architectures, how they integrate with AI-native 6G features such as semantic communication and reconfigurable intelligent surfaces, and their use in space-air-ground networks and quantum links. A sympathetic reader would care because these techniques directly tackle the barriers of data privacy, limited device resources, and high communication overhead that otherwise block scalable foundation model deployment on mobile platforms. If the surveyed approaches hold, they could shape efficient and secure distributed AI in future wireless networks.

Core claim

The paper establishes that split learning (SL) suits vertical data collaborations needing strict isolation while aggregation learning (AL) fits horizontal, homogeneous data settings, and that the two can be combined to balance privacy with communication efficiency. It analyzes SL configurations, aggregation techniques, and their roles in optimizing distributed foundation models, then maps these to 6G applications including semantic communication, RIS, SAGINs, and quantum communication to show how they advance AI-driven wireless systems focused on efficiency, privacy, and scalability.

What carries the argument

Split learning (SL), which partitions a neural network across multiple entities for collaborative training without sharing raw data, and aggregation learning (AL), which combines intermediate results or model updates from participants to improve robustness and resource use.
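The SL half of this pairing is easy to sketch mechanically. The toy below is our illustration, not the paper's protocol (all function and variable names are invented): a two-layer network is cut after the first layer, so only "smashed" activations and their gradients cross the client–server boundary, never the raw data.

```python
import numpy as np

# Minimal split-learning round (illustrative sketch, not the surveyed protocol).
# The client holds W1 and the raw data x; the server holds W2 and computes the
# loss. Only the smashed activation h and its gradient cross the wire.

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (4, 8))   # client-side layer
W2 = rng.normal(0, 0.1, (8, 1))   # server-side layer
lr = 0.1

def client_forward(x):
    # Smashed data: the only thing transmitted to the server.
    return np.tanh(x @ W1)

def server_step(h, y):
    global W2
    pred = h @ W2                  # server completes the forward pass
    err = pred - y
    grad_W2 = h.T @ err / len(y)
    grad_h = err @ W2.T            # gradient sent back across the cut
    W2 -= lr * grad_W2
    return grad_h, float((err ** 2).mean())

def client_backward(x, h, grad_h):
    global W1
    grad_pre = grad_h * (1 - h ** 2)        # backprop through tanh
    W1 -= lr * x.T @ grad_pre / len(x)

x = rng.normal(size=(32, 4))
y = (x.sum(axis=1, keepdims=True) > 0).astype(float)

losses = []
for _ in range(200):
    h = client_forward(x)
    grad_h, loss = server_step(h, y)
    client_backward(x, h, grad_h)
    losses.append(loss)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The privacy argument rests on `x` never leaving `client_forward`; what an eavesdropper on `h` can reconstruct is exactly the model-inversion risk the survey flags.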

If this is right

  • SL and AL together can reduce data leakage risks while optimizing resource allocation in 6G for large-scale foundation model training.
  • Integration with semantic communication and RIS improves overall system efficiency in distributed wireless AI setups.
  • These methods support applications in space-air-ground integrated networks and quantum communication for embodied AI.
  • Combining SL and AL allows balancing strict privacy needs with practical communication constraints in mobile networks.
  • The approaches scale distributed AI training to handle the demands of foundation models without centralizing sensitive data.
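The aggregation step that several of these bullets lean on can be sketched just as compactly. Below is a generic FedAvg-style weighted average, an illustration of the AL idea rather than a scheme taken from the survey; the names are ours.

```python
import numpy as np

# FedAvg-style aggregation: clients train locally and upload parameter
# vectors; the server forms the new global model as an example-count-weighted
# average. No raw data moves, only model updates.

def aggregate(updates, sizes):
    """updates: list of parameter arrays; sizes: local dataset sizes."""
    w = np.asarray(sizes, dtype=float)
    w /= w.sum()
    return sum(wi * ui for wi, ui in zip(w, updates))

# Three clients with different amounts of local data.
client_params = [np.array([1.0, 0.0]),
                 np.array([0.0, 1.0]),
                 np.array([1.0, 1.0])]
client_sizes = [100, 300, 600]

global_params = aggregate(client_params, client_sizes)
print(global_params)  # -> [0.7 0.9]
```

The count weighting is what makes AL fit the horizontal, homogeneous setting: each client contributes in proportion to the data it holds.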

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The techniques could be adapted to earlier wireless generations or non-6G networks where similar privacy and bandwidth limits exist.
  • Real-world pilots in embodied AI robots communicating over 6G could test whether the privacy gains outweigh added coordination overhead.
  • Connections to other distributed learning variants might reveal hybrid frameworks that further cut latency for mobile devices.
  • If successful, these methods could influence standards for AI-native 6G protocols focused on edge intelligence.

Load-bearing premise

The survey assumes that the existing body of literature on split learning and aggregation learning adequately captures the main challenges, architectures, and solutions required for foundation models in mobile embodied AI networks.

What would settle it

A demonstration that split learning and aggregation learning cannot maintain acceptable accuracy or communication overhead when scaled to current large foundation models in real 6G testbeds with mobile embodied devices would undermine the survey's claims about their viability.

Figures

Figures reproduced from arXiv: 2605.00970 by Dusit Niyato, Jiawen Kang, Minrui Xu, Qianzhou Chen, Sijie Ji, Siqi Sun, Yijie Mao, Zhaohui Yang, Zhouxiang Zhao.

Figure 1
Figure 1. The outline of this survey. view at source ↗
Figure 2
Figure 2. The general workflow of the foundation model. view at source ↗
Figure 3
Figure 3. (a) A framework architecture of multi-agent SL. (b) A framework of model aggregation. (c) An ML framework for wireless communications. view at source ↗
Figure 4
Figure 4. A simple setup of SL, where the neural network is divided into two parts. view at source ↗
Figure 5
Figure 5. Configurations of SL: (a) simple vanilla, (b) extended vanilla, (c) without label sharing, and (d) vertically partitioned data. view at source ↗
Figure 6
Figure 6. The common framework of AL. The server distributes the global model at the start of each round, and clients asynchronously train and upload updates, which are partially aggregated to form the new global model. view at source ↗
Figure 7
Figure 7. An illustration of the SL system over wireless networks. view at source ↗
Figure 8
Figure 8. Training steps of SL with a tiny server. (a) Forward propagation of clients and loss estimation via the tiny server. (b) Devices send only informative smashed data. view at source ↗
Figure 9
Figure 9. Overview of the SL mechanism. view at source ↗
Figure 11
Figure 11. Hierarchical aggregation in FL. view at source ↗
Figure 12
Figure 12. Framework for the partially encrypted MPC-based distributed … view at source ↗
Figure 13
Figure 13. Asynchronous aggregation in an FL system. view at source ↗
Figure 14
Figure 14. The comparison of cloud-hosted mobile intelligence, FL, and FL with secure aggregation. Left: in a cloud-centric approach, user devices interact with cloud models, generating logs used for training; these logs are aggregated to improve the model, which is then deployed for future user requests. Middle: in federated learning, models are sent to user devices for local evaluation and training. … view at source ↗
Figure 15
Figure 15. Architecture alternatives for distributed learning systems: centralized, hierarchical, regional, and decentralized. view at source ↗
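Figures 6, 11, and 13 depict flat, hierarchical, and asynchronous aggregation. A rough sketch of the hierarchical variant (edge servers aggregate their own clients, then the cloud aggregates the edge results; all identifiers here are ours, not the paper's) also shows why example-count weighting makes the two-level result coincide with a flat average over all clients:

```python
import numpy as np

# Hierarchical aggregation: clients -> edge aggregators -> cloud.
# Weighting each level by example counts makes the two-level result
# identical to one flat weighted average over all clients.

def weighted_avg(params, counts):
    c = np.asarray(counts, dtype=float)
    return sum(p * ci for p, ci in zip(params, c)) / c.sum()

clients = {  # edge id -> list of (param_vector, n_examples)
    "edge_a": [(np.array([1.0, 2.0]), 10), (np.array([3.0, 0.0]), 30)],
    "edge_b": [(np.array([0.0, 4.0]), 60)],
}

# Level 1: each edge aggregates its own clients.
edge_models, edge_counts = [], []
for members in clients.values():
    ps, ns = zip(*members)
    edge_models.append(weighted_avg(ps, ns))
    edge_counts.append(sum(ns))

# Level 2: the cloud aggregates the edge results.
global_model = weighted_avg(edge_models, edge_counts)

# Flat aggregation over all clients gives the same answer.
all_ps, all_ns = zip(*[m for ms in clients.values() for m in ms])
flat = weighted_avg(all_ps, all_ns)
print(global_model, flat)
```

The practical gain of the hierarchy is not a different answer but fewer long-haul uplinks: only edge summaries, not per-client updates, reach the cloud.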
read the original abstract

The rapid advancements in foundation models and sixth-generation (6G) wireless communication systems necessitate the development of efficient, scalable, and privacy-preserving machine learning approaches. For foundation models in 6G, split learning (SL) and aggregation learning (AL) have emerged as promising paradigms that address key challenges in distributed artificial intelligence (AI), such as communication efficiency, resource allocation, and data privacy. SL enables multiple entities to collaboratively train deep learning models by partitioning neural networks, while AL focuses on aggregating intermediate results or model updates from multiple participants, improving robustness, optimizing resource utilization, and mitigating data leakage risks. Specifically, SL is ideal for scenarios requiring strict data isolation (e.g., vertical collaborations), whereas AL suits homogeneous horizontal data settings; they can be combined to balance privacy and communication efficiency. This survey provides a comprehensive analysis of SL and AL in 6G communication systems, exploring their architectures, technical methodologies, and integration with AI-native 6G communication technologies. We examine different SL configurations, aggregation techniques, and their roles in optimizing distributed foundation models. Furthermore, we discuss their applications in emerging wireless networks, including semantic communication, reconfigurable intelligent surfaces (RIS), space-air-ground integrated networks (SAGINs), and quantum communication. By analyzing the impact of SL and AL, this survey provides insights into their role in shaping distributed AI-driven communication systems in the 6G era, focusing on efficiency, privacy preservation, and scalability.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

1 major / 2 minor

Summary. The manuscript is a survey on split learning (SL) and aggregation learning (AL) for foundation models deployed over mobile embodied AI networks (MEAN) in 6G systems. It describes SL architectures for vertical data isolation and AL for horizontal aggregation, their combination for balancing privacy and efficiency, and integrations with semantic communication, RIS, SAGINs, and quantum links, with emphasis on communication efficiency, resource allocation, and privacy preservation.

Significance. A well-executed survey in this intersection of distributed learning and 6G could usefully organize the literature and highlight open problems for foundation-model-scale deployments. The paper's value rests entirely on the completeness and accuracy of its literature synthesis; no new derivations, proofs, or empirical results are presented.

major comments (1)
  1. The central claim of providing a 'comprehensive analysis' of SL/AL for foundation models in MEAN is load-bearing on the assumption that the selected literature adequately represents scaling challenges (e.g., communication overhead for billion-parameter vertical splits, mobility-induced latency, and energy constraints in embodied settings). The manuscript does not appear to include an explicit literature-search methodology, inclusion/exclusion criteria, or a dedicated discussion of gaps between generic SL/AL results and foundation-model requirements, which prevents verification that the cited works address these embodied-AI-specific issues rather than generic distributed training.
minor comments (2)
  1. The abstract and title introduce the acronym MEAN but do not define 'mobile embodied AI network' or enumerate its distinguishing constraints (real-time inference, on-device energy, physical mobility) early in the text; this should be clarified in the introduction or a dedicated background section.
  2. Several technical terms (e.g., 'AI-native 6G communication technologies') are used without immediate reference to the specific 6G features or papers being invoked; adding one-sentence definitions or forward pointers to later sections would improve readability.

Simulated Author's Rebuttal

1 responses · 0 unresolved

We thank the referee for the detailed and constructive feedback on our survey manuscript. The comments highlight opportunities to improve transparency and rigor, which we address below. We plan to incorporate the suggested revisions in the next version of the paper.

read point-by-point responses
  1. Referee: The central claim of providing a 'comprehensive analysis' of SL/AL for foundation models in MEAN is load-bearing on the assumption that the selected literature adequately represents scaling challenges (e.g., communication overhead for billion-parameter vertical splits, mobility-induced latency, and energy constraints in embodied settings). The manuscript does not appear to include an explicit literature-search methodology, inclusion/exclusion criteria, or a dedicated discussion of gaps between generic SL/AL results and foundation-model requirements, which prevents verification that the cited works address these embodied-AI-specific issues rather than generic distributed training.

    Authors: We agree that an explicit literature-search methodology and inclusion/exclusion criteria would enhance the verifiability of the survey. The original manuscript did not contain a dedicated subsection describing the search process (e.g., databases, keywords, time frame, or selection criteria focused on relevance to foundation models, SL/AL, 6G, and embodied AI). In the revised manuscript, we will add this information, likely as a new subsection in the introduction or a standalone methods section. We will also add a dedicated discussion (possibly as a new subsection or expanded conclusions) that explicitly examines gaps between generic SL/AL results and the scaling challenges specific to foundation models in MEAN, including communication overhead for large-parameter vertical splits, mobility-induced latency, and energy constraints in embodied settings. This discussion will map cited works to these challenges where possible and identify areas where the literature remains limited. These additions will directly support the claim of a comprehensive analysis while providing readers with the requested transparency. revision: yes

Circularity Check

0 steps flagged

No circularity: survey rests entirely on external citations

full rationale

This is a survey paper with no original derivations, equations, fitted parameters, or predictions. The abstract and structure summarize architectures, methodologies, and integrations from prior literature on split learning and aggregation learning. All technical claims are attributed to cited external works rather than derived internally, so no step reduces by construction to the paper's own inputs or self-citations. The central assumption about literature coverage is a scope limitation, not a circular derivation.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

The paper is a literature survey and introduces no new free parameters, axioms, or invented entities; all concepts are drawn from previously published work in distributed learning and wireless communications.

pith-pipeline@v0.9.0 · 5595 in / 1001 out tokens · 30933 ms · 2026-05-09T18:38:45.318292+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

188 extracted references · 21 canonical work pages · 3 internal anchors

  1. [1]

    Safely learning with private data: A federated learning framework for large language model,

    J. Zheng, H. Zhang, L. Wang, W. Qiu, H. Zheng, and Z. Zheng, “Safely learning with private data: A federated learning framework for large language model,”arXiv preprint arXiv:2406.14898, 2024

  2. [2]

    Communication-efficient learning of deep networks from decentral- ized data,

    B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y Arcas, “Communication-efficient learning of deep networks from decentral- ized data,” inArtificial intelligence and statistics. PMLR, 2017, pp. 1273–1282

  3. [3]

    Applications of distributed machine learning for the internet- of-things: A comprehensive survey,

    M. Le, T. Huynh-The, T. Do-Duy, T.-H. Vu, W.-J. Hwang, and Q.-V . Pham, “Applications of distributed machine learning for the internet- of-things: A comprehensive survey,”IEEE Communications Surveys & Tutorials, 2024

  4. [4]

    A comprehensive survey on communication-efficient federated learning in mobile edge environments,

    N. Jia, Z. Qu, B. Ye, Y . Wang, S. Hu, and S. Guo, “A comprehensive survey on communication-efficient federated learning in mobile edge environments,”IEEE Communications Surveys & Tutorials, pp. 1–1, 2025

  5. [5]

    Energy efficient federated learning over wireless communication networks,

    Z. Yang, M. Chen, W. Saad, C. S. Hong, and M. Shikh-Bahaei, “Energy efficient federated learning over wireless communication networks,” IEEE Trans. Wireless Commun., vol. 20, no. 3, pp. 1935–1949, 2021

  6. [6]

    Federated learning: Challenges, methods, and future directions,

    T. Li, A. K. Sahu, A. Talwalkar, and V . Smith, “Federated learning: Challenges, methods, and future directions,”IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 50–60, 2020

  7. [7]

    Distributed learning of deep neural network over multiple agents,

    O. Gupta and R. Raskar, “Distributed learning of deep neural network over multiple agents,”Journal of Network and Computer Applications, vol. 116, pp. 1–8, 2018

  8. [8]

    Split learning for health: Distributed deep learning without sharing raw patient data

    P. Vepakomma, O. Gupta, T. Swedish, and R. Raskar, “Split learning for health: Distributed deep learning without sharing raw patient data,” arXiv preprint arXiv:1812.00564, 2018

  9. [9]

    Privacy-preserving deep learning,

    R. Shokri and V . Shmatikov, “Privacy-preserving deep learning,” in Proceedings of the 22nd ACM SIGSAC conference on computer and communications security, 2015, pp. 1310–1321

  10. [10]

    Edge learning for B5G networks with distributed signal processing: Semantic communication, edge computing, and wireless sensing,

    W. Xu, Z. Yang, D. W. K. Ng, M. Levorato, Y . C. Eldar, and M. Debbah, “Edge learning for B5G networks with distributed signal processing: Semantic communication, edge computing, and wireless sensing,”IEEE J. Sel. Topics Signal Process., vol. 17, no. 1, pp. 9–39, Jan. 2023

  11. [11]

    Semantic information extraction for text data with probability graph,

    Z. Zhao, Z. Yang, Y . Hu, L. Lin, and Z. Zhang, “Semantic information extraction for text data with probability graph,” inProc. 2023 IEEE/CIC Int. Conf. Commun. China (ICCC Workshops), Aug. 2023

  12. [12]

    Energy efficient UA V commu- nication with energy harvesting,

    Z. Yang, W. Xu, and M. Shikh-Bahaei, “Energy efficient UA V commu- nication with energy harvesting,”IEEE Trans. Veh. Technol., vol. 69, no. 2, pp. 1913–1927, 2019

  13. [13]

    The roadmap to 6G: AI empowered wireless networks,

    K. B. Letaief, W. Chen, Y . Shi, J. Zhang, and Y .-J. A. Zhang, “The roadmap to 6G: AI empowered wireless networks,”IEEE Communi- cations Magazine, vol. 57, no. 8, pp. 84–90, 2019

  14. [14]

    A joint learning and communications framework for federated learning over wireless networks,

    M. Chen, Z. Yang, W. Saad, C. Yin, H. V . Poor, and S. Cui, “A joint learning and communications framework for federated learning over wireless networks,”IEEE Trans. Wireless commun., vol. 20, no. 1, pp. 269–283, 2020

  15. [15]

    Learning rate op- timization for federated learning exploiting over-the-air computation,

    C. Xu, S. Liu, Z. Yang, Y . Huang, and K.-K. Wong, “Learning rate op- timization for federated learning exploiting over-the-air computation,” IEEE J. Sel. Areas Commun., vol. 39, no. 12, pp. 3742–3756, 2021

  16. [16]

    A survey on beyond 5G network slicing for smart cities applications,

    W. Rafique, J. Rani Barai, A. O. Fapojuwo, and D. Krishnamurthy, “A survey on beyond 5G network slicing for smart cities applications,” IEEE Communications Surveys & Tutorials, vol. 27, no. 1, pp. 595– 628, 2025

  17. [17]

    Beyond the edge: An advanced exploration of reinforcement learning for mobile edge computing, its applications, and future research trajectories,

    N. Yang, S. Chen, H. Zhang, and R. Berry, “Beyond the edge: An advanced exploration of reinforcement learning for mobile edge computing, its applications, and future research trajectories,”IEEE Communications Surveys & Tutorials, vol. 27, no. 1, pp. 546–594, 2025

  18. [18]

    A new path to integrated learning and communication (ILAC): Large AI models leveraging hyperdimensional computing,

    W. Xu, Z. Yang, D. W. K. Ng, R. Schober, H. V . Poor, Z. Zhang, and X. You, “A new path to integrated learning and communication (ILAC): Large AI models leveraging hyperdimensional computing,” IEEE Trans. Commun., vol. 74, pp. 4948–4973, 2026

  19. [19]

    An edge-cloud collaboration framework for generative ai service provision with synergetic big cloud model and small edge models,

    Y . Tian, Z. Zhang, Y . Yang, Z. Chen, Z. Yang, R. Jin, T. Q. S. Quek, and K.-K. Wong, “An edge-cloud collaboration framework for generative ai service provision with synergetic big cloud model and small edge models,”IEEE Netw., vol. 38, no. 5, pp. 37–46, 2024

  20. [20]

    On privacy, security, and trustworthiness in distributed wireless large AI models,

    Z. Yang, W. Xu, L. Liang, Y . Cui, Z. Qin, and M. Debbah, “On privacy, security, and trustworthiness in distributed wireless large AI models,” Science China Information Sciences, vol. 68, no. 7, p. 170301, 2025

  21. [21]

    Beyond transmitting bits: Context, semantics, and task-oriented communications,

    D. G ¨und¨uz, Z. Qin, I. E. Aguerri, H. S. Dhillon, Z. Yang, A. Yener, K. K. Wong, and C.-B. Chae, “Beyond transmitting bits: Context, semantics, and task-oriented communications,”IEEE J. Sel. Areas Commun., vol. 41, no. 1, pp. 5–41, 2023

  22. [22]

    Energy efficient semantic communication over wireless networks with rate splitting,

    Z. Yang, M. Chen, Z. Zhang, and C. Huang, “Energy efficient semantic communication over wireless networks with rate splitting,”IEEE J. Sel. Areas Commun., vol. 41, no. 5, pp. 1484–1495, 2023

  23. [23]

    A joint communication and computation design for probabilistic semantic communications,

    Z. Zhao, Z. Yang, M. Chen, Z. Zhang, and H. V . Poor, “A joint communication and computation design for probabilistic semantic communications,”Entropy, vol. 26, no. 5, Apr. 2024

  24. [24]

    Scene graph-aided probabilistic semantic communication for image transmission,

    C. Zhu, S. Liang, Z. Zhao, J. Bao, Z. Yang, Z. Zhang, and D. Niyato, “Scene graph-aided probabilistic semantic communication for image transmission,”IEEE Trans. Mobile Comput., vol. 25, no. 4, pp. 5905– 5919, 2026

  25. [25]

    Envisioning the future of technology integration for accessible hospitality and tourism,

    A. Tlili, F. Altinay, Z. Altinay, and Y . Zhang, “Envisioning the future of technology integration for accessible hospitality and tourism,”In- ternational Journal of Contemporary Hospitality Management, vol. 33, no. 12, pp. 4460–4482, 2021

  26. [26]

    Multi-hop ris-empowered terahertz com- munications: A drl-based hybrid beamforming design,

    C. Huang, Z. Yang, G. C. Alexandropoulos, K. Xiong, L. Wei, C. Yuen, Z. Zhang, and M. Debbah, “Multi-hop ris-empowered terahertz com- munications: A drl-based hybrid beamforming design,”IEEE J. Sel. Areas Commun., vol. 39, no. 6, pp. 1663–1677, 2021

  27. [27]

    A joint communication and computation design for distributed RISs assisted probabilistic semantic communication in IIoT,

    Z. Zhao, Z. Yang, C. Huang, L. Wei, Q. Yang, C. Zhong, W. Xu, and Z. Zhang, “A joint communication and computation design for distributed RISs assisted probabilistic semantic communication in IIoT,”IEEE Internet Things J., vol. 11, no. 16, pp. 26 568–26 579, Aug. 2024

  28. [28]

    Distributed learning for wireless communications: Methods, applications and challenges,

    L. Qian, P. Yang, M. Xiao, O. A. Dobre, M. Di Renzo, J. Li, Z. Han, Q. Yi, and J. Zhao, “Distributed learning for wireless communications: Methods, applications and challenges,”IEEE Journal of Selected Topics in Signal Processing, vol. 16, no. 3, pp. 326–342, 2022

  29. [29]

    Distributed learning in wireless networks: Recent progress and future challenges,

    M. Chen, D. G ¨und¨uz, K. Huang, W. Saad, M. Bennis, A. V . Feljan, and H. V . Poor, “Distributed learning in wireless networks: Recent progress and future challenges,”IEEE Journal on Selected Areas in Communications, vol. 39, no. 12, pp. 3579–3605, 2021

  30. [30]

    Distributed machine learning for wireless communication networks: Techniques, architectures, and applications,

    S. Hu, X. Chen, W. Ni, E. Hossain, and X. Wang, “Distributed machine learning for wireless communication networks: Techniques, architectures, and applications,”IEEE Communications Surveys & Tutorials, vol. 23, no. 3, pp. 1458–1493, 2021

  31. [31]

    Edge artificial intelligence for 6G: Vision, enabling technologies, and applications,

    K. B. Letaief, Y . Shi, J. Lu, and J. Lu, “Edge artificial intelligence for 6G: Vision, enabling technologies, and applications,”IEEE Journal on Selected Areas in Communications, vol. 40, no. 1, pp. 5–36, 2021

  32. [32]

    Communication- efficient edge AI: Algorithms and systems,

    Y . Shi, K. Yang, T. Jiang, J. Zhang, and K. B. Letaief, “Communication- efficient edge AI: Algorithms and systems,”IEEE Communications Surveys & Tutorials, vol. 22, no. 4, pp. 2167–2191, 2020

  33. [33]

    Big AI models for 6G wireless networks: Opportunities, challenges, and research directions,

    Z. Chen, Z. Zhang, and Z. Yang, “Big AI models for 6G wireless networks: Opportunities, challenges, and research directions,”IEEE Wireless Communications, 2024

  34. [34]

    Combined federated and split learning in edge computing for ubiquitous intelligence in internet of things: State-of-the-art and future directions,

    Q. Duan, S. Hu, R. Deng, and Z. Lu, “Combined federated and split learning in edge computing for ubiquitous intelligence in internet of things: State-of-the-art and future directions,”Sensors, vol. 22, no. 16, p. 5983, 2022

  35. [35]

    Advancements of federated learning towards privacy preservation: from federated learning to split learning,

    C. Thapa, M. A. P. Chamikara, and S. A. Camtepe, “Advancements of federated learning towards privacy preservation: from federated learning to split learning,”Federated Learning Systems: Towards Next- Generation AI, pp. 79–109, 2021. 28

  36. [36]

    Model aggregation techniques in federated learning: A comprehensive survey,

    P. Qi, D. Chiaro, A. Guzzo, M. Ianni, G. Fortino, and F. Piccialli, “Model aggregation techniques in federated learning: A comprehensive survey,”Future Generation Computer Systems, 2023

  37. [37]

    Aggregation techniques in federated learn- ing: Comprehensive survey, challenges and opportunities,

    M. P. Sah and A. Singh, “Aggregation techniques in federated learn- ing: Comprehensive survey, challenges and opportunities,” in2022 2nd International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE). IEEE, 2022, pp. 1962–1967

  38. [38]

    Reviewing federated learning aggregation algorithms; strategies, con- tributions, limitations and future perspectives,

    M. Moshawrab, M. Adda, A. Bouzouane, H. Ibrahim, and A. Raad, “Reviewing federated learning aggregation algorithms; strategies, con- tributions, limitations and future perspectives,”Electronics, vol. 12, no. 10, p. 2287, 2023

  39. [39]

    Florence: A new foundation model for computer vision

    L. Yuan, D. Chen, Y .-L. Chen, N. Codella, X. Dai, J. Gao, H. Hu, X. Huang, B. Li, C. Liet al., “Florence: A new foundation model for computer vision,”arXiv preprint arXiv:2111.11432, 2021

  40. [40]

    On the Opportunities and Risks of Foundation Models

    R. Bommasani, D. A. Hudson, E. Adeli, and et al., “On the opportuni- ties and risks of foundation models,”arXiv preprint arXiv:2108.07258, 2021

  41. [41]

    Semantic communication with probability graph: A joint communication and computation design,

    Z. Zhao, Z. Yang, Q.-V . Pham, Q. Yang, and Z. Zhang, “Semantic communication with probability graph: A joint communication and computation design,” inProc. 2023 IEEE 98th Veh. Technol. Conf. (VTC2023-Fall), Oct. 2023

  42. [42]

    A comprehensive survey on pretrained foundation models: A history from bert to chatgpt,

    C. Zhou, Q. Li, C. Li, J. Yu, Y . Liu, G. Wang, K. Zhang, C. Ji, Q. Yan, L. Heet al., “A comprehensive survey on pretrained foundation models: A history from bert to chatgpt,”International Journal of Machine Learning and Cybernetics, pp. 1–65, 2024

  43. [43]

    Energy efficient multi-modal probabilistic semantic communication (PSCom),

    J. Dai, J. Li, Z. Zhao, Z. Yang, Z. Zhang, and M. Shikh-Bahaei, “Energy efficient multi-modal probabilistic semantic communication (PSCom),” IEEE Trans. Green Commun. Netw., vol. 9, no. 4, pp. 1951–1963, 2025

  44. [44]

    Multi-user probabilistic semantic communication with semantic com- pression ratio optimization,

    Z. Zhao, Z. Yang, M. Chen, C. You, Q. Yang, W. Xu, and Z. Zhang, “Multi-user probabilistic semantic communication with semantic com- pression ratio optimization,” inProc. 2024 IEEE Int. Conf. Commun. (ICC Workshops), Jun. 2024, pp. 1647–1652

  45. [45]

    Big AI models for 6G wireless networks: Opportunities, challenges, and research directions,

    Z. Chen, Z. Zhang, and Z. Yang, “Big AI models for 6G wireless networks: Opportunities, challenges, and research directions,”IEEE Wireless Commun., vol. 31, no. 5, pp. 164–172, 2024

  46. [46]

    The road towards 6G: A comprehensive survey,

    W. Jiang, B. Han, M. A. Habibi, and H. D. Schotten, “The road towards 6G: A comprehensive survey,”IEEE Open Journal of the Communications Society, vol. 2, pp. 334–366, 2021

  47. [47]

    A vision of 6G wireless systems: Applications, trends, technologies, and open research problems,

    W. Saad, M. Bennis, and M. Chen, “A vision of 6G wireless systems: Applications, trends, technologies, and open research problems,”IEEE network, vol. 34, no. 3, pp. 134–142, 2019

  48. [48]

    6G wireless systems: Vision, requirements, challenges, insights, and opportunities,

    H. Tataria, M. Shafi, A. F. Molisch, M. Dohler, H. Sj ¨oland, and F. Tufvesson, “6G wireless systems: Vision, requirements, challenges, insights, and opportunities,”Proceedings of the IEEE, vol. 109, no. 7, pp. 1166–1199, 2021

  49. [49]

    6G and beyond: The future of wireless communications systems,

    I. F. Akyildiz, A. Kak, and S. Nie, “6G and beyond: The future of wireless communications systems,”IEEE access, vol. 8, pp. 133 995– 134 030, 2020

  50. [50]

    Communication and computation reduction for split learning using asynchronous training,

    X. Chen, J. Li, and C. Chakrabarti, “Communication and computation reduction for split learning using asynchronous training,” in2021 IEEE Workshop on Signal Processing Systems (SiPS). IEEE, 2021, pp. 76– 81

  51. [51]

    Active wireless split learning via online cloud-local server delta-knowledge distillation,

    H. Nam, J. Park, and S.-L. Kim, “Active wireless split learning via online cloud-local server delta-knowledge distillation,” in2023 IEEE International Conference on Communications Workshops (ICC Workshops). IEEE, 2023, pp. 825–830

  52. [52]

    Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption,

    S. Hardy, W. Henecka, H. Ivey-Law, R. Nock, G. Patrini, G. Smith, and B. Thorne, “Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption,” arXiv preprint arXiv:1711.10677, 2017

  53. [53]

    Vertical partitioning algorithms for database design,

    S. Navathe, S. Ceri, G. Wiederhold, and J. Dou, “Vertical partitioning algorithms for database design,” ACM Transactions on Database Systems (TODS), vol. 9, no. 4, pp. 680–710, 1984

  54. [54]

    Improving the communication and computation efficiency of split learning for IoT applications,

    A. Ayad, M. Renner, and A. Schmeink, “Improving the communication and computation efficiency of split learning for IoT applications,” in 2021 IEEE Global Communications Conference (GLOBECOM). IEEE, 2021, pp. 1–6

  55. [55]

    Privacy-preserving split learning with vision transformers using patch-wise random and noisy cutmix,

    S. Oh, S. Baek, J. Park, H. Nam, P. Vepakomma, R. Raskar, M. Bennis, and S.-L. Kim, “Privacy-preserving split learning with vision transformers using patch-wise random and noisy cutmix,” arXiv preprint arXiv:2408.01040, 2024

  56. [56]

    6G: The next horizon,

    W. Tong and P. Zhu, 6G: The Next Horizon. Cambridge University Press, 2021

  57. [57]

    Ultra-reliable and low-latency communications: applications, opportunities and challenges,

    D. Feng, L. Lai, J. Luo, Y. Zhong, C. Zheng, and K. Ying, “Ultra-reliable and low-latency communications: applications, opportunities and challenges,” Science China Information Sciences, vol. 64, pp. 1–12, 2021

  58. [58]

    Reconfigurable intelligent surfaces: Principles and opportunities,

    Y. Liu, X. Liu, X. Mu, T. Hou, J. Xu, M. Di Renzo, and N. Al-Dhahir, “Reconfigurable intelligent surfaces: Principles and opportunities,” IEEE Communications Surveys & Tutorials, vol. 23, no. 3, pp. 1546–1577, 2021

  59. [59]

    Integrated sensing and communications: Toward dual-functional wireless networks for 6G and beyond,

    F. Liu, Y. Cui, C. Masouros, J. Xu, T. X. Han, Y. C. Eldar, and S. Buzzi, “Integrated sensing and communications: Toward dual-functional wireless networks for 6G and beyond,” IEEE Journal on Selected Areas in Communications, vol. 40, no. 6, pp. 1728–1767, 2022

  60. [60]

    SPINN: Synergistic progressive inference of neural networks over device and cloud,

    S. Laskaridis, S. I. Venieris, M. Almeida, I. Leontiadis, and N. D. Lane, “SPINN: Synergistic progressive inference of neural networks over device and cloud,” in Proceedings of the 26th Annual International Conference on Mobile Computing and Networking, 2020, pp. 1–15

  61. [61]

    Towards federated learning at scale: System design,

    K. Bonawitz, H. Eichner, W. Grieskamp, D. Huba, A. Ingerman, V. Ivanov et al., “Towards federated learning at scale: System design,” in Proceedings of the 2nd SysML Conference, 2019

  62. [62]

    Federated learning over wireless networks: Optimization model design and analysis,

    C. Yang, J. Wang, and G. B. Giannakis, “Federated learning over wireless networks: Optimization model design and analysis,” in Proc. IEEE Global Communications Conference (GLOBECOM), 2019, pp. 1–6

  63. [63]

    A survey on mobile edge computing: The communication perspective,

    D. Niyato, P. Wang, D. I. Kim, and Y. Han, “A survey on mobile edge computing: The communication perspective,” IEEE Commun. Surv. Tutor., vol. 19, no. 4, pp. 2322–2358, 2017

  64. [64]

    Federated learning with non-IID data,

    Y. Zhao, M. Li, L. Lai, N. Suda, D. Civin, and V. Chandra, “Federated learning with non-IID data,” arXiv preprint arXiv:1806.00582, 2018

  65. [65]

    Ensemble Methods: Foundations and Algorithms,

    Z.-H. Zhou, Ensemble Methods: Foundations and Algorithms. CRC Press, 2012

  66. [66]

    Merge, ensemble, and cooperate! a survey on collaborative strategies in the era of large language models,

    J. Lu, Z. Pang, M. Xiao, Y. Zhu, R. Xia, and J. Zhang, “Merge, ensemble, and cooperate! a survey on collaborative strategies in the era of large language models,” arXiv preprint arXiv:2407.06089, 2024

  67. [67]

    An overview of gradient descent optimization algorithms,

    S. Ruder, “An overview of gradient descent optimization algorithms,” arXiv preprint arXiv:1609.04747, 2016

  68. [68]

    Learning the parts of objects by non-negative matrix factorization,

    D. D. Lee and H. S. Seung, “Learning the parts of objects by non-negative matrix factorization,” Nature, vol. 401, pp. 788–791, 1999

  69. [69]

    Machine learning for wireless networks with artificial intelligence: A tutorial,

    M. Chen, U. Challita, W. Saad, C. Yin, and M. Debbah, “Machine learning for wireless networks with artificial intelligence: A tutorial,” IEEE J. Sel. Areas Commun., vol. 37, no. 6, pp. 1225–1241, 2019

  70. [70]

    Communication-efficient learning of deep networks from decentralized data,

    H. B. McMahan, E. Moore, D. Ramage, and B. A. y Arcas, “Communication-efficient learning of deep networks from decentralized data,” arXiv preprint arXiv:1602.05629, 2016

  71. [71]

    Applications of deep reinforcement learning in communications and networking: A survey,

    N. C. Luong, D. T. Hoang, S. Gong, D. Niyato, P. Wang, Y. C. Liang, and D. I. Kim, “Applications of deep reinforcement learning in communications and networking: A survey,” IEEE Commun. Surv. Tutor., vol. 21, no. 4, pp. 3133–3174, 2019

  72. [72]

    Model-agnostic meta-learning for fast adaptation of deep networks,

    C. Finn, P. Abbeel, and S. Levine, “Model-agnostic meta-learning for fast adaptation of deep networks,” in Proc. ICML, 2017

  73. [73]

    Federated Optimization: Distributed Machine Learning for On-Device Intelligence

    J. Konečný, H. B. McMahan, F. X. Yu, P. Richtárik, A. T. Suresh, and D. Bacon, “Federated optimization: Distributed machine learning for on-device intelligence,” arXiv preprint arXiv:1610.02527, 2016

  74. [74]

    Asynchronous federated learning over wireless communication networks,

    Z. Wang, Z. Zhang, Y. Tian, Q. Yang, H. Shan, W. Wang, and T. Q. S. Quek, “Asynchronous federated learning over wireless communication networks,” IEEE Transactions on Wireless Communications, vol. 21, no. 9, pp. 6961–6978, 2022

  75. [75]

    Integrated sensing and communication in 6G: Motivations, use cases, requirements, challenges and future directions,

    D. K. P. Tan, J. He, Y. Li, A. Bayesteh, Y. Chen, P. Zhu, and W. Tong, “Integrated sensing and communication in 6G: Motivations, use cases, requirements, challenges and future directions,” in 2021 1st IEEE International Online Symposium on Joint Communications & Sensing (JC&S). IEEE, 2021, pp. 1–6

  76. [76]

    Federated Learning: Strategies for Improving Communication Efficiency

    J. Konečný, H. B. McMahan, F. X. Yu, P. Richtárik, A. T. Suresh, and D. Bacon, “Federated learning: Strategies for improving communication efficiency,” arXiv preprint arXiv:1610.05492, 2016

  77. [77]

    A bargaining game for personalized, energy efficient split learning over wireless networks,

    M. Kim, A. DeRieux, and W. Saad, “A bargaining game for personalized, energy efficient split learning over wireless networks,” in 2023 IEEE Wireless Communications and Networking Conference (WCNC). IEEE, 2023, pp. 1–6

  78. [78]

    Energy efficient probabilistic semantic communication over visible light networks,

    Z. Zhao, Z. Yang, M. Chen, Y. Hu, C. Zhu, and Z. Zhang, “Energy efficient probabilistic semantic communication over visible light networks,” in Proc. GLOBECOM 2025 - 2025 IEEE Global Commun. Conf., 2025, pp. 1105–1110

  79. [79]

    Communication-efficient multimodal split learning for mmWave received power prediction,

    Y. Koda, J. Park, M. Bennis, K. Yamamoto, T. Nishio, M. Morikura, and K. Nakashima, “Communication-efficient multimodal split learning for mmWave received power prediction,” IEEE Communications Letters, vol. 24, no. 6, pp. 1284–1288, 2020

  80. [80]

    Communication-efficient split learning based on analog communication and over the air aggregation,

    M. Krouka, A. Elgabli, C. ben Issaid, and M. Bennis, “Communication-efficient split learning based on analog communication and over the air aggregation,” in 2021 IEEE Global Communications Conference (GLOBECOM). IEEE, 2021, pp. 1–6
