pith. machine review for the scientific record.

Split learning for health: Distributed deep learning without sharing raw patient data

13 Pith papers cite this work. Polarity classification is still indexing.
abstract

Can health entities collaboratively train deep learning models without sharing sensitive raw data? This paper proposes several configurations of a distributed deep learning method called SplitNN to facilitate such collaborations. SplitNN shares neither raw data nor model details with collaborating institutions. The proposed configurations of SplitNN cater to practical settings of i) entities holding different modalities of patient data, ii) centralized and local health entities collaborating on multiple tasks, and iii) learning without sharing labels. We compare the performance and resource-efficiency trade-offs of SplitNN against other distributed deep learning methods, such as federated learning and large-batch synchronous stochastic gradient descent, and show highly encouraging results for SplitNN.
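The core mechanism the abstract describes can be illustrated in a few lines: the client keeps the raw data and the lower layers of the network, sends only the cut-layer activations ("smashed data") to the server, and receives back only the gradient at that cut. A minimal NumPy sketch under assumed toy dimensions (a 4-feature input, a 3-unit cut layer, binary labels held by the server); the layer sizes, learning rate, and helper names are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Client side: holds the raw data and the lower layers. ---
X = rng.normal(size=(8, 4))                      # raw patient features (never leave the client)
W_client = rng.normal(scale=0.1, size=(4, 3))    # client's portion of the model

def client_forward(X):
    # Only these cut-layer activations cross the network, not X itself.
    return np.tanh(X @ W_client)

# --- Server side: holds the upper layers and, in this configuration, the labels. ---
y = rng.integers(0, 2, size=(8, 1)).astype(float)
W_server = rng.normal(scale=0.1, size=(3, 1))

def server_step(H, y, lr=0.1):
    """Forward through the upper layers, update them, and return
    the gradient at the cut layer for the client to continue backprop."""
    global W_server
    p = 1.0 / (1.0 + np.exp(-(H @ W_server)))    # sigmoid output
    grad_out = (p - y) / len(y)                  # dLoss/dlogits for binary cross-entropy
    grad_H = grad_out @ W_server.T               # gradient sent back to the client
    W_server -= lr * (H.T @ grad_out)
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    return grad_H, loss

def client_step(X, grad_H, lr=0.1):
    """Backprop the received cut-layer gradient through the client's layers."""
    global W_client
    pre = X @ W_client
    grad_pre = grad_H * (1.0 - np.tanh(pre) ** 2)   # through the tanh nonlinearity
    W_client -= lr * (X.T @ grad_pre)

losses = []
for _ in range(200):
    H = client_forward(X)          # client -> server: activations only
    grad_H, loss = server_step(H, y)
    client_step(X, grad_H)         # server -> client: gradient only
    losses.append(loss)
```

The label-sharing-free configuration mentioned in the abstract extends this by wrapping the network back to the client (a "U-shaped" split) so the server never sees `y` either; that variant is omitted here for brevity.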

hub tools

years: 2026 (13)

verdicts: unverdicted (13)

representative citing papers

Networked Information Aggregation for Binary Classification

cs.LG · 2026-05-01 · unverdicted · novelty 6.0

Sequential prediction passing on DAGs for logistic regression yields O(M/√D) excess loss when M-agent windows cover all features, with an Ω(k/D) lower bound identifying depth as the fundamental limit.

Secure and Privacy-Preserving Vertical Federated Learning

cs.CR · 2026-04-15 · unverdicted · novelty 5.0

Three optimized MPC protocols for privacy-preserving vertical federated learning that support global and global-local updates while reducing computation versus naive full-MPC delegation.

citing papers explorer

Showing 13 of 13 citing papers.