
Osmotic Learning: A Self-Supervised Paradigm for Decentralized Contextual Data Representation

Dec 28, 2025 · 9:12
Machine Learning · Distributed, Parallel, and Cluster Computing

Abstract

Data within a specific context gains deeper significance beyond its isolated interpretation. In distributed systems, interdependent data sources reveal hidden relationships and latent structures, representing valuable information for many applications. This paper introduces Osmotic Learning (OSM-L), a self-supervised distributed learning paradigm designed to uncover higher-level latent knowledge from distributed data. The core of OSM-L is osmosis, a process that synthesizes dense and compact representations by extracting contextual information, eliminating the need for raw data exchange between distributed entities. OSM-L iteratively aligns local data representations, enabling information diffusion and convergence into a dynamic equilibrium that captures contextual patterns. During training, it also identifies correlated data groups, functioning as a decentralized clustering mechanism. Experimental results confirm OSM-L's convergence and representation capabilities on structured datasets, achieving over 0.99 accuracy in local information alignment while preserving contextual integrity.
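The abstract's central idea — nodes iteratively aligning local representations by exchanging embeddings rather than raw data — can be illustrated with a minimal sketch. The following toy example is an assumption-laden illustration, not the paper's actual method: each node holds private data and a local linear encoder, shares only a mean embedding, and nudges its encoder toward the group consensus each round until the embeddings converge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes and names are illustrative, not from the paper):
# each node holds a private data matrix and a local linear encoder.
n_nodes, d_in, d_out = 4, 8, 3
data = [rng.normal(size=(32, d_in)) for _ in range(n_nodes)]
weights = [rng.normal(size=(d_in, d_out)) for _ in range(n_nodes)]

def embedding(i):
    """Mean embedding of node i's private data (the only thing shared)."""
    return data[i].mean(axis=0) @ weights[i]

def alignment_spread():
    """Max distance of any node's embedding from the group center."""
    zs = [embedding(i) for i in range(n_nodes)]
    center = np.mean(zs, axis=0)
    return max(np.linalg.norm(z - center) for z in zs)

before = alignment_spread()
lr = 0.05
for _ in range(200):
    zs = [embedding(i) for i in range(n_nodes)]
    consensus = np.mean(zs, axis=0)        # "diffusion": only embeddings move
    for i in range(n_nodes):
        m = data[i].mean(axis=0)           # local statistic, never shared
        # Gradient (up to a factor of 2) of ||m @ W - consensus||^2 w.r.t. W.
        grad = np.outer(m, zs[i] - consensus)
        weights[i] -= lr * grad
after = alignment_spread()
print(f"spread before: {before:.4f}, after: {after:.4f}")
```

Each update contracts a node's deviation from the consensus, so the spread shrinks toward a shared equilibrium — a crude analogue of the osmotic alignment the abstract describes, with no raw data crossing node boundaries.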


Cite This Paper

Year: 2025
Category: cs.LG
APA

Colosi, M., Farahani, R., Fazio, M., Prodan, R., & Villari, M. (2025). Osmotic Learning: A Self-Supervised Paradigm for Decentralized Contextual Data Representation. arXiv preprint arXiv:2512.23096.

MLA

Colosi, Mario, Reza Farahani, Maria Fazio, Radu Prodan, and Massimo Villari. "Osmotic Learning: A Self-Supervised Paradigm for Decentralized Contextual Data Representation." arXiv preprint arXiv:2512.23096 (2025).