Saturday, January 31, 2026

Self-Supervised Learning with Gaussian Processes


Self-supervised learning (SSL) is a machine learning paradigm in which models learn the underlying structure of data without explicit supervision from labeled samples. The representations acquired through SSL have proven useful for many downstream tasks, including clustering and linear classification. To ensure smoothness of the representation space, most SSL methods rely on the ability to generate pairs of observations that are similar to a given instance. However, generating these pairs can be difficult for many types of data. Moreover, these methods lack uncertainty quantification and can perform poorly in out-of-sample prediction settings. To address these limitations, we propose Gaussian process self-supervised learning (GPSSL), a novel approach that uses Gaussian process (GP) models for representation learning. GP priors are placed on the representations, and we obtain a generalized Bayesian posterior by minimizing a loss function that encourages informative representations. The covariance function inherent in GPs naturally pulls the representations of similar units together, serving as an alternative to explicitly defined positive samples. We show that GPSSL is closely related to both kernel PCA and VICReg, a popular neural network-based SSL method, but unlike both it yields posterior uncertainties that can be propagated to downstream tasks. Experiments on various datasets, covering classification and regression tasks, demonstrate that GPSSL outperforms traditional methods in terms of accuracy, uncertainty quantification, and error control.
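To make the idea concrete, here is a minimal sketch of an objective in the spirit of GPSSL. The abstract does not specify the actual loss, kernel, or optimizer, so everything below is illustrative: an RBF kernel over the inputs plays the role of the GP prior (its covariance pulls representations of similar inputs together), and VICReg-style variance and covariance penalties stand in for the "informative representations" term.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    # Squared-exponential kernel: similar inputs receive high covariance,
    # so the GP prior term below couples their representations.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def gpssl_loss(Z, K_inv, var_weight=1.0, cov_weight=1.0):
    # Hypothetical generalized-Bayes objective (not the paper's exact loss):
    #   prior   -- negative log GP prior on each representation dimension
    #              (up to constants), which smooths Z over similar inputs;
    #   var_pen -- hinge penalty keeping per-dimension std near 1 (VICReg-like);
    #   cov_pen -- off-diagonal covariance energy, discouraging redundant dims.
    n, d = Z.shape
    prior = np.trace(Z.T @ K_inv @ Z)
    std = np.sqrt(Z.var(axis=0) + 1e-6)
    var_pen = np.mean(np.maximum(0.0, 1.0 - std))
    Zc = Z - Z.mean(axis=0)
    C = (Zc.T @ Zc) / (n - 1)
    cov_pen = (np.sum(C**2) - np.sum(np.diag(C)**2)) / d
    return prior + var_weight * var_pen + cov_weight * cov_pen

# Usage: evaluate the loss for random 2-D representations of 50 inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K = rbf_kernel(X) + 1e-4 * np.eye(50)   # jitter for numerical stability
K_inv = np.linalg.inv(K)
Z = rng.normal(size=(50, 2))
loss = gpssl_loss(Z, K_inv)
```

In a full method one would minimize this over `Z` (or over parameters of a map producing `Z`), and the GP machinery would additionally provide posterior uncertainties over representations of new inputs; none of that is shown here.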
