The goal of contrastive representation learning is to learn an embedding space in which similar sample pairs stay close to each other while dissimilar ones are far apart. Contrastive learning can be applied to both supervised and unsupervised settings. When working with unsupervised data, contrastive learning is one of the most powerful approaches in self-supervised learning.
Contrastive Training Objectives

In early versions of loss functions for contrastive learning, only one positive and one negative sample are involved.
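As a minimal sketch of this idea, the classic pairwise contrastive loss penalizes the squared distance between a positive pair and pushes a negative pair apart up to a margin. The embeddings and the `margin` value below are hypothetical toy choices, not from the original post:

```python
import numpy as np

def contrastive_loss(x1, x2, is_positive, margin=1.0):
    # Euclidean distance between the two embeddings.
    d = np.linalg.norm(x1 - x2)
    if is_positive:
        # Positive pair: pull the embeddings together.
        return d ** 2
    # Negative pair: push apart, but only while closer than the margin.
    return max(0.0, margin - d) ** 2

# Toy 2-D embeddings (hypothetical).
a = np.array([0.0, 0.0])
b = np.array([0.3, 0.4])  # distance 0.5 from a
c = np.array([3.0, 4.0])  # distance 5.0 from a

pos = contrastive_loss(a, b, is_positive=True)       # 0.25
neg_near = contrastive_loss(a, b, is_positive=False)  # (1.0 - 0.5)**2 = 0.25
neg_far = contrastive_loss(a, c, is_positive=False)   # 0.0, already past margin
```

Note how a negative pair that is already farther apart than the margin contributes zero loss, so the model stops pushing it away.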
The performance of supervised learning tasks improves with more high-quality labels available. However, it is expensive to collect a large amount of labeled data.