Definition
A learning approach where the training data provides its own supervision by leveraging the underlying structure in the data.
Detailed Explanation
Self-supervised learning creates supervised learning tasks from unlabeled data by exploiting relationships that occur naturally within the data itself. Common approaches include pretext tasks (such as predicting image rotations), contrastive learning (pulling representations of related examples together and pushing unrelated ones apart), and masked prediction (hiding part of the input and training the model to reconstruct it). All of these learn meaningful representations without explicit human-provided labels.
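To make the idea concrete, here is a minimal sketch of the masked-prediction setup: from a single unlabeled token sequence, we generate (input, label) training pairs where the label is the hidden token itself. The function name and tokenization are illustrative, not from any specific library.

```python
def make_masked_examples(tokens, mask_token="[MASK]"):
    """Create (input, label) pairs from an unlabeled token sequence.

    Each example masks one position; the label is the original token
    at that position, so the supervision signal comes from the data
    itself rather than from human annotation.
    """
    examples = []
    for i in range(len(tokens)):
        masked = list(tokens)
        masked[i] = mask_token       # hide one token
        examples.append((masked, tokens[i]))  # label = hidden token
    return examples

sentence = "the cat sat on the mat".split()
pairs = make_masked_examples(sentence)
inp, label = pairs[1]
print(inp)    # ['the', '[MASK]', 'sat', 'on', 'the', 'mat']
print(label)  # cat
```

A real masked language model (e.g., the pre-training objective behind large language models) applies the same trick at scale, masking a random subset of tokens in each batch instead of enumerating every position.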
Use Cases
Pre-training large language models on internet text; learning visual representations from video sequences; predicting protein structure from amino acid sequences.