Self-Supervised Learning Task

A self-supervised learning task is a semi-supervised learning task that uses a labeling heuristic (a pretext task) to derive pseudo-labels from otherwise unlabeled examples.
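
A minimal sketch of such a labeling heuristic, assuming a masked-prediction pretext task on token sequences; the function name, masking rate, and toy corpus below are illustrative and not drawn from the cited sources:

```python
import random

def make_pseudo_labeled_examples(sequences, mask_token="[MASK]", mask_prob=0.15, seed=1):
    """Labeling heuristic: turn unlabeled token sequences into (input, pseudo-label) pairs
    by masking random positions; the hidden token becomes the supervision target."""
    rng = random.Random(seed)
    examples = []
    for tokens in sequences:
        masked = list(tokens)
        targets = {}
        for i, tok in enumerate(tokens):
            if rng.random() < mask_prob:
                masked[i] = mask_token
                targets[i] = tok  # pseudo-label derived from the data itself
        if targets:  # keep only sequences where at least one position was masked
            examples.append((masked, targets))
    return examples

# Unlabeled corpus -> pseudo-labeled training pairs, with no human annotation.
corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["self", "supervision", "derives", "labels", "from", "data"]]
print(make_pseudo_labeled_examples(corpus))
```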



References

2023

  • (Wikipedia, 2023) ⇒ https://en.wikipedia.org/wiki/Self-supervised_learning Retrieved:2023-10-16.
    • Self-supervised learning (SSL) is a machine learning paradigm in which a model generates its own supervisory signal from unlabeled data, rather than relying on externally provided labels. Self-supervised learning more closely imitates the way humans learn to classify objects.

      The typical SSL method is based on an artificial neural network or other model such as a decision list. The model learns in two steps. First, an auxiliary or pretext classification task is solved using pseudo-labels, which help to initialize the model parameters. Second, the actual task is performed with supervised or unsupervised learning. Other auxiliary tasks involve pattern completion from masked input patterns (silent pauses in speech, or image portions masked in black). Self-supervised learning has produced promising results in recent years, has found practical application in audio processing, and is being used by Facebook and others for speech recognition.
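The two-step procedure described above can be sketched as follows. This is a minimal illustrative example, assuming a masked-reconstruction pretext task on random feature vectors; the architecture, tensor shapes, masking ratio, and training loop are assumptions for illustration, not the method of any cited work:

```python
import torch
import torch.nn as nn

# Step 1: pretext task -- reconstruct randomly masked portions of the input,
#         using the original values as pseudo-labels, to initialize the encoder.
# Step 2: actual task -- reuse the pretrained encoder with a small supervised head.

encoder = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64))
decoder = nn.Linear(64, 32)          # pretext head: reconstruct the masked input
classifier = nn.Linear(64, 10)       # downstream head: 10-class classification

x_unlabeled = torch.randn(256, 32)   # plentiful unlabeled data
x_labeled = torch.randn(64, 32)      # scarce labeled data
y_labeled = torch.randint(0, 10, (64,))

# --- Step 1: pretext pretraining on unlabeled data ---
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
for _ in range(100):
    mask = (torch.rand_like(x_unlabeled) < 0.3).float()
    corrupted = x_unlabeled * (1 - mask)                           # mask out ~30% of features
    recon = decoder(encoder(corrupted))
    loss = ((recon - x_unlabeled) ** 2 * mask).sum() / mask.sum()  # score only masked positions
    opt.zero_grad(); loss.backward(); opt.step()

# --- Step 2: supervised fine-tuning on the actual task ---
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)
ce = nn.CrossEntropyLoss()
for _ in range(100):
    logits = classifier(encoder(x_labeled))
    loss = ce(logits, y_labeled)
    opt.zero_grad(); loss.backward(); opt.step()
```

The design choice here is that the pretext loss is computed only on the masked positions, so the encoder is rewarded for inferring hidden content from visible context; the downstream classifier then starts from those pretrained encoder weights rather than from random initialization.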

2001

  • (Wu et al., 2001) ⇒ Y. Wu, T. S. Huang, and K. Toyama. (2001). “Self-Supervised Learning for Object Recognition Based on Kernel Discriminant-EM Algorithm.” In: Proceedings of the IEEE International Conference on Computer Vision.

1995

  • (Dayan et al., 1995) ⇒ P. Dayan, Geoffrey E. Hinton, R. M. Neal, and R. S. Zemel. (1995). “The Helmholtz Machine.” In: Neural Computation, 7(5).