Margin Infused Relaxed Algorithm

A Margin Infused Relaxed Algorithm (MIRA) is a Model-based Supervised Classification Algorithm that maintains one prototype vector per class and, on each online round, updates only the prototypes whose similarity score is at least that of the correct label's prototype, choosing the smallest-norm set of updated prototypes that satisfies the resulting margin constraints (Crammer & Singer, 2001).
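As a sketch of the per-round update described in the reference below (the notation $\bar{M}_r$ for the prototype of class $r$, $\bar{x}$ for the current instance, $y$ for its label, and $\tau_r$ for the update coefficients is chosen here for illustration, not taken verbatim from the paper), MIRA's update can be written as a small constrained optimization of roughly the form

    \min_{\tau_1,\dots,\tau_k} \; \tfrac{1}{2} \sum_{r=1}^{k} \left\| \bar{M}_r + \tau_r \bar{x} \right\|^2
    \quad \text{subject to} \quad \tau_r \le \delta_{r,y} \ (r = 1,\dots,k), \qquad \sum_{r=1}^{k} \tau_r = 0,

where $\delta_{r,y}$ is 1 if $r = y$ and 0 otherwise, and each prototype is then set to $\bar{M}_r + \tau_r \bar{x}$. The constraints allow $\tau_y \le 1$ for the correct label and force $\tau_r \le 0$ otherwise, so only prototypes that score at least as high as the correct label's prototype are moved, which is the ultraconservative property discussed in the abstract below.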



References

2001

  • (Crammer & Singer, 2001) ⇒ Koby Crammer, and Yoram Singer. (2001). “Ultraconservative Online Algorithms for Multiclass Problems.” In: Proceedings of the Fourteenth Annual Conference on Computational Learning Theory (COLT 2001). doi:10.1007/3-540-44581-1_7
    • Abstract: In this paper we study online classification algorithms for multiclass problems in the mistake bound model. The hypotheses we use maintain one prototype vector per class. Given an input instance, a multiclass hypothesis computes a similarity-score between each prototype and the input instance and then sets the predicted label to be the index of the prototype achieving the highest similarity. To design and analyze the learning algorithms in this paper we introduce the notion of ultraconservativeness. Ultraconservative algorithms are algorithms that update only the prototypes attaining similarity-scores which are higher than the score of the correct label’s prototype. We start by describing a family of additive ultraconservative algorithms where each algorithm in the family updates its prototypes by finding a feasible solution for a set of linear constraints that depend on the instantaneous similarity-scores. We then discuss a specific online algorithm that seeks a set of prototypes which have a small norm. The resulting algorithm, which we term MIRA (for Margin Infused Relaxed Algorithm) is ultraconservative as well. We derive mistake bounds for all the algorithms and provide further analysis of MIRA using a generalized notion of the margin for multiclass problems.
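For illustration, the sketch below implements a simplified, single-constraint MIRA-style update in Python: it keeps one prototype vector per class and, when the margin between the correct prototype and the highest-scoring competitor is violated, applies the smallest step (capped at 1) that would restore a unit margin. The function names (mira_train, mira_predict), the unit-margin target, and the step-size cap are assumptions made for this sketch; the paper's full algorithm instead solves the multi-prototype quadratic program over all violating classes.

  import numpy as np

  def mira_train(X, y, n_classes, n_epochs=5):
      # One prototype (weight) vector per class, as in the abstract above.
      W = np.zeros((n_classes, X.shape[1]))
      for _ in range(n_epochs):
          for x, label in zip(X, y):
              scores = W @ x  # similarity score of each prototype for this instance
              # Highest-scoring *wrong* prototype (mask out the correct label).
              wrong = int(np.argmax(np.where(np.arange(n_classes) == label, -np.inf, scores)))
              margin = scores[label] - scores[wrong]
              if margin <= 1.0:  # margin violated (includes outright mistakes)
                  # Smallest step restoring a unit margin, capped at 1 to keep
                  # prototype norms small (the "relaxed" margin-infused update).
                  tau = min(1.0, (1.0 - margin) / (2.0 * float(x @ x) + 1e-12))
                  W[label] += tau * x   # pull the correct prototype toward x
                  W[wrong] -= tau * x   # push the offending prototype away from x
      return W

  def mira_predict(W, x):
      # Predicted label = index of the prototype with the highest similarity score.
      return int(np.argmax(W @ x))

Usage, assuming X is a 2-D NumPy array of feature vectors and y an integer label array: W = mira_train(X, y, n_classes=3), after which mira_predict(W, X[0]) returns the predicted class index for the first instance.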