L1-Regularized Logistic Regression Algorithm
An L1-Regularized Logistic Regression Algorithm is a Logistic Regression Algorithm that uses L1-norm regularization.
- Counter-Example(s):
  - L2-Regularized Logistic Regression Algorithm.
- See: L1-Norm.
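As a minimal illustration (not part of the original page), the sketch below fits an L1-regularized logistic regression model with scikit-learn; the synthetic dataset, the liblinear solver, and the regularization strength C=1.0 are illustrative assumptions rather than choices prescribed by the page.

```python
# Minimal sketch: L1-regularized logistic regression with scikit-learn.
# The objective minimized is (up to parameterization):
#   sum_i log(1 + exp(-y_i * w^T x_i)) + lambda * ||w||_1
# The dataset, solver, and C (inverse regularization strength) are illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X, y)

# The L1 penalty drives many coefficients exactly to zero,
# so the fitted model also performs feature selection.
print("non-zero coefficients:", int((clf.coef_ != 0).sum()))
```

In contrast to the L2 penalty, the L1 penalty yields sparse coefficient vectors, which is why L1-regularized logistic regression is often used for feature selection.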
References
2008
- (Murphy, 2008) ⇒ Kevin P. Murphy. (2008). “L1 Regularization." CS540 Machine learning, Lecture 13.
2006
- (Lee et al., 2006) ⇒ Su-In Lee, Honglak Lee, Pieter Abbeel, and Andrew Y. Ng. (2006). “Efficient L1 Regularized Logistic Regression.” In: Proceedings of the 2006 AAAI Conference (AAAI 2006).
2004
- (Ng, 2004) ⇒ Andrew Y. Ng. (2004). “Feature Selection, L1 vs. L2 Regularization, and Rotational Invariance.” In: Proceedings of the Twenty-first International Conference on Machine Learning (ICML 2004). doi:10.1145/1015330.1015435