Regularized Matrix Factorization Algorithm


A Regularized Matrix Factorization Algorithm is a matrix factorization algorithm that is implemented by a regularized matrix factorization system (to solve a regularized matrix factorization task).



References

2013

  • http://www.quuxlabs.com/blog/2010/09/matrix-factorization-a-simple-tutorial-and-implementation-in-python/#regularization
    • QUOTE: The above algorithm is a very basic algorithm for factorizing a matrix. There are a lot of methods to make things look more complicated. A common extension to this basic algorithm is to introduce regularization to avoid overfitting. This is done by adding a parameter [math]\displaystyle{ \beta }[/math] and modify the squared error as follows:

      : [math]\displaystyle{ e_{ij}^2 = (r_{ij} - \sum_{k=1}^K{p_{ik}q_{kj}})^2 + \frac{\beta}{2} \sum_{k=1}^K{(||P||^2 + ||Q||^2)} }[/math]

      In other words, the new parameter [math]\displaystyle{ \beta }[/math] is used to control the magnitudes of the user-feature and item-feature vectors such that P and Q would give a good approximation of R without having to contain large numbers. In practice, [math]\displaystyle{ \beta }[/math] is set to some values in the range of 0.02. The new update rules for this squared error can be obtained by a procedure similar to the one described above. The new update rules are as follows.

      : [math]\displaystyle{ p'_{ik} = p_{ik} + \alpha \frac{\partial}{\partial p_{ik}}e_{ij}^2 = p_{ik} + \alpha(2 e_{ij} q_{kj} - \beta p_{ik} ) }[/math]

      : [math]\displaystyle{ q'_{kj} = q_{kj} + \alpha \frac{\partial}{\partial q_{kj}}e_{ij}^2 = q_{kj} + \alpha(2 e_{ij} p_{ik} - \beta q_{kj} ) }[/math]
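
The quoted update rules translate directly into a stochastic gradient descent loop over the observed entries of R. Below is a minimal Python sketch of such a loop, assuming NumPy and a small toy rating matrix in which 0 marks a missing entry; the function name matrix_factorization and the default hyperparameter values are illustrative assumptions rather than details taken from the source (apart from beta = 0.02, which the quote mentions as a typical value).

```python
import numpy as np

def matrix_factorization(R, K=2, steps=5000, alpha=0.0002, beta=0.02):
    """Minimal sketch of regularized matrix factorization via SGD.

    R     : rating matrix; 0 marks a missing entry (assumption for this sketch)
    K     : number of latent features
    alpha : learning rate
    beta  : regularization parameter, as in the quoted update rules
    """
    num_users, num_items = R.shape
    P = np.random.rand(num_users, K)   # user-feature matrix (rows are p_i)
    Q = np.random.rand(num_items, K)   # item-feature matrix (rows are q_j,
                                       # i.e. the transpose of Q in the formulas)

    for _ in range(steps):
        for i in range(num_users):
            for j in range(num_items):
                if R[i, j] > 0:        # update only on observed ratings
                    e_ij = R[i, j] - P[i, :] @ Q[j, :]
                    # Regularized update rules from the quote:
                    #   p'_ik = p_ik + alpha * (2 * e_ij * q_kj - beta * p_ik)
                    #   q'_kj = q_kj + alpha * (2 * e_ij * p_ik - beta * q_kj)
                    P[i, :] += alpha * (2 * e_ij * Q[j, :] - beta * P[i, :])
                    Q[j, :] += alpha * (2 * e_ij * P[i, :] - beta * Q[j, :])
    return P, Q

# Example usage with a hypothetical toy rating matrix:
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
P, Q = matrix_factorization(R, K=2)
approx_R = P @ Q.T   # low-rank, regularized approximation of R
```

Because Q here stores item-feature vectors as rows, the reconstruction is P @ Q.T; the regularization term shrinks both factor matrices toward small magnitudes, which is what keeps the approximation from overfitting the observed entries.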

2008