# Limited-Memory Quasi-Newton Algorithm


A Limited-Memory Quasi-Newton Algorithm is a quasi-Newton algorithm that uses a limited-memory variation of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update to approximate the inverse Hessian matrix.

## References

### 2012

• http://en.wikipedia.org/wiki/L-BFGS
• QUOTE: The limited-memory BFGS (L-BFGS or LM-BFGS) algorithm is a member of the broad family of quasi-Newton optimization methods that uses a limited-memory variation of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update to approximate the inverse Hessian matrix (denoted by $H_k$). Unlike the original BFGS method, which stores a dense $n \times n$ approximation, L-BFGS stores only a few vectors that represent the approximation implicitly. Due to its moderate memory requirement, the L-BFGS method is particularly well suited for optimization problems with a large number of variables. L-BFGS never explicitly forms or stores $H_k$. Instead, it maintains a history of the past $m$ updates of the position $x$ and gradient $\nabla f(x)$, where generally the history size $m$ can be small, often less than 10. These updates are used to implicitly carry out operations requiring the $H_k$-vector product. While, strictly, a straightforward BFGS implementation at the $i$-th iteration would represent the inverse Hessian approximation as informed by all updates on $0 \ldots i-1$, L-BFGS does quite well using updates from only the most recent iterations $i-m \ldots i-1$.
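• The implicit $H_k$-vector product described above is commonly computed with the so-called two-loop recursion. The sketch below, a minimal NumPy illustration (the function name `lbfgs_two_loop` and the choice of the standard $\gamma_k = s^\top y / y^\top y$ initial scaling are illustrative assumptions, not part of the quoted text), shows how the stored pairs $(s_i, y_i) = (x_{i+1} - x_i,\ \nabla f(x_{i+1}) - \nabla f(x_i))$ yield $H_k \nabla f(x)$ without ever forming the $n \times n$ matrix:

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list):
    """Compute H_k @ grad implicitly via the L-BFGS two-loop recursion.

    s_list, y_list hold the m most recent position and gradient
    differences, ordered oldest to newest. Only O(m*n) storage is used.
    """
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    q = grad.copy()
    alphas = []
    # First loop: walk the history newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)  # stored newest-first
    # Initial inverse-Hessian scaling (a common heuristic choice).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: walk the history oldest to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return r  # approximates H_k @ grad
```

With an empty history the recursion reduces to the identity (steepest descent), and each stored pair enforces the secant condition $H_k y_{i} = s_{i}$ for the most recent update, which is what makes the few stored vectors an effective implicit approximation.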