Non-Linear Programming Algorithm

***** [[Signomial]] — similar to polynomials, but exponents need not be integers
***** [[Posynomial]] — a signomial with positive coefficients
**** [[Quadratically constrained quadratic program]].
**** [[Linear-fractional programming]] — objective is a ratio of linear functions, constraints are linear
***** [[Fractional programming]] — objective is a ratio of nonlinear functions, constraints are linear
**** [[Nonlinear complementarity problem]] (NCP) — find ''x'' such that ''x'' &ge; 0, ''f''(''x'') &ge; 0 and ''x''<sup>T</sup> ''f''(''x'') = 0
**** [[Least squares]] — the objective function is a sum of squares
***** [[Non-linear least squares]].
***** [[Gauss–Newton algorithm]] — see the sketch after this list
****** [[BHHH algorithm]] — variant of Gauss–Newton in econometrics
****** [[Generalized Gauss–Newton method]] — for constrained nonlinear least-squares problems
***** [[Levenberg–Marquardt algorithm]].
***** [[Iteratively reweighted least squares]] (IRLS) — solves a weighted least-squares problem at every iteration
***** [[Partial least squares]] — statistical techniques similar to principal components analysis
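The least-squares entries above share one structure: the objective is a sum of squared residuals. As a hedged illustration, the following is a minimal sketch of a Gauss–Newton iteration for an assumed exponential-decay model ''y'' = ''a''·exp(−''b''·''t''); the model, the NumPy usage, and the iteration count are illustrative choices, not part of the source taxonomy.

<syntaxhighlight lang="python">
import numpy as np

def gauss_newton(t, y, a, b, iters=20):
    # Minimize sum_i (y_i - a*exp(-b*t_i))^2 by repeatedly linearizing
    # the residual vector and solving the resulting linear least-squares
    # problem J @ step = -r for the parameter update.
    for _ in range(iters):
        r = y - a * np.exp(-b * t)       # residuals at current (a, b)
        J = np.column_stack([
            -np.exp(-b * t),             # dr/da
            a * t * np.exp(-b * t),      # dr/db
        ])                               # Jacobian of the residuals
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        a, b = a + step[0], b + step[1]
    return a, b
</syntaxhighlight>

The Levenberg–Marquardt algorithm listed above differs mainly in damping this step (solving (''J''<sup>T</sup>''J'' + ''&lambda;I'')·step = −''J''<sup>T</sup>''r''), which trades speed near the solution for robustness far from it.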
**** [[Mathematical programming with equilibrium constraints]] — constraints include variational inequalities or complementarities
**** Univariate optimization:
***** [[Golden section search]] — sketched after this list
***** [[Successive parabolic interpolation]] — based on quadratic interpolation through the last three iterates
*** General algorithms:
**** Concepts:
***** [[Descent direction]].
***** [[Guess value]] — the initial guess for a solution with which an algorithm starts
***** [[Line search]] — illustrated in the gradient-descent sketch after this list
****** [[Backtracking line search]].
****** [[Wolfe conditions]].
**** [[Gradient method]] — method that uses the gradient as the search direction
***** [[Gradient descent]] — sketched after this list
****** [[Stochastic gradient descent]].
***** [[Landweber iteration]] — mainly used for ill-posed problems
**** [[Successive linear programming]] (SLP) — replace the problem by a linear programming problem, solve that, and repeat
**** [[Sequential quadratic programming]] (SQP) — replace the problem by a quadratic programming problem, solve that, and repeat
**** [[Newton's method in optimization]] — sketched after this list
***** See also under ''Newton algorithm'' in the [[#Finding roots of nonlinear equations|section ''Finding roots of nonlinear equations'']].
**** [[Nonlinear conjugate gradient method]].
**** Derivative-free methods:
***** [[Coordinate descent]] — move in one of the coordinate directions; sketched after this list
****** [[Adaptive coordinate descent]] — adapts the coordinate directions to the objective function
****** [[Random coordinate descent]] — randomized version
***** [[Nelder–Mead method]].
***** [[Pattern search (optimization)]].
***** [[Powell's method]] — derivative-free, based on conjugate directions
***** [[Rosenbrock methods]] — derivative-free methods, similar to Nelder–Mead but with guaranteed convergence
**** [[Augmented Lagrangian method]] — replaces a constrained problem by an unconstrained problem with a term added to the objective function
**** [[Ternary search]].
**** [[Tabu search]].
**** [[Guided Local Search]] — modification of search algorithms which builds up penalties during a search
**** [[Reactive search optimization]] (RSO) — the algorithm adapts its parameters automatically
**** [[MM algorithm]] — majorize-minimization, a broad framework of methods
**** [[Least absolute deviations]].
***** [[Expectation–maximization algorithm]].
****** [[Ordered subset expectation maximization]].
**** [[Adaptive projected subgradient method]].
**** [[Nearest neighbor search]].
**** [[Space mapping]] — uses "coarse" (ideal or low-fidelity) and "fine" (practical or high-fidelity) models
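Several entries above are simple enough to sketch compactly. For the univariate bracket methods, the following is a minimal sketch of [[Golden section search]], assuming a unimodal objective; the tolerance and test function are illustrative choices.

<syntaxhighlight lang="python">
import math

def golden_section_search(f, a, b, tol=1e-8):
    # Maintain a bracket [a, b] around the minimum of a unimodal f and
    # shrink it by the golden ratio each iteration, reusing one interior
    # function value per step.
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi ~ 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while abs(b - a) > tol:
        if fc < fd:                        # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                              # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2.0

print(golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0))  # ~2.0
</syntaxhighlight>

[[Ternary search]] above works on the same bracketing principle but discards a fixed third of the interval per step instead of the golden-ratio fraction.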
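The next sketch ties together several of the concepts listed above: the negative gradient as a [[Descent direction]], and a [[Backtracking line search]] enforcing the sufficient-decrease (Armijo) half of the [[Wolfe conditions]]. The constants <code>c</code> and <code>rho</code> are conventional defaults, not values prescribed by the source.

<syntaxhighlight lang="python">
import numpy as np

def gradient_descent(f, grad, x0, iters=100, c=1e-4, rho=0.5):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        d = -g                             # a descent direction
        alpha = 1.0
        # Backtracking line search: shrink alpha until the Armijo
        # sufficient-decrease condition holds.
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d
    return x

# e.g. minimize f(x) = x0^2 + 10*x1^2 from (3, 1); converges toward (0, 0)
print(gradient_descent(lambda x: x[0] ** 2 + 10 * x[1] ** 2,
                       lambda x: np.array([2 * x[0], 20 * x[1]]),
                       [3.0, 1.0]))
</syntaxhighlight>

A full Wolfe line search would additionally impose a curvature condition on the accepted step; backtracking alone is the common lightweight variant.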
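For [[Newton's method in optimization]], this is a minimal sketch of the undamped iteration, assuming callable gradient and Hessian; practical implementations add the globalization safeguards listed above (a line search or a trust region).

<syntaxhighlight lang="python">
import numpy as np

def newton_minimize(grad, hess, x0, iters=20):
    # Step to the minimizer of the local quadratic model:
    # solve H(x) p = -g(x), then update x <- x + p.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        p = np.linalg.solve(hess(x), -grad(x))
        x = x + p
    return x
</syntaxhighlight>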
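For the derivative-free group, a crude sketch of cyclic [[Coordinate descent]] with a fixed-step probe; this variant is closer in spirit to pattern search along the axes than to exact per-coordinate minimization, and the step sizes are arbitrary illustrative choices.

<syntaxhighlight lang="python">
def coordinate_descent(f, x0, step=0.25, shrink=0.5, sweeps=50):
    # Sweep the coordinate axes cyclically; keep any move that lowers f,
    # and halve the probe step whenever a full sweep makes no progress.
    x = list(x0)
    fx = f(x)
    for _ in range(sweeps):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink
    return x

print(coordinate_descent(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                         [0.0, 0.0]))  # approaches (1, -2)
</syntaxhighlight>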



A Non-Linear Programming Algorithm is a continuous optimization algorithm that can be implemented by a non-linear programming system (to solve a non-linear programming task).
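As a concrete, hedged illustration of such a task, the following solves a small constrained nonlinear program with SciPy's <code>scipy.optimize.minimize</code> (an assumed off-the-shelf solver; this page does not name any particular system), using its SLSQP backend, a sequential quadratic programming method from the list above.

<syntaxhighlight lang="python">
from scipy.optimize import minimize

# Non-linear programming task: minimize a nonlinear objective subject to
# a nonlinear inequality constraint (stay inside a disk of radius 2).
result = minimize(
    fun=lambda x: (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2,
    x0=[0.0, 0.0],
    method="SLSQP",  # sequential quadratic programming
    constraints=[{"type": "ineq", "fun": lambda x: 4 - x[0] ** 2 - x[1] ** 2}],
)
print(result.x)      # ~ (0.74, 1.86), the projection of (1, 2.5) onto the disk
</syntaxhighlight>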


