# Metric Space Optimization Task

(Redirected from Cost Function Optimization Task)

A Metric Space Optimization Task is an optimization task that requires the identification of an optimal point in a metric space.

**AKA:** Continuous Function Optimization.

**Context:**
- It can be solved by a Numerical Optimization System (that applies a numerical optimization algorithm).
- It can (often) be a Function Fitting Task.

**Example(s):**
- a Least-Squares Regression Task.

**Counter-Example(s):**
- a Combinatorial Optimization Task, such as a Knapsack Problem.

**See:** Parameter Estimation, Least-Squares Regression.

## References

### 2012

- (Wikipedia, 2012) ⇒ http://en.wikipedia.org/wiki/Optimization_problem#Continuous_optimization_problem
- QUOTE: The *standard form* of a (continuous) optimization problem is: [math]\begin{align} &\underset{x}{\operatorname{minimize}}& & f(x) \\ &\operatorname{subject\;to} & &g_i(x) \leq 0, \quad i = 1,\dots,m \\ &&&h_i(x) = 0, \quad i = 1,\dots,p \end{align}[/math] where
    - [math]f(x): \mathbb{R}^n \to \mathbb{R}[/math] is the **objective function** to be minimized over the variable [math]x[/math],
    - [math]g_i(x) \leq 0[/math] are called **inequality constraints**, and
    - [math]h_i(x) = 0[/math] are called **equality constraints**.
- By convention, the standard form defines a **minimization problem**. A maximization problem can be treated by negating the objective function.

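The standard form can be illustrated with a small numerical sketch. The problem below is hypothetical (minimize [math]f(x) = (x-3)^2[/math] subject to [math]g(x) = x - 2 \leq 0[/math], whose constrained optimum is [math]x = 2[/math]), and projected gradient descent is just one simple method a numerical optimization system might apply:

```python
# Hypothetical standard-form problem:
#   minimize f(x) = (x - 3)^2   subject to   g(x) = x - 2 <= 0
# The feasible set is {x : x <= 2}, so the constrained optimum is x = 2
# (the unconstrained minimizer x = 3 is infeasible).
# Projected gradient descent: take a gradient step on f, then project
# the iterate back onto the feasible set.

def solve(x0=0.0, step=0.1, iters=1000):
    x = x0
    for _ in range(iters):
        x -= step * 2.0 * (x - 3.0)  # gradient step: f'(x) = 2(x - 3)
        x = min(x, 2.0)              # projection onto {x : g(x) <= 0}
    return x

print(solve())  # -> 2.0, the constrained optimum
```

A maximization problem would be handled the same way after negating the objective, as the quote above notes.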

### 2009

- (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Estimation_theory
- Estimation theory is a branch of statistics and signal processing that deals with estimating the values of parameters based on measured/empirical data. The parameters describe an underlying physical setting in such a way that the value of the parameters affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
For example, it is desired to estimate the proportion of a population of voters who will vote for a particular candidate. That proportion is the unobservable parameter; the estimate is based on a small random sample of voters.

- Or, for example, in radar the goal is to estimate the location of objects (airplanes, boats, etc.) by analyzing the received echo and a possible question to be posed is "where are the airplanes?" To answer where the airplanes are, it is necessary to estimate the distance the airplanes are at from the radar station, which can provide an absolute location if the absolute location of the radar station is known.
- In estimation theory, it is assumed that the desired information is embedded in a noisy signal. Noise adds uncertainty, without which the problem would be deterministic and estimation would not be needed.
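The noisy-signal setting described above can be sketched numerically. A classic textbook case (assumed here for illustration, not taken from the quoted article) is estimating an unknown constant level [math]A[/math] from measurements [math]x[n] = A + w[n][/math] corrupted by zero-mean Gaussian noise, using the sample mean as the estimator:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

A_true = 5.0    # the unknown parameter (hypothetical "true" level)
noise_sd = 1.0  # standard deviation of the additive Gaussian noise

# Measured data: x[n] = A + w[n], where w[n] ~ N(0, noise_sd^2)
samples = [A_true + random.gauss(0.0, noise_sd) for _ in range(10000)]

# Sample-mean estimator: averaging cancels the zero-mean noise
A_hat = sum(samples) / len(samples)
print(abs(A_hat - A_true) < 0.1)  # the estimate lands close to A_true
```

Without the noise the problem would indeed be deterministic: a single measurement would reveal [math]A[/math] exactly.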

- http://research.microsoft.com/en-us/um/people/zhang/inria/publis/tutorial-estim/node3.html
- A Glance over Parameter Estimation in General: Parameter estimation is a discipline that provides tools for the efficient use of data for aiding in mathematically modeling of phenomena and the estimation of constants appearing in these models [2]. It can thus be visualized as a study of inverse problems. Much of parameter estimation can be related to four optimization problems:
- criterion: the choice of the best function to optimize (minimize or maximize)
- estimation: the optimization of the chosen function
- design: optimal design to obtain the best parameter estimates
- modeling: the determination of the mathematical model which best describes the system from which data are measured.
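The "criterion" and "estimation" steps can be sketched for the simplest case, a linear model [math]y = ax + b[/math] fit by least squares (the data below are made up for illustration): the criterion is the sum of squared residuals, and estimation solves the 2×2 normal equations in closed form.

```python
# Hypothetical data, roughly y = 2x + 1 with small measurement errors
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]

# Criterion: sum of squared residuals (y - (a*x + b))^2.
# Estimation: solve the normal equations for the minimizing (a, b).
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

# Normal equations:  a*sxx + b*sx = sxy  and  a*sx + b*n = sy
det = n * sxx - sx * sx
a = (n * sxy - sx * sy) / det
b = (sxx * sy - sx * sxy) / det
print(round(a, 2), round(b, 2))  # slope and intercept, approximately 1.96 and 1.1
```

The "design" and "modeling" problems sit outside this sketch: they concern where to place the [math]x[/math] values and whether a linear model is the right form at all.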


### 1999

- (Nocedal & Wright, 1999) ⇒ Jorge Nocedal, and Stephen J. Wright. (1999). “Numerical Optimization." Springer, ISBN:0387987932.
- NUMERICAL OPTIMIZATION presents a comprehensive and up-to-date description of the most effective methods in **continuous optimization**. ...


### 1977

- (Beck & Arnold, 1977) ⇒ J.V. Beck, and K.J. Arnold. (1977). “Parameter Estimation in Engineering and Science.” *Wiley Series in Probability and Mathematical Statistics.* Wiley.

### 2004

- (Boyd & Vandenberghe, 2004) ⇒ Stephen Boyd, and Lieven Vandenberghe. (2004). “Convex Optimization.” Cambridge University Press. p. 129. ISBN 978-0-521-83378-3. http://www.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf