# Metric Space Optimization Task

A Metric Space Optimization Task is an optimization task that requires the identification of an optimal point in a metric space.

**AKA:** Continuous Function Optimization.

**Context:**
- It can be solved by a Numerical Optimization System (that implements a numerical optimization algorithm).
- It can (often) be a Function Fitting Task.
- ...
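Such a task can be sketched, under simplifying assumptions, with a plain gradient-descent algorithm minimizing a differentiable function over the metric space [math]\displaystyle{ \mathbb{R} }[/math] (the function, step size, and iteration count below are illustrative, not from the source):

```python
# A minimal sketch of a numerical optimization algorithm (gradient descent)
# locating the optimal point of a continuous function on R.
def minimize_gd(grad, x0, lr=0.1, steps=200):
    """Repeatedly step against the gradient of the objective."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2*(x - 3) and its minimum at x = 3.
x_star = minimize_gd(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(round(x_star, 4))  # → 3.0
```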

**Example(s):**
- ...

**Counter-Example(s):**
- a Combinatorial Optimization Task, such as a Knapsack Problem.

**See:** Parameter Estimation, Least-Squares Regression.

## References

### 2024

- (Wikipedia, 2024) ⇒ https://en.wikipedia.org/wiki/Mathematical_optimization Retrieved: 2024-4-24.
**Mathematical optimization** (alternatively spelled *optimisation*) or **mathematical programming** is the selection of a best element, with regard to some criterion, from some set of available alternatives.^{[1]} It is generally divided into two subfields: discrete optimization and continuous optimization. Optimization problems arise in all quantitative disciplines from computer science and engineering to operations research and economics, and the development of solution methods has been of interest in mathematics for centuries.

    In the more general approach, an optimization problem consists of maximizing or minimizing a real function by systematically choosing input values from within an allowed set and computing the value of the function. The generalization of optimization theory and techniques to other formulations constitutes a large area of applied mathematics.

### 2012

- http://en.wikipedia.org/wiki/Optimization_problem#Continuous_optimization_problem
- QUOTE: The *standard form* of a (continuous) optimization problem is: [math]\displaystyle{ \begin{align} &\underset{x}{\operatorname{minimize}}& & f(x) \\ &\operatorname{subject\;to} & &g_i(x) \leq 0, \quad i = 1,\dots,m \\ &&&h_i(x) = 0, \quad i = 1, \dots, p \end{align} }[/math] where
    - [math]\displaystyle{ f(x): \mathbb{R}^n \to \mathbb{R} }[/math] is the **objective function** to be minimized over the variable [math]\displaystyle{ x }[/math],
    - [math]\displaystyle{ g_i(x) \leq 0 }[/math] are called **inequality constraints**, and
    - [math]\displaystyle{ h_i(x) = 0 }[/math] are called **equality constraints**.
- By convention, the standard form defines a **minimization problem**. A **maximization problem** can be treated by negating the objective function.
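The standard form above can be sketched with SciPy's `minimize` (assumed available; the concrete f, g, and h below are illustrative choices, not from the source). One detail worth a comment: SciPy's `"ineq"` convention is `fun(x) >= 0`, the negation of the standard form's [math]\displaystyle{ g_i(x) \leq 0 }[/math].

```python
# Hedged sketch of the standard form:
#   minimize   f(x) = x1^2 + x2^2
#   subject to g(x) = 1 - x1 - x2 <= 0   (i.e., x1 + x2 >= 1)
#              h(x) = x1 - x2 = 0
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0] ** 2 + x[1] ** 2
constraints = [
    {"type": "ineq", "fun": lambda x: x[0] + x[1] - 1.0},  # -g(x) >= 0
    {"type": "eq", "fun": lambda x: x[0] - x[1]},          #  h(x) == 0
]

res = minimize(f, x0=np.array([2.0, 0.0]), constraints=constraints)
print(res.x)  # both coordinates approach 0.5
```

Per the convention noted in the quote, a maximization problem would be handled here by minimizing `-f`.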

### 2009a

- (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Estimation_theory
- Estimation theory is a branch of statistics and signal processing that deals with estimating the values of parameters based on measured/empirical data. The parameters describe an underlying physical setting in such a way that the value of the parameters affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
For example, it is desired to estimate the proportion of a population of voters who will vote for a particular candidate. That proportion is the unobservable parameter; the estimate is based on a small random sample of voters.

- Or, for example, in radar the goal is to estimate the location of objects (airplanes, boats, etc.) by analyzing the received echo; a natural question to pose is "where are the airplanes?" Answering it requires estimating the distance of the airplanes from the radar station, which yields an absolute location if the absolute location of the radar station is known.
- In estimation theory, it is assumed that the desired information is embedded in a noisy signal. Noise adds uncertainty, without which the problem would be deterministic and estimation would not be needed.

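The voting example above can be simulated in a few lines (the true proportion, population size, and sample size are illustrative assumptions): the unobservable parameter is the true proportion, and the estimator is the sample proportion computed from a small random sample.

```python
# Sketch of the voting example: p_true is the unobservable parameter;
# the estimate p_hat is based on a small random sample of voters.
import random

random.seed(0)
p_true = 0.6  # assumed for the simulation; unknown to the estimator
population = [1 if random.random() < p_true else 0 for _ in range(100_000)]

sample = random.sample(population, 1_000)   # small random sample of voters
p_hat = sum(sample) / len(sample)           # sample proportion estimates p_true
print(p_hat)
```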

### 2009b

- http://research.microsoft.com/en-us/um/people/zhang/inria/publis/tutorial-estim/node3.html
- A Glance over Parameter Estimation in General: Parameter estimation is a discipline that provides tools for the efficient use of data to aid in the mathematical modeling of phenomena and the estimation of the constants appearing in these models [2]. It can thus be viewed as a study of inverse problems. Much of parameter estimation can be related to four optimization problems:
- criterion: the choice of the best function to optimize (minimize or maximize)
- estimation: the optimization of the chosen function
- design: optimal design to obtain the best parameter estimates
- modeling: the determination of the mathematical model which best describes the system from which data are measured.

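These steps can be illustrated with ordinary least squares (a sketch under assumptions, not the source's method): the *criterion* is the sum of squared residuals, *estimation* minimizes it via the normal equations, and the assumed *model* is a line y = a·x + b fitted to noisy measurements.

```python
# Least-squares parameter estimation: criterion = sum of squared residuals;
# estimation = minimizing it; model = y = a*x + b (assumed).
import numpy as np

rng = np.random.default_rng(42)
a_true, b_true = 2.0, -1.0                    # constants to be estimated
x = np.linspace(0.0, 10.0, 50)
y = a_true * x + b_true + rng.normal(scale=0.1, size=x.size)  # noisy data

A = np.column_stack([x, np.ones_like(x)])     # design matrix for [a, b]
(a_hat, b_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
print(a_hat, b_hat)  # close to 2.0 and -1.0
```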

### 1999

- (Nocedal & Wright, 1999) ⇒ Jorge Nocedal, and Stephen J. Wright. (1999). “Numerical Optimization." Springer, ISBN:0387987932.
- NUMERICAL OPTIMIZATION presents a comprehensive and up-to-date description of the most effective methods in **continuous optimization**. ...

### 1977

- (Beck & Arnold, 1977) ⇒ J.V. Beck, and K.J. Arnold. (1977). “Parameter Estimation in Engineering and Science.” *Wiley Series in Probability and Mathematical Statistics.*

- ↑ "The Nature of Mathematical Programming," *Mathematical Programming Glossary*, INFORMS Computing Society.