# Additive Smoothing

An Additive Smoothing is a statistical technique for smoothing categorical data by adding a pseudocount to each observed category count.

**AKA:** Laplace Smoothing, Lidstone Smoothing.

**See:** Smoothing, Shrinkage Estimator, Posterior Distribution, Expected Value, Categorical Data.

## References

### 2016

- (Wikipedia, 2016) ⇒ https://www.wikiwand.com/en/Additive_smoothing Retrieved 2016-07-24
- In statistics, **additive smoothing**, also called **Laplace smoothing** (not to be confused with Laplacian smoothing) or **Lidstone smoothing**, is a technique used to smooth categorical data. Given an observation **x** = (*x*_{1}, …, *x*_{d}) from a multinomial distribution with *N* trials and parameter vector **θ** = (*θ*_{1}, …, *θ*_{d}), a "smoothed" version of the data gives the estimator:

  - [math]\hat\theta_i = \frac{x_i + \alpha}{N + \alpha d} \qquad (i=1,\ldots,d),[/math]

- where the pseudocount *α* > 0 is the smoothing parameter (*α* = 0 corresponds to no smoothing). Additive smoothing is a type of shrinkage estimator, as the resulting estimate will be between the empirical estimate *x*_{i}/*N* and the uniform probability 1/*d*. Using Laplace's rule of succession, some authors have argued^{[citation needed]} that *α* should be 1 (in which case the term **add-one smoothing** is also used), though in practice a smaller value is typically chosen.
- From a Bayesian point of view, this corresponds to the expected value of the posterior distribution, using a symmetric Dirichlet distribution with parameter *α* as a prior. In the special case where the number of categories is 2, this is equivalent to using a Beta distribution as the conjugate prior for the parameters of a binomial distribution.
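The estimator above can be sketched in a few lines of NumPy. This is a minimal illustration of the formula, not a reference implementation; the function name `additive_smoothing` and the example counts are assumptions made for the demonstration.

```python
import numpy as np

def additive_smoothing(counts, alpha=1.0):
    """Additive (Laplace/Lidstone) smoothing of categorical counts.

    Implements theta_hat_i = (x_i + alpha) / (N + alpha * d),
    where N is the total count and d the number of categories.
    """
    counts = np.asarray(counts, dtype=float)
    N = counts.sum()        # number of trials
    d = counts.size         # number of categories
    return (counts + alpha) / (N + alpha * d)

# Hypothetical example: counts over d = 4 categories, one category unseen.
counts = [3, 0, 2, 5]
theta = additive_smoothing(counts, alpha=1.0)  # add-one smoothing
print(theta)  # the unseen category now gets nonzero probability mass
```

Note the shrinkage behaviour: with `alpha=1.0`, the first category's estimate is (3 + 1) / (10 + 4) ≈ 0.286, which lies between the empirical estimate 3/10 = 0.3 and the uniform probability 1/4 = 0.25, and setting `alpha=0.0` recovers the raw empirical frequencies.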