# Hyperbolic Tangent Activation Function

## References

### 2018a

• (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#tanh Retrieved: 2018-2-10
• QUOTE: `class torch.nn.Tanh` [source]

Applies element-wise, $\displaystyle{ f(x)=\dfrac{\exp(x)-\exp(-x)}{\exp(x)+\exp(-x)} }$

Shape:

• Input: $(N, *)$ where $*$ means any number of additional dimensions
• Output: $(N, *)$, same shape as the input
Examples:
>>> m = nn.Tanh()
>>> input = torch.randn(2)
>>> print(input)
>>> print(m(input))
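The element-wise definition quoted above can be checked without PyTorch; a minimal pure-Python sketch (the function name `tanh_manual` is illustrative, not from the source):

```python
import math

def tanh_manual(x):
    # f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

# Agrees with the standard library's tanh at several points:
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert abs(tanh_manual(x) - math.tanh(x)) < 1e-12
```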


### 2017

• (Mate Labs, 2017) ⇒ Mate Labs (Aug 23, 2017). "Secret Sauce behind the beauty of Deep Learning: Beginner's guide to Activation Functions."
• QUOTE: Hyperbolic tangent (TanH)  —  It looks like a scaled sigmoid function. Data is centered around zero, so the derivatives will be higher. Tanh converges faster than the sigmoid and logistic activation functions.

$\displaystyle{ f(x)=\tanh(x)=\dfrac{2}{1+e^{-2x}} - 1 }$

Range: $\displaystyle{ (-1, 1) }$

Examples: $\displaystyle{ \tanh(2) = 0.9640,\; \tanh(-0.567) = -0.5131, \; \tanh(0) = 0 }$
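The scaled-sigmoid form and the example values above can be verified with a short sketch (the function name `tanh_sigmoid_form` is illustrative, not from the source):

```python
import math

def tanh_sigmoid_form(x):
    # tanh(x) = 2 / (1 + exp(-2x)) - 1, i.e. a rescaled logistic sigmoid
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

# Matches the example values quoted above (to 4 decimal places):
assert abs(tanh_sigmoid_form(2) - 0.9640) < 1e-4
assert abs(tanh_sigmoid_form(-0.567) - (-0.5131)) < 1e-4
assert tanh_sigmoid_form(0) == 0.0

# And it is the same function as the standard tanh:
for x in (-3.0, -1.0, 0.5, 3.0):
    assert abs(tanh_sigmoid_form(x) - math.tanh(x)) < 1e-12
```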

### 2005

a. linear function
b. threshold function
c. sigmoid function