Hyperbolic Tangent Activation Function


A Hyperbolic Tangent Activation Function is a Neuron Activation Function based on a Hyperbolic Tangent Function.
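It maps any real-valued input into the open interval [math]\displaystyle{ (-1, 1) }[/math]:

[math]\displaystyle{ \tanh(x) = \dfrac{e^{x}-e^{-x}}{e^{x}+e^{-x}} = \dfrac{2}{1+e^{-2x}} - 1 }[/math]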



References

2018

  • (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#tanh Retrieved: 2018-2-10
    • QUOTE: class torch.nn.Tanh

      Applies element-wise, [math]\displaystyle{ f(x)=\dfrac{\exp(x)-\exp(-x)}{\exp(x)+\exp(-x)} }[/math]

      Shape:

      • Input: (N, ∗), where ∗ means any number of additional dimensions
      • Output: (N, ∗), the same shape as the input
Examples:
>>> import torch
>>> from torch import nn, autograd
>>> m = nn.Tanh()
>>> input = autograd.Variable(torch.randn(2))
>>> print(input)
>>> print(m(input))
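
Note: since PyTorch 0.4, torch.autograd.Variable has been merged into torch.Tensor, so the same example can be written with a plain tensor; a minimal sketch under that assumption:

>>> import torch
>>> from torch import nn
>>> m = nn.Tanh()
>>> input = torch.randn(2)  # a plain tensor; no Variable wrapper needed
>>> print(m(input))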

2017

  • (Mate Labs, 2017) ⇒ Mate Labs (Aug 23, 2017). "Secret Sauce behind the beauty of Deep Learning: Beginners guide to Activation Functions".
    • QUOTE: Hyperbolic tangent (TanH): It looks like a scaled sigmoid function. Its output is centered around zero, so its derivatives are higher. Tanh converges faster than the sigmoid and logistic activation functions.

      [math]\displaystyle{ f(x)=\tanh(x)=\dfrac{2}{1+e^{-2x}} - 1 }[/math]

      Range: [math]\displaystyle{ (-1, 1) }[/math]

      Examples: [math]\displaystyle{ \tanh(2) = 0.9640,\; \tanh(-0.567) = -0.5131, \; \tanh(0) = 0 }[/math]
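
These values can be checked directly; a minimal sketch in Python, comparing math.tanh against the scaled-sigmoid identity given above:

import math

def tanh_via_sigmoid(x):
    # tanh(x) = 2 / (1 + exp(-2x)) - 1, the scaled-sigmoid form quoted above
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

for x in (2.0, -0.567, 0.0):
    print(f"tanh({x}) = {math.tanh(x):.4f}, via sigmoid: {tanh_via_sigmoid(x):.4f}")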

2005

  • Commonly used neuron activation function types:

      a. linear function,
      b. threshold function,
      c. sigmoid function.
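
For comparison, a minimal sketch of the three listed activation function types alongside tanh; the function names here are illustrative, not from the 2005 source:

import math

def linear(x):
    # a. linear (identity) activation
    return x

def threshold(x):
    # b. threshold (step) activation
    return 1.0 if x >= 0.0 else 0.0

def sigmoid(x):
    # c. logistic sigmoid activation, range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

x = 0.5
print(linear(x), threshold(x), sigmoid(x), math.tanh(x))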