Log-Sigmoid Activation Function

From GM-RKB

A Log-Sigmoid Activation Function is a Sigmoid-based Activation Function that computes the logarithm of a Sigmoid Function, i.e. [math]\displaystyle{ \log\sigma(x) = \log\left(\dfrac{1}{1+\exp(-x)}\right) }[/math].
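A direct evaluation of [math]\displaystyle{ \log(1/(1+\exp(-x))) }[/math] can overflow for large negative inputs. A minimal sketch of a numerically stable version (a NumPy stand-in, not the PyTorch implementation) uses the identity [math]\displaystyle{ \log\sigma(x) = \min(x,0) - \log(1+\exp(-|x|)) }[/math]:

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -log(1 + exp(-x)); computing exp(-x) directly
    # overflows for large negative x, so use the equivalent stable form
    # log_sigmoid(x) = min(x, 0) - log1p(exp(-|x|)).
    x = np.asarray(x, dtype=float)
    return np.minimum(x, 0.0) - np.log1p(np.exp(-np.abs(x)))
```

The function is applied element-wise, so it accepts arrays of any shape and returns an array of the same shape, matching the Shape description quoted below.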



References

2018

  • (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#logsoftmax
    • QUOTE: class torch.nn.LogSigmoid

      Applies element-wise

      [math]\displaystyle{ \text{LogSigmoid}(x_i)=\log\left(\dfrac{1}{1+\exp(-x_i)}\right) }[/math]

      Shape:

      • Input: [math]\displaystyle{ (N,∗) }[/math] where ∗ means any number of additional dimensions

      • Output: [math]\displaystyle{ (N,∗) }[/math], same shape as the input.
      Examples:

      >>> m = nn.LogSigmoid()
      >>> input = autograd.Variable(torch.randn(2))
      >>> print(input)
      >>> print(m(input))
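Since LogSigmoid is typically used inside networks trained by backpropagation, its derivative is worth noting: [math]\displaystyle{ \frac{d}{dx}\log\sigma(x) = 1-\sigma(x) = \sigma(-x) }[/math]. A small pure-Python sketch (hypothetical helper names, not part of the PyTorch API) checks this against a finite-difference approximation:

```python
import math

def sigmoid(x):
    # Numerically stable logistic function.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    e = math.exp(x)
    return e / (1.0 + e)

def log_sigmoid(x):
    # Stable log(sigmoid(x)) = min(x, 0) - log(1 + exp(-|x|)).
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

def log_sigmoid_grad(x):
    # d/dx log(sigmoid(x)) = 1 - sigmoid(x) = sigmoid(-x)
    return sigmoid(-x)

def central_diff(f, x, h=1e-6):
    # Finite-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2.0 * h)
```

For example, `log_sigmoid_grad(x)` agrees with `central_diff(log_sigmoid, x)` to within about 1e-5 over typical inputs; note that the gradient tends to 1 for large negative inputs and to 0 for large positive inputs, which is why LogSigmoid avoids the vanishing-gradient behavior of the plain sigmoid on the negative side.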