LogSoftmax Activation Function

From GM-RKB

A LogSoftmax Activation Function is a Softmax-based Activation Function that is the logarithm of a Softmax Function, i.e.:

$LogSoftmax\left(x_i\right)=\log\left(\dfrac{\exp\left(x_i\right)}{\sum_j\exp\left(x_j\right)}\right)$
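The formula above can be sketched directly in plain Python. Note that in practice it is rewritten as $x_i - \log\sum_j \exp(x_j)$ and shifted by $\max_j x_j$ (the standard log-sum-exp trick) to avoid overflow; the function name below is illustrative, not part of any library.

```python
import math

def log_softmax(xs):
    # Direct implementation of log(exp(x_i) / sum_j exp(x_j)),
    # rewritten as x_i - log(sum_j exp(x_j)) and shifted by max(xs)
    # for numerical stability (the log-sum-exp trick).
    m = max(xs)
    log_sum = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_sum for x in xs]

# The outputs are negative and exponentiate back to a probability
# distribution that sums to 1.
print(log_softmax([1.0, 2.0, 3.0]))
```

For example, `log_softmax([0.0, 0.0])` gives `[-log 2, -log 2]`, the logarithm of the uniform distribution over two outcomes.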


References

2018

  • (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#logsoftmax
    • QUOTE: class torch.nn.LogSoftmax(dim=None)

      Applies the Log(Softmax(x)) function to an n-dimensional input Tensor. The LogSoftmax formulation can be simplified as

      [math]\displaystyle{ f_i(x)=\log\left(\dfrac{\exp(x_i)}{\sum_j\exp(x_j)}\right) }[/math]

      Shape:

        • Input: any shape
        • Output: same as input

      Parameters: dim (int) – A dimension along which Softmax will be computed (so every slice along dim will sum to 1).

      Returns: a Tensor of the same dimension and shape as the input, with values in the range [-inf, 0).

Examples:

>>> m = nn.LogSoftmax(dim=1)
>>> input = torch.randn(2, 3)
>>> print(input)
>>> print(m(input))
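As a self-contained sketch of the usage above (assuming PyTorch is installed), the outputs along the chosen dim exponentiate back to probabilities that sum to one:

```python
import torch
import torch.nn as nn

m = nn.LogSoftmax(dim=1)  # normalize across each row
x = torch.randn(2, 3)
out = m(x)
# Every entry is <= 0, and exp(out) sums to 1 along dim=1.
print(out)
print(out.exp().sum(dim=1))
```

Choosing `dim` matters: with `dim=0` the normalization would instead run down each column.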