Softmin Activation Function


A Softmin Activation Function is a Softmax-based Activation Function that is defined as [math]\displaystyle{ f(x)=softmax(-x) }[/math].
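A minimal PyTorch sketch (not part of the cited documentation) illustrating this identity: applying softmax to the negated input reproduces softmin, so the two built-in functions agree.

import torch
import torch.nn.functional as F

x = torch.randn(2, 3)

# softmin(x) = softmax(-x): negate the input, then apply softmax
via_softmax = F.softmax(-x, dim=1)
built_in = F.softmin(x, dim=1)

print(torch.allclose(via_softmax, built_in))  # True
print(via_softmax.sum(dim=1))                 # each row sums to 1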



References

2018

  • (PyTorch, 2018) ⇒ http://pytorch.org/docs/master/nn.html#softmin
    • QUOTE: class torch.nn.Softmin(dim=None)

      Applies the Softmin function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range (0, 1) and sum to 1.

      [math]\displaystyle{ f_i(x)=\dfrac{\exp(-x_i)}{\sum_j\exp(-x_j)} }[/math]

      Shape:

      • Input: any shape
      • Output: same as input

      Parameters: dim (int) – A dimension along which Softmin will be computed (so every slice along dim will sum to 1).

      Returns: a Tensor of the same dimension and shape as the input, with values in the range [0, 1].

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.Softmin(dim=1)  # compute softmin over each row
>>> input = torch.randn(2, 3)
>>> print(input)
>>> print(m(input))  # each row lies in (0, 1) and sums to 1
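The quoted formula can also be implemented directly. The sketch below (plain PyTorch, not taken from the docs; the helper name softmin_from_scratch is ours) computes softmin from the definition, shifting the input by its minimum for numerical stability: the shift leaves the ratio unchanged but keeps every exponent non-positive, so exp cannot overflow. It then checks the result against nn.Softmin.

import torch
import torch.nn as nn

def softmin_from_scratch(x, dim=-1):
    # f_i(x) = exp(-x_i) / sum_j exp(-x_j)
    shifted = x - x.min(dim=dim, keepdim=True).values  # stability shift
    e = torch.exp(-shifted)
    return e / e.sum(dim=dim, keepdim=True)

x = torch.randn(2, 3)
print(torch.allclose(softmin_from_scratch(x, dim=1), nn.Softmin(dim=1)(x)))  # True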