tfb_softmax_centered

Computes Y = g(X) = exp([X 0]) / sum(exp([X 0]))


Description

To implement softmax as a bijection, the forward transformation appends a value to the input and the inverse removes this coordinate. The appended coordinate represents a pivot, e.g., softmax(x) = exp(x - c) / sum(exp(x - c)), where c is the implicit last coordinate.
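
For concreteness, a minimal sketch of the dimension change (assuming the tfprobability and tensorflow R packages are installed and a working TensorFlow Probability backend is available):

library(tfprobability)
library(tensorflow)

b <- tfb_softmax_centered()

# forward appends a pivot coordinate of 0 and applies softmax,
# so a length-2 input is mapped onto the 3-simplex
x <- tf$constant(c(0.5, -1.2), dtype = tf$float32)
y <- tfb_forward(b, x)   # length 3, non-negative, sums to 1

# inverse drops the pivot coordinate and recovers x
x_rec <- tfb_inverse(b, y)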

Usage

tfb_softmax_centered(validate_args = FALSE, name = "softmax_centered")

Arguments

validate_args

Logical, default FALSE. Whether to validate input with asserts. If validate_args is FALSE and the inputs are invalid, correct behavior is not guaranteed.

name

String; name prefixed to Ops created by this class.
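
As an illustration (an assumption about typical bijector behavior, not part of the original page), enabling validation adds assertion ops that check inputs at run time:

# inverse inputs are checked to lie on the simplex (non-negative, summing to 1);
# invalid values raise an error instead of silently producing undefined results
b_checked <- tfb_softmax_centered(validate_args = TRUE, name = "softmax_checked")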

Details

At first blush, it may seem that the invariance of domain theorem implies this implementation is not a bijection. However, the appended dimension makes the (forward) image non-open, so the theorem does not directly apply.

Value

A bijector instance.

