celu

torch.ao.nn.quantized.functional.celu(input, scale, zero_point, alpha=1.)[source]

Applies the quantized CELU function element-wise.

\text{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x / \alpha) - 1))
Parameters
  • input (Tensor) – quantized input

  • scale (float) – quantization scale of the output tensor

  • zero_point (int) – quantization zero point of the output tensor

  • alpha (float) – the α value for the CELU formulation. Default: 1.0

Return type

Tensor
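
A minimal usage sketch, assuming a quint8 per-tensor quantized input; the tensor values, scale of 0.1, and zero point of 0 are illustrative choices, not requirements of the API:

    import torch
    import torch.ao.nn.quantized.functional as qF

    # Quantize a float tensor to quint8 with an (assumed) scale and zero point.
    x = torch.randn(4)
    qx = torch.quantize_per_tensor(x, scale=0.1, zero_point=0, dtype=torch.quint8)

    # Apply quantized CELU; scale and zero_point set the output quantization.
    qy = qF.celu(qx, scale=0.1, zero_point=0, alpha=1.0)

    print(qy)               # quantized result
    print(qy.dequantize())  # float values for inspection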
