LeakyReLU

class torch.nn.quantized.LeakyReLU(scale, zero_point, negative_slope=0.01, inplace=False, device=None, dtype=None)

This is the quantized equivalent of torch.nn.LeakyReLU. It operates on quantized tensors and produces a quantized output with the given scale and zero_point.

Parameters
  • scale – Quantization scale of the output tensor

  • zero_point – Quantization zero point of the output tensor

  • negative_slope – Controls the angle of the negative slope. Default: 1e-2
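Example

A minimal usage sketch; the scale and zero_point values below are illustrative choices, not prescribed by the module:

    import torch
    import torch.nn.quantized as nnq

    # Quantize a float input tensor to quint8 (illustrative scale/zero_point)
    x = torch.randn(2, 3)
    qx = torch.quantize_per_tensor(x, scale=0.05, zero_point=128, dtype=torch.quint8)

    # Quantized LeakyReLU; scale and zero_point set the quantization
    # parameters of the output tensor
    m = nnq.LeakyReLU(scale=0.05, zero_point=128, negative_slope=0.01)
    qy = m(qx)

    # Dequantize to inspect the result as float values
    print(qy.dequantize())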
