LeakyReLU

class torch.ao.nn.quantized.LeakyReLU(scale, zero_point, negative_slope=0.01, inplace=False, device=None, dtype=None)[source]

This is the quantized equivalent of torch.nn.LeakyReLU. It operates on quantized tensors and produces a quantized output using the given scale and zero_point.

Parameters
  • scale (float) – quantization scale of the output tensor

  • zero_point (int) – quantization zero point of the output tensor

  • negative_slope (float) – Controls the angle of the negative slope. Default: 1e-2
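
Examples (a minimal usage sketch; the scale and zero_point values below, both for the module output and for the input quantization, are arbitrary illustrative choices rather than values prescribed by the module):

>>> import torch
>>> from torch.ao.nn.quantized import LeakyReLU
>>> # output qparams chosen so negative values remain representable in quint8
>>> m = LeakyReLU(scale=0.1, zero_point=128)
>>> x = torch.randn(2, 3)
>>> # quantize the float input; these input qparams are also example values
>>> xq = torch.quantize_per_tensor(x, scale=0.1, zero_point=128, dtype=torch.quint8)
>>> out = m(xq)   # quantized output carries the module's scale and zero_point
>>> out.dequantize()

In typical workflows this module is produced automatically when converting a float model with torch.ao.quantization, but it can also be constructed and called directly on quantized tensors as sketched above.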