
Class LayerNormImpl

Inheritance Relationships

Base Type

public torch::nn::Cloneable<LayerNormImpl>

Class Documentation

class LayerNormImpl : public torch::nn::Cloneable<LayerNormImpl>

Applies Layer Normalization over a mini-batch of inputs as described in the paper Layer Normalization, https://arxiv.org/abs/1607.06450.

See https://pytorch.org/docs/main/nn.html#torch.nn.LayerNorm to learn about the exact behavior of this module.

See the documentation for torch::nn::LayerNormOptions class to learn what constructor arguments are supported for this module.

Example:

LayerNorm model(LayerNormOptions({2, 2}).elementwise_affine(false).eps(2e-5));
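
For context, a minimal end-to-end sketch; the batch size and the main() scaffolding are illustrative, not part of this page:

#include <torch/torch.h>
#include <iostream>

int main() {
  // Normalize over the last two dimensions, which must have shape {2, 2}.
  torch::nn::LayerNorm model(
      torch::nn::LayerNormOptions({2, 2}).elementwise_affine(false).eps(2e-5));

  // A mini-batch of 3 inputs, each of shape {2, 2}.
  auto input = torch::randn({3, 2, 2});
  auto output = model(input);  // output has the same shape as input: {3, 2, 2}

  std::cout << output.sizes() << std::endl;
}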

Public Functions

inline LayerNormImpl(std::vector<int64_t> normalized_shape)
explicit LayerNormImpl(LayerNormOptions options_)
virtual void reset() override

reset() must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
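
As a rough, hypothetical sketch of what such an override looks like for this module (the actual implementation lives in the libtorch sources; the behavior here mirrors the weight and bias documentation below):

// Hypothetical sketch, not the actual libtorch implementation.
void reset() override {
  if (options.elementwise_affine()) {
    // One learnable scale and shift per normalized element.
    weight = register_parameter("weight", torch::ones(options.normalized_shape()));
    bias = register_parameter("bias", torch::zeros(options.normalized_shape()));
  } else {
    // No learnable parameters: register undefined tensors.
    weight = register_parameter("weight", torch::Tensor(), /*requires_grad=*/false);
    bias = register_parameter("bias", torch::Tensor(), /*requires_grad=*/false);
  }
}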

void reset_parameters()
virtual void pretty_print(std::ostream &stream) const override

Pretty prints the LayerNorm module into the given stream.
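
Streaming a module invokes pretty_print, so typical usage is simply:

torch::nn::LayerNorm model(torch::nn::LayerNormOptions({2, 2}));
// operator<< on a module calls pretty_print under the hood; the output
// looks something like: torch::nn::LayerNorm([2, 2], eps=1e-05, elementwise_affine=true)
std::cout << model << std::endl;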

Tensor forward(const Tensor &input)

Applies layer normalization over a mini-batch of inputs as described in the paper Layer Normalization.

The mean and standard deviation are calculated over the last D dimensions of the input, where D is the number of dimensions in normalized_shape; these trailing dimensions must match the shape given by normalized_shape.

Layer Normalization: https://arxiv.org/abs/1607.06450
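
A small sketch making the shape contract concrete; the manual recomputation assumes the biased variance used by layer normalization:

torch::nn::LayerNorm model(
    torch::nn::LayerNormOptions({4}).elementwise_affine(false).eps(1e-5));

auto x = torch::randn({8, 4});  // normalized_shape {4} matches the last dimension
auto y = model(x);

// Manual normalization over the last dimension with the biased variance.
auto mu = x.mean({-1}, /*keepdim=*/true);
auto var = (x - mu).pow(2).mean({-1}, /*keepdim=*/true);
auto y_manual = (x - mu) / (var + 1e-5).sqrt();

std::cout << torch::allclose(y, y_manual, /*rtol=*/1e-4, /*atol=*/1e-6) << std::endl;  // 1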

Public Members

LayerNormOptions options

The options with which this module was constructed.

Tensor weight

The learned weight.

Initialized to ones if the elementwise_affine option is set to true upon construction.

Tensor bias

The learned bias.

Initialized to zeros if the elementwise_affine option is set to true upon construction.
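
A short sketch of inspecting these members after construction (the printed values follow the initialization rules above):

torch::nn::LayerNorm model(torch::nn::LayerNormOptions({2, 2}));  // elementwise_affine defaults to true

std::cout << model->options.eps() << std::endl;  // 1e-05 by default
std::cout << model->weight << std::endl;         // ones of shape {2, 2}
std::cout << model->bias << std::endl;           // zeros of shape {2, 2}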