Class TransformerEncoderLayerImpl¶
Defined in File transformerlayer.h
Inheritance Relationships¶
Base Type¶
public torch::nn::Cloneable< TransformerEncoderLayerImpl >
(Template Class Cloneable)
Class Documentation¶
-
class TransformerEncoderLayerImpl : public torch::nn::Cloneable<TransformerEncoderLayerImpl>¶
TransformerEncoderLayer module.
See https://pytorch.org/docs/main/generated/torch.nn.TransformerEncoderLayer.html to learn about the exact behavior of this encoder layer model.
See the documentation for torch::nn::TransformerEncoderLayerOptions class to learn what constructor arguments are supported for this encoder layer model.
Example:
TransformerEncoderLayer encoderLayer(TransformerEncoderLayerOptions(512, 8).dropout(0.1));
Public Functions
-
inline TransformerEncoderLayerImpl(int64_t d_model, int64_t nhead)¶
-
explicit TransformerEncoderLayerImpl(TransformerEncoderLayerOptions options_)¶
-
Tensor forward(const Tensor &src, const Tensor &src_mask = {}, const Tensor &src_key_padding_mask = {})¶
-
virtual void reset() override¶
reset()
must perform initialization of all members with reference semantics, most importantly parameters, buffers and submodules.
-
void reset_parameters()¶
Public Members
-
TransformerEncoderLayerOptions options¶
options with which this TransformerEncoderLayer was constructed
-
MultiheadAttention self_attn = nullptr¶
self attention
Protected Functions
-
inline virtual bool _forward_has_default_args() override¶
The following three functions allow a module with default arguments in its forward method to be used in a Sequential module.
You should NEVER override these functions manually. Instead, you should use the FORWARD_HAS_DEFAULT_ARGS macro.
-
inline virtual unsigned int _forward_num_required_args() override¶
Friends
- friend struct torch::nn::AnyModuleHolder