Struct TransformerDecoderLayerOptions

Struct Documentation

struct torch::nn::TransformerDecoderLayerOptions

Options for the TransformerDecoderLayer module.

Example:

TransformerDecoderLayer model(TransformerDecoderLayerOptions(512, 8).dropout(0.2));

Public Functions

TransformerDecoderLayerOptions(int64_t d_model, int64_t nhead)
auto d_model(const int64_t &new_d_model) -> decltype(*this)

The number of expected features in the input.

auto d_model(int64_t &&new_d_model) -> decltype(*this)
const int64_t &d_model() const noexcept
int64_t &d_model() noexcept
auto nhead(const int64_t &new_nhead) -> decltype(*this)

The number of heads in the multi-head attention models.

auto nhead(int64_t &&new_nhead) -> decltype(*this)
const int64_t &nhead() const noexcept
int64_t &nhead() noexcept
auto dim_feedforward(const int64_t &new_dim_feedforward) -> decltype(*this)

The dimension of the feedforward network model. Default: 2048

auto dim_feedforward(int64_t &&new_dim_feedforward) -> decltype(*this)
const int64_t &dim_feedforward() const noexcept
int64_t &dim_feedforward() noexcept
auto dropout(const double &new_dropout) -> decltype(*this)

The dropout value. Default: 0.1

auto dropout(double &&new_dropout) -> decltype(*this)
const double &dropout() const noexcept
double &dropout() noexcept
auto activation(const activation_t &new_activation) -> decltype(*this)

The activation function of the intermediate layer; can be torch::kGELU, torch::kReLU, or a unary callable. Default: torch::kReLU

auto activation(activation_t &&new_activation) -> decltype(*this)
const activation_t &activation() const noexcept
activation_t &activation() noexcept
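Each setter returns a reference to the options object, so non-default values can be chained fluently before constructing the layer. As a minimal sketch (assuming libtorch is available, and with illustrative tensor shapes chosen arbitrarily):

```cpp
#include <torch/torch.h>

int main() {
  // d_model and nhead are required constructor arguments;
  // the remaining options are set via chained fluent setters.
  auto options = torch::nn::TransformerDecoderLayerOptions(512, 8)
                     .dim_feedforward(1024)
                     .dropout(0.2)
                     .activation(torch::kGELU);

  torch::nn::TransformerDecoderLayer layer(options);

  // Dummy inputs with shape (sequence length, batch size, d_model).
  auto tgt = torch::rand({20, 4, 512});
  auto memory = torch::rand({10, 4, 512});

  // The output has the same shape as tgt.
  auto out = layer->forward(tgt, memory);
}
```

Note that each option exposes both const and non-const accessors, so a configured value can also be read back (e.g. `options.dropout()` returns `0.2` here).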
