Struct RNNOptions

Struct Documentation

struct torch::nn::RNNOptions

Options for the RNN module.

Example:

RNN model(RNNOptions(128, 64).num_layers(3).dropout(0.2).nonlinearity(torch::kTanh));

Public Types

typedef c10::variant<enumtype::kTanh, enumtype::kReLU> nonlinearity_t

Public Functions

RNNOptions(int64_t input_size, int64_t hidden_size)
auto input_size(const int64_t &new_input_size) -> decltype(*this)

The number of expected features in the input x.

auto input_size(int64_t &&new_input_size) -> decltype(*this)
const int64_t &input_size() const noexcept
int64_t &input_size() noexcept
auto hidden_size(const int64_t &new_hidden_size) -> decltype(*this)

The number of features in the hidden state h.

auto hidden_size(int64_t &&new_hidden_size) -> decltype(*this)
const int64_t &hidden_size() const noexcept
int64_t &hidden_size() noexcept
auto num_layers(const int64_t &new_num_layers) -> decltype(*this)

Number of recurrent layers.

E.g., setting num_layers=2 would mean stacking two RNNs together to form a stacked RNN, with the second RNN taking in outputs of the first RNN and computing the final results. Default: 1

auto num_layers(int64_t &&new_num_layers) -> decltype(*this)
const int64_t &num_layers() const noexcept
int64_t &num_layers() noexcept
auto nonlinearity(const nonlinearity_t &new_nonlinearity) -> decltype(*this)

The non-linearity to use. Can be either torch::kTanh or torch::kReLU. Default: torch::kTanh

auto nonlinearity(nonlinearity_t &&new_nonlinearity) -> decltype(*this)
const nonlinearity_t &nonlinearity() const noexcept
nonlinearity_t &nonlinearity() noexcept
auto bias(const bool &new_bias) -> decltype(*this)

If false, then the layer does not use bias weights b_ih and b_hh.

Default: true

auto bias(bool &&new_bias) -> decltype(*this)
const bool &bias() const noexcept
bool &bias() noexcept
auto batch_first(const bool &new_batch_first) -> decltype(*this)

If true, then the input and output tensors are provided as (batch, seq, feature).

Default: false

auto batch_first(bool &&new_batch_first) -> decltype(*this)
const bool &batch_first() const noexcept
bool &batch_first() noexcept
auto dropout(const double &new_dropout) -> decltype(*this)

If non-zero, introduces a Dropout layer on the outputs of each RNN layer except the last layer, with dropout probability equal to dropout.

Default: 0

auto dropout(double &&new_dropout) -> decltype(*this)
const double &dropout() const noexcept
double &dropout() noexcept
auto bidirectional(const bool &new_bidirectional) -> decltype(*this)

If true, becomes a bidirectional RNN. Default: false

auto bidirectional(bool &&new_bidirectional) -> decltype(*this)
const bool &bidirectional() const noexcept
bool &bidirectional() noexcept
