Struct RNNOptionsBase

Struct Documentation

struct torch::nn::detail::RNNOptionsBase

Common options for RNN, LSTM and GRU modules.
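
These options are an implementation detail shared by the recurrent modules; user code normally sets them through the public option classes such as torch::nn::RNNOptions, torch::nn::LSTMOptions, or torch::nn::GRUOptions, which expose the same named setters. A minimal sketch (the sizes are arbitrary, not taken from this page):

#include <torch/torch.h>

int main() {
  // Chain the fluent setters; each returns *this.
  auto options = torch::nn::LSTMOptions(/*input_size=*/10, /*hidden_size=*/20)
                     .num_layers(2)
                     .bias(true)
                     .batch_first(true)
                     .dropout(0.5)
                     .bidirectional(true);
  torch::nn::LSTM lstm(options);

  // (batch, sequence, features) because batch_first(true) was set.
  auto input = torch::randn({3, 7, 10});
  auto [output, state] = lstm->forward(input);
  return 0;
}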

Public Types

typedef c10::variant<enumtype::kLSTM, enumtype::kGRU, enumtype::kRNN_TANH, enumtype::kRNN_RELU> rnn_options_base_mode_t

Public Functions

RNNOptionsBase(rnn_options_base_mode_t mode, int64_t input_size, int64_t hidden_size)
auto mode(const rnn_options_base_mode_t &new_mode) -> decltype(*this)
auto mode(rnn_options_base_mode_t &&new_mode) -> decltype(*this)
const rnn_options_base_mode_t &mode() const noexcept
rnn_options_base_mode_t &mode() noexcept
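
The mode selects which recurrent cell the base options describe. Most code never constructs the detail-level options directly, but it is possible; a hedged sketch using the public enum instances torch::kLSTM, torch::kGRU, torch::kRNN_TANH, and torch::kRNN_RELU:

#include <torch/torch.h>

int main() {
  // Construct the base options with a mode, input_size, and hidden_size.
  auto base = torch::nn::detail::RNNOptionsBase(
      torch::kRNN_TANH, /*input_size=*/10, /*hidden_size=*/20);
  // The setter accepts any alternative of the variant and returns *this.
  base.mode(torch::kRNN_RELU);
  return 0;
}
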
auto input_size(const int64_t &new_input_size) -> decltype(*this)

The number of features of a single sample in the input sequence x.

auto input_size(int64_t &&new_input_size) -> decltype(*this)
const int64_t &input_size() const noexcept
int64_t &input_size() noexcept
auto hidden_size(const int64_t &new_hidden_size) -> decltype(*this)

The number of features in the hidden state h.

auto hidden_size(int64_t &&new_hidden_size) -> decltype(*this)
const int64_t &hidden_size() const noexcept
int64_t &hidden_size() noexcept
auto num_layers(const int64_t &new_num_layers) -> decltype(*this)

The number of recurrent layers (cells) to use.

auto num_layers(int64_t &&new_num_layers) -> decltype(*this)
const int64_t &num_layers() const noexcept
int64_t &num_layers() noexcept
auto bias(const bool &new_bias) -> decltype(*this)

Whether a bias term should be added to all linear operations.

auto bias(bool &&new_bias) -> decltype(*this)
const bool &bias() const noexcept
bool &bias() noexcept
auto batch_first(const bool &new_batch_first) -> decltype(*this)

If true, the input sequence should be provided as (batch, sequence, features).

If false (default), the expected layout is (sequence, batch, features).

auto batch_first(bool &&new_batch_first) -> decltype(*this)
const bool &batch_first() const noexcept
bool &batch_first() noexcept
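
A short sketch of the two layouts, assuming a GRU with input_size = 8 and hidden_size = 16 (values not from this page):

#include <torch/torch.h>

int main() {
  // Default layout: (sequence, batch, features).
  torch::nn::GRU seq_first(torch::nn::GRUOptions(8, 16));
  auto out_seq = seq_first->forward(torch::randn({5, 3, 8}));

  // batch_first(true): (batch, sequence, features).
  torch::nn::GRU batch_major(torch::nn::GRUOptions(8, 16).batch_first(true));
  auto out_batch = batch_major->forward(torch::randn({3, 5, 8}));
  return 0;
}
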
auto dropout(const double &new_dropout) -> decltype(*this)

If non-zero, adds dropout with the given probability to the output of each RNN layer, except the final layer.

auto dropout(double &&new_dropout) -> decltype(*this)
const double &dropout() const noexcept
double &dropout() noexcept
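
Because the final layer is excluded, the option only has an effect when num_layers > 1; a sketch with arbitrary sizes:

#include <torch/torch.h>

int main() {
  // Dropout with p = 0.2 is applied after layers 1 and 2 of this 3-layer RNN,
  // but not after layer 3.
  torch::nn::RNN rnn(torch::nn::RNNOptions(10, 20).num_layers(3).dropout(0.2));
  auto out = rnn->forward(torch::randn({5, 2, 10}));  // (sequence, batch, features)
  return 0;
}
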
auto bidirectional(const bool &new_bidirectional) -> decltype(*this)

Whether to make the RNN bidirectional.

auto bidirectional(bool &&new_bidirectional) -> decltype(*this)
const bool &bidirectional() const noexcept
bool &bidirectional() noexcept
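
With bidirectional(true) the forward and backward directions are concatenated, so the per-step output has 2 * hidden_size features; a hedged sketch:

#include <torch/torch.h>

int main() {
  torch::nn::GRU gru(torch::nn::GRUOptions(10, 20).bidirectional(true));
  auto [output, h_n] = gru->forward(torch::randn({5, 3, 10}));
  // output: {5, 3, 40} (2 * hidden_size); h_n: {2, 3, 20} (one state per direction).
  return 0;
}
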
auto proj_size(const int64_t &new_proj_size) -> decltype(*this)

Cell projection dimension. If 0, projections are not added. Can only be used for LSTMs.

auto proj_size(int64_t &&new_proj_size) -> decltype(*this)
const int64_t &proj_size() const noexcept
int64_t &proj_size() noexcept
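
With a non-zero proj_size the hidden state is projected down, so the per-direction output feature dimension becomes proj_size instead of hidden_size (proj_size must be smaller than hidden_size); a sketch with arbitrary sizes:

#include <torch/torch.h>

int main() {
  torch::nn::LSTM lstm(
      torch::nn::LSTMOptions(10, /*hidden_size=*/64).proj_size(16));
  auto [output, state] = lstm->forward(torch::randn({7, 2, 10}));
  // output: {7, 2, 16} rather than {7, 2, 64}.
  return 0;
}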
