Struct RNNOptions

Struct Documentation

struct RNNOptions

Options for the RNN module.

Example:

RNN model(RNNOptions(128, 64).num_layers(3).dropout(0.2).nonlinearity(torch::kTanh));

Public Types

typedef std::variant<enumtype::kTanh, enumtype::kReLU> nonlinearity_t

Public Functions

RNNOptions(int64_t input_size, int64_t hidden_size)
inline auto input_size(const int64_t &new_input_size) -> decltype(*this)

The number of expected features in the input x.

inline auto input_size(int64_t &&new_input_size) -> decltype(*this)
inline const int64_t &input_size() const noexcept
inline int64_t &input_size() noexcept
inline auto hidden_size(const int64_t &new_hidden_size) -> decltype(*this)

The number of features in the hidden state h.

inline auto hidden_size(int64_t &&new_hidden_size) -> decltype(*this)
inline const int64_t &hidden_size() const noexcept
inline int64_t &hidden_size() noexcept
inline auto num_layers(const int64_t &new_num_layers) -> decltype(*this)

Number of recurrent layers.

E.g., setting num_layers=2 would mean stacking two RNNs together to form a stacked RNN, with the second RNN taking in outputs of the first RNN and computing the final results. Default: 1

inline auto num_layers(int64_t &&new_num_layers) -> decltype(*this)
inline const int64_t &num_layers() const noexcept
inline int64_t &num_layers() noexcept
inline auto nonlinearity(const nonlinearity_t &new_nonlinearity) -> decltype(*this)

The non-linearity to use.

Can be either torch::kTanh or torch::kReLU. Default: torch::kTanh

inline auto nonlinearity(nonlinearity_t &&new_nonlinearity) -> decltype(*this)
inline const nonlinearity_t &nonlinearity() const noexcept
inline nonlinearity_t &nonlinearity() noexcept
inline auto bias(const bool &new_bias) -> decltype(*this)

If false, then the layer does not use bias weights b_ih and b_hh.

Default: true

inline auto bias(bool &&new_bias) -> decltype(*this)
inline const bool &bias() const noexcept
inline bool &bias() noexcept
inline auto batch_first(const bool &new_batch_first) -> decltype(*this)

If true, then the input and output tensors are provided as (batch, seq, feature).

Default: false

inline auto batch_first(bool &&new_batch_first) -> decltype(*this)
inline const bool &batch_first() const noexcept
inline bool &batch_first() noexcept
inline auto dropout(const double &new_dropout) -> decltype(*this)

If non-zero, introduces a Dropout layer on the outputs of each RNN layer except the last layer, with dropout probability equal to dropout.

Default: 0

inline auto dropout(double &&new_dropout) -> decltype(*this)
inline const double &dropout() const noexcept
inline double &dropout() noexcept
inline auto bidirectional(const bool &new_bidirectional) -> decltype(*this)

If true, becomes a bidirectional RNN. Default: false

inline auto bidirectional(bool &&new_bidirectional) -> decltype(*this)
inline const bool &bidirectional() const noexcept
inline bool &bidirectional() noexcept
