Struct LSTMOptions

Struct Documentation

struct torch::nn::LSTMOptions

Options for the LSTM module.

Example:

LSTM model(LSTMOptions(2, 4).num_layers(3).batch_first(false).bidirectional(true));

Public Functions

LSTMOptions(int64_t input_size, int64_t hidden_size)
auto input_size(const int64_t &new_input_size) -> decltype(*this)

The number of expected features in the input x.

auto input_size(int64_t &&new_input_size) -> decltype(*this)
const int64_t &input_size() const noexcept
int64_t &input_size() noexcept
auto hidden_size(const int64_t &new_hidden_size) -> decltype(*this)

The number of features in the hidden state h.

auto hidden_size(int64_t &&new_hidden_size) -> decltype(*this)
const int64_t &hidden_size() const noexcept
int64_t &hidden_size() noexcept
auto num_layers(const int64_t &new_num_layers) -> decltype(*this)

Number of recurrent layers.

E.g., setting num_layers=2 would mean stacking two LSTMs together to form a stacked LSTM, with the second LSTM taking in outputs of the first LSTM and computing the final results. Default: 1

auto num_layers(int64_t &&new_num_layers) -> decltype(*this)
const int64_t &num_layers() const noexcept
int64_t &num_layers() noexcept
auto bias(const bool &new_bias) -> decltype(*this)

If false, then the layer does not use bias weights b_ih and b_hh.

Default: true

auto bias(bool &&new_bias) -> decltype(*this)
const bool &bias() const noexcept
bool &bias() noexcept
auto batch_first(const bool &new_batch_first) -> decltype(*this)

If true, then the input and output tensors are provided as (batch, seq, feature).

Default: false

auto batch_first(bool &&new_batch_first) -> decltype(*this)
const bool &batch_first() const noexcept
bool &batch_first() noexcept
auto dropout(const double &new_dropout) -> decltype(*this)

If non-zero, introduces a Dropout layer on the outputs of each LSTM layer except the last layer, with dropout probability equal to dropout.

Default: 0

auto dropout(double &&new_dropout) -> decltype(*this)
const double &dropout() const noexcept
double &dropout() noexcept
auto bidirectional(const bool &new_bidirectional) -> decltype(*this)

If true, becomes a bidirectional LSTM. Default: false

auto bidirectional(bool &&new_bidirectional) -> decltype(*this)
const bool &bidirectional() const noexcept
bool &bidirectional() noexcept
auto proj_size(const int64_t &new_proj_size) -> decltype(*this)

Cell projection dimension. If 0, projections are not added. Default: 0

auto proj_size(int64_t &&new_proj_size) -> decltype(*this)
const int64_t &proj_size() const noexcept
int64_t &proj_size() noexcept
