Class GRUImpl

Inheritance Relationships

Base Type

public torch::nn::detail::RNNImplBase<GRUImpl>

Class Documentation

class torch::nn::GRUImpl : public torch::nn::detail::RNNImplBase<GRUImpl>

A multi-layer gated recurrent unit (GRU) module.

See https://pytorch.org/docs/master/nn.html#torch.nn.GRU to learn about the exact behavior of this module.

See the documentation for torch::nn::GRUOptions class to learn what constructor arguments are supported for this module.

Example:

GRU model(GRUOptions(2, 4).num_layers(3).batch_first(false).bidirectional(true));

Public Functions

GRUImpl(int64_t input_size, int64_t hidden_size)
GRUImpl(const GRUOptions &options_)
std::tuple<Tensor, Tensor> forward(const Tensor &input, Tensor hx = {})
std::tuple<torch::nn::utils::rnn::PackedSequence, Tensor> forward_with_packed_input(const torch::nn::utils::rnn::PackedSequence &packed_input, Tensor hx = {})

Public Members

GRUOptions options

Protected Functions

bool _forward_has_default_args() override
unsigned int _forward_num_required_args() override
std::vector<torch::nn::AnyValue> _forward_populate_default_args(std::vector<torch::nn::AnyValue> &&arguments) override

The three functions above allow a module whose forward method has default arguments to be used in a Sequential module.

You should NEVER override these functions manually. Instead, you should use the FORWARD_HAS_DEFAULT_ARGS macro.

std::tuple<Tensor, Tensor> forward_helper(const Tensor &input, const Tensor &batch_sizes, const Tensor &sorted_indices, int64_t max_batch_size, Tensor hx)

Friends

friend struct torch::nn::AnyModuleHolder
