Function at::rrelu

Function Documentation

inline at::Tensor at::rrelu(const at::Tensor &self, const at::Scalar &lower = 0.125, const at::Scalar &upper = 0.3333333333333333, bool training = false, ::std::optional<at::Generator> generator = ::std::nullopt)
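This generated page carries no prose description, so the following stdlib-only sketch illustrates the element-wise computation `at::rrelu` performs, assuming the standard RReLU (randomized leaky ReLU) semantics: with `training = true`, each negative element is scaled by a slope drawn uniformly from `[lower, upper]` via the supplied generator; with `training = false`, the fixed slope `(lower + upper) / 2` is used. The function name `rrelu_sketch` is illustrative, not part of ATen.

```cpp
#include <random>
#include <vector>

// Sketch of RReLU over a flat vector, mirroring at::rrelu's defaults
// (lower = 1/8, upper = 1/3, training = false). Assumption: the seed
// parameter stands in for the optional at::Generator argument.
std::vector<double> rrelu_sketch(const std::vector<double>& x,
                                 double lower = 0.125,
                                 double upper = 1.0 / 3.0,
                                 bool training = false,
                                 unsigned seed = 0) {
    std::mt19937 gen(seed);
    std::uniform_real_distribution<double> dist(lower, upper);
    std::vector<double> out;
    out.reserve(x.size());
    for (double v : x) {
        if (v >= 0.0) {
            // Non-negative inputs pass through unchanged.
            out.push_back(v);
        } else {
            // Negative inputs: random slope in training,
            // the mean slope (lower + upper) / 2 in evaluation.
            double slope = training ? dist(gen) : (lower + upper) / 2.0;
            out.push_back(v * slope);
        }
    }
    return out;
}
```

With the defaults, evaluation mode is deterministic: every negative element is multiplied by `(0.125 + 1/3) / 2 ≈ 0.2292`.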
