Struct SmoothL1LossOptions

Struct Documentation

struct SmoothL1LossOptions

Options for the SmoothL1Loss module.

Example:

SmoothL1Loss model(SmoothL1LossOptions().reduction(torch::kNone).beta(0.5));

Public Types

typedef std::variant<enumtype::kNone, enumtype::kMean, enumtype::kSum> reduction_t
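
reduction_t is a std::variant over the three enum tag types, so a stored reduction can be inspected with the standard variant utilities. A minimal sketch (the helper name below is chosen here for illustration only):

#include <torch/torch.h>
#include <variant>

// Illustrative helper: true if the options currently select the 'mean' reduction.
bool uses_mean_reduction(const torch::nn::SmoothL1LossOptions& opts) {
  return std::holds_alternative<torch::enumtype::kMean>(opts.reduction());
}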

Public Functions

SmoothL1LossOptions() = default
inline SmoothL1LossOptions(torch::enumtype::kNone reduction)
inline SmoothL1LossOptions(torch::enumtype::kMean reduction)
inline SmoothL1LossOptions(torch::enumtype::kSum reduction)
inline auto reduction(const reduction_t &new_reduction) -> decltype(*this)

Specifies the reduction to apply to the output: ‘none’ | ‘mean’ | ‘sum’.

‘none’: no reduction will be applied; ‘mean’: the sum of the output will be divided by the number of elements in the output; ‘sum’: the output will be summed. Default: ‘mean’

inline auto reduction(reduction_t &&new_reduction) -> decltype(*this)
inline const reduction_t &reduction() const noexcept
inline reduction_t &reduction() noexcept
inline auto beta(const std::optional<double> &new_beta) -> decltype(*this)

Specifies the threshold at which to change between the L1 (absolute) and L2 (squared) portions of the loss; see the usage sketch after the function list below.

If beta is not specified, a value of 1.0 will be used. Default: nullopt

inline auto beta(std::optional<double> &&new_beta) -> decltype(*this)
inline const std::optional<double> &beta() const noexcept
inline std::optional<double> &beta() noexcept
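
For reference, a minimal sketch of how these options might be used with the torch::nn::SmoothL1Loss module; the tensor shapes, values, and variable names below are illustrative only.

#include <torch/torch.h>
#include <iostream>

int main() {
  using namespace torch::nn;

  // Inputs and targets of the same shape; requires_grad so we can backpropagate.
  auto input  = torch::randn({4, 8}, torch::requires_grad());
  auto target = torch::randn({4, 8});

  // 'none': keep the element-wise losses; the output has the same shape as the input.
  // beta = 0.5: differences smaller than 0.5 use the squared (L2) term,
  // larger differences use the absolute (L1) term.
  SmoothL1Loss elementwise(SmoothL1LossOptions().reduction(torch::kNone).beta(0.5));
  auto per_element = elementwise(input, target);  // shape [4, 8]

  // Default options: 'mean' reduction and beta = 1.0, producing a scalar loss.
  SmoothL1Loss averaged{SmoothL1LossOptions()};
  auto loss = averaged(input, target);            // 0-dimensional tensor
  loss.backward();

  std::cout << per_element.sizes() << " " << loss.item<double>() << std::endl;
  return 0;
}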
