tensordict.nn.distributions.NormalParamExtractor

class tensordict.nn.distributions.NormalParamExtractor(scale_mapping: str = 'biased_softplus_1.0', scale_lb: Number = 0.0001)

A non-parametric nn.Module that splits its input into loc and scale parameters.

The scale parameters are mapped onto positive values using the specified scale_mapping.

Parameters:
  • scale_mapping (str, optional) – positive mapping function to be used with the std. Default is "biased_softplus_1.0" (i.e. a softplus map shifted so that fn(0.0) = 1.0). Choices: "softplus", "exp", "relu", "biased_softplus_1.0", or "none" (no mapping). See mappings() for more details.

  • scale_lb (Number, optional) – The minimum value that the scale can take after mapping. Default is 1e-4.
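
The splitting and mapping behavior can be sketched in plain torch as follows; biased_softplus here is an illustrative reimplementation of the default mapping (a softplus shifted so that its value at 0 equals the bias), not the library's own code:

```python
import torch
from torch.nn.functional import softplus

def biased_softplus(x, bias=1.0, min_val=1e-4):
    # Softplus shifted so that biased_softplus(0.0) == bias,
    # then clamped from below (the role of scale_lb).
    offset = torch.as_tensor(float(bias)).expm1().log()  # inverse softplus of bias
    return softplus(x + offset).clamp_min(min_val)

# The extractor splits the last dimension in half:
# first half -> loc, second half -> (mapped) scale.
params = torch.randn(5, 8)
loc, scale = params.chunk(2, dim=-1)
scale = biased_softplus(scale)
print(loc.shape, scale.shape)  # torch.Size([5, 4]) torch.Size([5, 4])
print((scale > 0).all())       # tensor(True)
```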

Examples

>>> import torch
>>> from tensordict.nn.distributions import NormalParamExtractor
>>> from torch import nn
>>> module = nn.Linear(3, 4)
>>> normal_params = NormalParamExtractor()
>>> tensor = torch.randn(3)
>>> loc, scale = normal_params(module(tensor))
>>> print(loc.shape, scale.shape)
torch.Size([2]) torch.Size([2])
>>> assert (scale > 0).all()
>>> # with modules that return more than one tensor
>>> module = nn.LSTM(3, 4)
>>> tensor = torch.randn(4, 2, 3)
>>> loc, scale, others = normal_params(*module(tensor))
>>> print(loc.shape, scale.shape)
torch.Size([4, 2, 2]) torch.Size([4, 2, 2])
>>> assert (scale > 0).all()
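
In practice the extracted loc and scale typically parameterize a torch.distributions.Normal. The torch-only sketch below emulates the extractor with chunk and a plain softplus (an assumption standing in for NormalParamExtractor, so the snippet carries no tensordict dependency):

```python
import torch
from torch import nn
from torch.distributions import Normal

# A linear head whose output is split into loc / scale halves
net = nn.Linear(3, 4)
out = net(torch.randn(10, 3))
loc, raw_scale = out.chunk(2, dim=-1)
# Plain softplus + lower bound stands in for the default biased mapping
scale = torch.nn.functional.softplus(raw_scale).clamp_min(1e-4)

dist = Normal(loc, scale)
sample = dist.rsample()             # reparameterized, differentiable w.r.t. net
print(sample.shape)                 # torch.Size([10, 2])
print(dist.log_prob(sample).shape)  # torch.Size([10, 2])
```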
