StorageEnsemble

class torchrl.data.replay_buffers.StorageEnsemble(*storages: Storage, transforms: Optional[List[Transform]] = None)[source]

An ensemble of storages.

This class is designed to work with ReplayBufferEnsemble.

Parameters:

storages (sequence of Storage) – the storages that make up the composite storage.

Keyword Arguments:

transforms (list of Transform, optional) – a list of transforms of the same length as storages.
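
Example (a minimal sketch): the field names ("obs", "reward"), shapes, and sizes below are illustrative assumptions, not part of the API. In practice, a StorageEnsemble is typically assembled for you when a ReplayBufferEnsemble is built from existing buffers; direct construction looks like this.

>>> import torch
>>> from tensordict import TensorDict
>>> from torchrl.data.replay_buffers import LazyTensorStorage, StorageEnsemble
>>> storage_a = LazyTensorStorage(100)
>>> storage_b = LazyTensorStorage(100)
>>> # Populate each storage independently: the ensemble itself is read-only.
>>> storage_a.set(
...     torch.arange(10),
...     TensorDict({"obs": torch.randn(10, 4), "reward": torch.randn(10, 1)}, [10]),
... )
>>> storage_b.set(
...     torch.arange(5),
...     TensorDict({"obs": torch.randn(5, 4), "reward": torch.randn(5, 1)}, [5]),
... )
>>> ensemble = StorageEnsemble(storage_a, storage_b)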

Warning

This class's signature for get() does not match that of other storages, as it returns a tuple (buffer_id, samples) rather than just the samples.

Warning

This class does not support writing (similarly to WriterEnsemble). To extend one of the replay buffers, simply index the parent ReplayBufferEnsemble object.
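
Example (a minimal sketch of the documented write path): rather than writing through the StorageEnsemble, index the parent ReplayBufferEnsemble and extend the selected sub-buffer. Field names and shapes are illustrative assumptions.

>>> import torch
>>> from tensordict import TensorDict
>>> from torchrl.data import LazyTensorStorage, ReplayBuffer, ReplayBufferEnsemble
>>> rb0 = ReplayBuffer(storage=LazyTensorStorage(100), batch_size=4)
>>> rb1 = ReplayBuffer(storage=LazyTensorStorage(100), batch_size=4)
>>> rb_ensemble = ReplayBufferEnsemble(rb0, rb1)
>>> data = TensorDict({"obs": torch.randn(8, 4), "reward": torch.randn(8, 1)}, [8])
>>> # Indexing the ensemble selects the sub-buffer; extend writes to its storage.
>>> index = rb_ensemble[0].extend(data)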

attach(buffer: Any) → None

This function attaches a sampler to this storage.

Buffers that read from this storage must be included as an attached entity by calling this method. This guarantees that when the data in the storage changes, all attached components are made aware of it, even if the storage is shared with other buffers (e.g., priority samplers).

Parameters:

buffer – the object that reads from this storage.
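
Example (a minimal sketch): two buffers sharing one storage, with the second buffer registered explicitly via attach() so it can be notified when the shared data changes. Whether a given buffer already attaches itself on construction is an assumption not checked here; the explicit call is shown for clarity, and the field name "obs" is illustrative.

>>> import torch
>>> from tensordict import TensorDict
>>> from torchrl.data import LazyTensorStorage, ReplayBuffer
>>> shared_storage = LazyTensorStorage(100)
>>> writer_buffer = ReplayBuffer(storage=shared_storage, batch_size=4)
>>> reader_buffer = ReplayBuffer(storage=shared_storage, batch_size=4)
>>> # Register the second buffer as an attached entity of the shared storage.
>>> shared_storage.attach(reader_buffer)
>>> # Writes through the first buffer now also concern the attached reader.
>>> index = writer_buffer.extend(TensorDict({"obs": torch.randn(8, 4)}, [8]))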
