
torcheval.metrics.TopKMultilabelAccuracy

class torcheval.metrics.TopKMultilabelAccuracy(*, criteria: str = 'exact_match', k: int = 1, device: Optional[device] = None)[source]

Compute the multilabel accuracy score, i.e. the frequency with which the top-k predicted labels match the target. Its functional version is torcheval.metrics.functional.topk_multilabel_accuracy(). See also MulticlassAccuracy, BinaryAccuracy, MultilabelAccuracy.
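
As a quick illustration of the functional form, it can be called directly on a batch of predictions and targets; the keyword arguments below (criteria, k) are assumed to mirror the class constructor:

>>> import torch
>>> from torcheval.metrics.functional import topk_multilabel_accuracy
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0]])
>>> # Score the top-2 predictions per sample under the "hamming" criterion.
>>> topk_multilabel_accuracy(input, target, criteria="hamming", k=2)
tensor(0.5000)  # 3 / 6 label positions correct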

Parameters:
  • criteria (string) – how the set of top-k predicted labels is compared with the set of target labels (see the sketch after this list):
    • 'exact_match' [default]: The set of top-k labels predicted for a sample must exactly match the corresponding set of labels in target. Also known as subset accuracy.
    • 'hamming': Fraction of top-k correct labels over total number of labels.
    • 'overlap': The set of top-k labels predicted for a sample must overlap with the corresponding set of labels in target.
    • 'contain': The set of top-k labels predicted for a sample must contain the corresponding set of labels in target.
    • 'belong': The set of top-k labels predicted for a sample must (fully) belong to the corresponding set of labels in target.
  • k (int) – Number of top probabilities to be considered. k should be an integer greater than or equal to 1.
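
The five criteria reduce to simple set comparisons between the indices of the k highest scores and the indices of the positive target labels. The sketch below only illustrates those set semantics, it is not the library's implementation (the 'hamming' case, not shown, compares the one-hot top-k prediction with the target element-wise):

import torch

def topk_sets(input, k):
    # Indices of the k highest scores for each sample.
    return [set(row.tolist()) for row in input.topk(k, dim=1).indices]

def label_sets(target):
    # Indices of the positive labels for each sample.
    return [set(row.nonzero().flatten().tolist()) for row in target]

input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1]])
target = torch.tensor([[1, 1, 0], [0, 1, 0]])

for pred, true in zip(topk_sets(input, k=2), label_sets(target)):
    exact_match = pred == true        # sets are identical
    overlap = bool(pred & true)       # sets share at least one label
    contain = pred >= true            # prediction is a superset of the target
    belong = pred <= true             # prediction is a subset of the target
    print(pred, true, exact_match, overlap, contain, belong)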

Examples:

>>> import torch
>>> from torcheval.metrics import TopKMultilabelAccuracy
>>> metric = TopKMultilabelAccuracy(k = 2)
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1], [0.2, 0.4, 0.5], [0, 0.1, 0.9]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 1, 0]])
>>> metric.update(input, target)
>>> metric.compute()
tensor(0)  # 0 / 4

>>> metric = TopKMultilabelAccuracy(criteria="hamming", k=2)
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1], [0.2, 0.4, 0.5], [0, 0.1, 0.9]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 1, 0]])
>>> metric.update(input, target)
>>> metric.compute()
tensor(0.583)  # 7 / 12

>>> metric = TopKMultilabelAccuracy(criteria="overlap", k=2)
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1], [0.2, 0.4, 0.5], [0, 0.1, 0.9]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 1, 0]])
>>> metric.update(input, target)
>>> metric.compute()
tensor(1)  # 4 / 4

>>> metric = TopKMultilabelAccuracy(criteria="contain", k=2)
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1], [0.2, 0.4, 0.5], [0, 0.1, 0.9]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 1, 0]])
>>> metric.update(input, target)
>>> metric.compute()
tensor(0.5)  # 2 / 4

>>> metric = TopKMultilabelAccuracy(criteria="belong", k=2)
>>> input = torch.tensor([[0.1, 0.5, 0.2], [0.3, 0.2, 0.1], [0.2, 0.4, 0.5], [0, 0.1, 0.9]])
>>> target = torch.tensor([[1, 1, 0], [0, 1, 0], [1, 1, 1], [0, 1, 0]])
>>> metric.update(input, target)
>>> metric.compute()
tensor(0.25)  # 1 / 4

__init__(*, criteria: str = 'exact_match', k: int = 1, device: Optional[device] = None) → None[source]

Initialize a metric object and its internal states.

Use self._add_state() to initialize state variables of your metric class. The state variables should be a torch.Tensor, a list of torch.Tensor, or a dictionary with torch.Tensor values.
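
For context, the usual subclassing pattern looks roughly like the sketch below. It assumes the torcheval.metrics.Metric base class and its _add_state() helper; FractionPositive and its state names are made up for illustration:

import torch
from torcheval.metrics import Metric

class FractionPositive(Metric[torch.Tensor]):
    def __init__(self, *, device=None):
        super().__init__(device=device)
        # States registered via _add_state are reset, moved between devices,
        # and serialized by the base class.
        self._add_state("num_positive", torch.tensor(0.0, device=self.device))
        self._add_state("num_total", torch.tensor(0.0, device=self.device))

    @torch.inference_mode()
    def update(self, target):
        self.num_positive += target.sum()
        self.num_total += target.numel()
        return self

    @torch.inference_mode()
    def compute(self):
        return self.num_positive / self.num_total

    @torch.inference_mode()
    def merge_state(self, metrics):
        for m in metrics:
            self.num_positive += m.num_positive
            self.num_total += m.num_total
        return self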

Methods

__init__(*[, criteria, k, device]) – Initialize a metric object and its internal states.
compute() – Return the accuracy score.
load_state_dict(state_dict[, strict]) – Load metric state variables from state_dict.
merge_state(metrics) – Merge the state variables of the input metrics into the current metric's state.
reset() – Reset the metric state variables to their default values.
state_dict() – Save metric state variables in a state_dict.
to(device, *args, **kwargs) – Move tensors in metric state variables to device.
update(input, target) – Update states with the ground truth labels and predictions.
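
A rough sketch of how these methods fit together during evaluation; the shapes and random inputs below are placeholders:

import torch
from torcheval.metrics import TopKMultilabelAccuracy

# Two instances accumulating over different data shards.
metric_a = TopKMultilabelAccuracy(k=2)
metric_b = TopKMultilabelAccuracy(k=2)
metric_a.update(torch.rand(8, 3), torch.randint(2, (8, 3)))
metric_b.update(torch.rand(8, 3), torch.randint(2, (8, 3)))

# Fold metric_b's accumulated state into metric_a, then compute once.
metric_a.merge_state([metric_b])
score = metric_a.compute()

# Checkpoint and restore the accumulated state.
ckpt = metric_a.state_dict()
restored = TopKMultilabelAccuracy(k=2)
restored.load_state_dict(ckpt)

# Clear state before the next evaluation pass.
metric_a.reset()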

Attributes

device – The last input device of Metric.to().
