FeaturePyramidNetwork(in_channels_list: List[int], out_channels: int, extra_blocks: Optional[torchvision.ops.feature_pyramid_network.ExtraFPNBlock] = None, norm_layer: Optional[Callable[..., torch.nn.Module]] = None)
Module that adds an FPN on top of a set of feature maps. This is based on “Feature Pyramid Networks for Object Detection”.
The feature maps are currently expected to be in increasing depth order.
The input to the model is expected to be an OrderedDict[Tensor], containing the feature maps on top of which the FPN will be added.
in_channels_list (list[int]) – number of channels for each feature map that is passed to the module
out_channels (int) – number of channels of the FPN representation
extra_blocks (ExtraFPNBlock or None) – if provided, extra operations will be performed. It is expected to take the FPN features, the original features, and the names of the original features as input, and to return a new list of feature maps and their corresponding names
norm_layer (callable, optional) – Module specifying the normalization layer to use. Default: None
>>> m = torchvision.ops.FeaturePyramidNetwork([10, 20, 30], 5)
>>> # get some dummy data
>>> x = OrderedDict()
>>> x['feat0'] = torch.rand(1, 10, 64, 64)
>>> x['feat2'] = torch.rand(1, 20, 16, 16)
>>> x['feat3'] = torch.rand(1, 30, 8, 8)
>>> # compute the FPN on top of x
>>> output = m(x)
>>> print([(k, v.shape) for k, v in output.items()])
[('feat0', torch.Size([1, 5, 64, 64])),
 ('feat2', torch.Size([1, 5, 16, 16])),
 ('feat3', torch.Size([1, 5, 8, 8]))]
forward(x: Dict[str, torch.Tensor]) → Dict[str, torch.Tensor]
Computes the FPN for a set of feature maps.
x (OrderedDict[Tensor]) – feature maps for each feature level.
Returns – feature maps after FPN layers. They are ordered from highest resolution first.
Return type – OrderedDict[Tensor]
get_result_from_inner_blocks(x: torch.Tensor, idx: int) → torch.Tensor
This is equivalent to self.inner_blocks[idx](x), but TorchScript doesn't support this yet.
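To illustrate the workaround, the hypothetical module below (InnerBlocksDemo is not part of torchvision) applies the idx-th block by looping over a ModuleList instead of indexing it with a runtime integer, which is the pattern TorchScript requires:

```python
import torch
from torch import nn

class InnerBlocksDemo(nn.Module):
    """Hypothetical demo of the loop-instead-of-index pattern."""

    def __init__(self):
        super().__init__()
        # 1x1 lateral convs, as in an FPN with in_channels [10, 20, 30]
        # and out_channels 5
        self.inner_blocks = nn.ModuleList(
            [nn.Conv2d(c, 5, kernel_size=1) for c in (10, 20, 30)]
        )

    def get_result_from_inner_blocks(self, x: torch.Tensor, idx: int) -> torch.Tensor:
        # Equivalent to self.inner_blocks[idx](x), written as a loop so
        # that TorchScript can compile it.
        out = x
        for i, module in enumerate(self.inner_blocks):
            if i == idx:
                out = module(x)
        return out

m = InnerBlocksDemo()
x = torch.rand(1, 20, 16, 16)
print(m.get_result_from_inner_blocks(x, 1).shape)  # torch.Size([1, 5, 16, 16])
```

Only the block whose position matches idx is applied; the loop exists purely to avoid dynamic ModuleList indexing.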