alexnet

torchvision.models.alexnet(*, weights: Optional[torchvision.models.alexnet.AlexNet_Weights] = None, progress: bool = True, **kwargs: Any) → torchvision.models.alexnet.AlexNet

    AlexNet model architecture from One weird trick for parallelizing convolutional neural networks.
Note
AlexNet was originally introduced in the ImageNet Classification with Deep Convolutional Neural Networks paper. Our implementation is based instead on the “One weird trick” paper above.
Parameters:
    weights (AlexNet_Weights, optional) – The pretrained weights to use. See AlexNet_Weights below for more details and possible values. By default, no pre-trained weights are used.
    progress (bool, optional) – If True, displays a progress bar of the download to stderr. Default is True.
    **kwargs – parameters passed to the torchvision.models.alexnet.AlexNet base class. Please refer to the source code for more details about this class.
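For example (a minimal sketch, assuming torchvision >= 0.13, where the weights enums are available):

    import torch
    from torchvision.models import alexnet, AlexNet_Weights

    # Build AlexNet with the ImageNet-1K pretrained weights.
    model = alexnet(weights=AlexNet_Weights.IMAGENET1K_V1)
    model.eval()

    # Forward a dummy batch of one 3x224x224 image.
    x = torch.rand(1, 3, 224, 224)
    with torch.no_grad():
        logits = model(x)
    print(logits.shape)  # torch.Size([1, 1000])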
class torchvision.models.AlexNet_Weights(value)

    The model builder above accepts the following values as the weights parameter. AlexNet_Weights.DEFAULT is equivalent to AlexNet_Weights.IMAGENET1K_V1. You can also use strings, e.g. weights='DEFAULT' or weights='IMAGENET1K_V1'.
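The three spellings below load the same checkpoint (a short sketch under the same torchvision >= 0.13 assumption):

    from torchvision.models import alexnet, AlexNet_Weights

    m1 = alexnet(weights=AlexNet_Weights.IMAGENET1K_V1)
    m2 = alexnet(weights=AlexNet_Weights.DEFAULT)  # DEFAULT aliases IMAGENET1K_V1
    m3 = alexnet(weights="IMAGENET1K_V1")          # the string form is also accepted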
AlexNet_Weights.IMAGENET1K_V1:

    These weights reproduce closely the results of the paper using a simplified training recipe. Also available as AlexNet_Weights.DEFAULT.

    acc@1 (on ImageNet-1K): 56.522
    acc@5 (on ImageNet-1K): 79.066
    num_params: 61100840
    min_size: height=63, width=63
    categories: tench, goldfish, great white shark, … (997 omitted)
    recipe:
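These values are also exposed programmatically through the enum's meta dictionary (a sketch; exact key names can vary between torchvision versions):

    from torchvision.models import AlexNet_Weights

    meta = AlexNet_Weights.IMAGENET1K_V1.meta
    print(meta["num_params"])      # 61100840
    print(meta["categories"][:3])  # ['tench', 'goldfish', 'great white shark']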
The inference transforms are available at AlexNet_Weights.IMAGENET1K_V1.transforms and perform the following preprocessing operations: Accepts PIL.Image, batched (B, C, H, W) and single (C, H, W) image torch.Tensor objects. The images are resized to resize_size=[256] using interpolation=InterpolationMode.BILINEAR, followed by a central crop of crop_size=[224]. Finally, the values are first rescaled to [0.0, 1.0] and then normalized using mean=[0.485, 0.456, 0.406] and std=[0.229, 0.224, 0.225].
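A minimal end-to-end sketch using these transforms ("dog.jpg" is a placeholder path; substitute any RGB image):

    import torch
    from PIL import Image
    from torchvision.models import alexnet, AlexNet_Weights

    weights = AlexNet_Weights.IMAGENET1K_V1
    model = alexnet(weights=weights).eval()

    # The weights entry bundles the matching inference transforms.
    preprocess = weights.transforms()

    img = Image.open("dog.jpg")            # placeholder image path
    batch = preprocess(img).unsqueeze(0)   # (C, H, W) -> (1, C, H, W)

    with torch.no_grad():
        logits = model(batch)
    class_id = logits.squeeze(0).softmax(dim=0).argmax().item()
    print(weights.meta["categories"][class_id])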