alexnet

torchvision.models.alexnet(*, weights: Optional[torchvision.models.alexnet.AlexNet_Weights] = None, progress: bool = True, **kwargs: Any) → torchvision.models.alexnet.AlexNet  [source]

AlexNet model architecture from One weird trick for parallelizing convolutional neural networks.

Note

AlexNet was originally introduced in the ImageNet Classification with Deep Convolutional Neural Networks paper. Our implementation is based instead on the “One weird trick” paper above.

Parameters
  • weights (AlexNet_Weights, optional) – The pretrained weights to use. See AlexNet_Weights below for more details, and possible values. By default, no pre-trained weights are used.

  • progress (bool, optional) – If True, displays a progress bar of the download to stderr. Default is True.

  • **kwargs – parameters passed to the torchvision.models.alexnet.AlexNet base class. Please refer to the source code for more details about this class.
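
A minimal usage sketch of the builder, constructing the model with and without pretrained weights:

    from torchvision.models import alexnet, AlexNet_Weights

    # Pretrained on ImageNet-1K; the weight file is downloaded on first use,
    # and progress=True shows a download progress bar on stderr.
    model = alexnet(weights=AlexNet_Weights.IMAGENET1K_V1, progress=True)
    model.eval()

    # Randomly initialized model: omit the weights argument or pass weights=None.
    scratch = alexnet(weights=None)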

class torchvision.models.AlexNet_Weights(value)  [source]

The model builder above accepts the following values as the weights parameter. AlexNet_Weights.DEFAULT is equivalent to AlexNet_Weights.IMAGENET1K_V1. You can also use strings, e.g. weights='DEFAULT' or weights='IMAGENET1K_V1'.
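
For instance, the following calls all load the same pretrained weights (a short sketch):

    from torchvision.models import alexnet, AlexNet_Weights

    # Enum member and string aliases are interchangeable.
    m1 = alexnet(weights=AlexNet_Weights.DEFAULT)
    m2 = alexnet(weights="DEFAULT")
    m3 = alexnet(weights="IMAGENET1K_V1")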

AlexNet_Weights.IMAGENET1K_V1:

These weights closely reproduce the results of the paper using a simplified training recipe. Also available as AlexNet_Weights.DEFAULT.

acc@1 (on ImageNet-1K)    56.522
acc@5 (on ImageNet-1K)    79.066
num_params                61100840
min_size                  height=63, width=63
categories                tench, goldfish, great white shark, … (997 omitted)
recipe                    link
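
The metadata in the table above is also exposed programmatically through the meta dictionary of the enum member; a short sketch (the exact key names can vary between torchvision versions, so treat them as an assumption):

    from torchvision.models import AlexNet_Weights

    weights = AlexNet_Weights.DEFAULT  # alias for IMAGENET1K_V1

    print(weights.meta["num_params"])      # 61100840
    print(weights.meta["min_size"])        # assumed to be (63, 63)
    print(weights.meta["categories"][:3])  # first three ImageNet class names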

The inference transforms are available at AlexNet_Weights.IMAGENET1K_V1.transforms and perform the following preprocessing operations. They accept PIL.Image objects as well as batched (B, C, H, W) and single (C, H, W) image torch.Tensor objects. The images are resized to resize_size=[256] using interpolation=InterpolationMode.BILINEAR, followed by a central crop of crop_size=[224]. Finally, the values are rescaled to [0.0, 1.0] and then normalized using mean=[0.485, 0.456, 0.406] and std=[0.229, 0.224, 0.225].
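
An end-to-end inference sketch using these transforms (the image path "dog.jpg" is a hypothetical placeholder):

    import torch
    from PIL import Image
    from torchvision.models import alexnet, AlexNet_Weights

    weights = AlexNet_Weights.IMAGENET1K_V1
    model = alexnet(weights=weights)
    model.eval()

    # Resize, center-crop, rescale and normalize exactly as described above.
    preprocess = weights.transforms()

    img = Image.open("dog.jpg")           # hypothetical input image
    batch = preprocess(img).unsqueeze(0)  # shape (1, 3, 224, 224)

    with torch.no_grad():
        logits = model(batch)

    class_id = int(logits.squeeze(0).softmax(dim=0).argmax())
    print(weights.meta["categories"][class_id])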
