
⚠️ Notice: Limited Maintenance

This project is no longer actively maintained. While existing releases remain available, there are no planned updates, bug fixes, new features, or security patches. Users should be aware that vulnerabilities may not be addressed.

TorchServe default inference handlers

TorchServe provides the following inference handlers out of the box. The models consumed by each handler are expected to support batched inference.

image_classifier

  • Description : Handles image classification models trained on the ImageNet dataset.

  • Input : RGB image

  • Output : Batch of the top 5 predictions and their respective probabilities for each image

For more details, see the examples.
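As an illustration, the snippet below sends an image to a running TorchServe instance and prints the top-5 predictions. The model name resnet-18, the host and port, and the file kitten.jpg are assumptions; substitute the values for your own deployment.

```python
import requests

# Assumes TorchServe is running locally and a model named "resnet-18"
# (hypothetical) has been registered with the image_classifier handler.
url = "http://localhost:8080/predictions/resnet-18"

with open("kitten.jpg", "rb") as f:
    response = requests.post(url, data=f)

# The image_classifier handler returns the top 5 classes and their
# probabilities as a JSON object, e.g. {"tabby": 0.42, ...}.
print(response.json())
```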

image_segmenter

  • Description : Handles image segmentation models trained on the ImageNet dataset.

  • Input : RGB image

  • Output : Output of shape [N, CL, H, W], where N is the batch size, CL is the number of classes, H is the height, and W is the width.

For more details, see the examples.
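To make the output shape concrete, the sketch below shows how a client might turn a segmentation output of shape [N, CL, H, W] into a per-pixel class map by taking the argmax over the class dimension. The tensor here is random stand-in data, not the handler's actual response payload.

```python
import torch

# Stand-in for a segmentation output of shape [N, CL, H, W]:
# N = batch size, CL = number of classes, H = height, W = width.
N, CL, H, W = 2, 21, 224, 224
output = torch.randn(N, CL, H, W)

# Per-pixel class labels: argmax over the class dimension (dim=1)
# yields an [N, H, W] map of class indices.
class_map = output.argmax(dim=1)
print(class_map.shape)  # torch.Size([2, 224, 224])
```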

object_detector

  • Description : Handles object detection models.

  • Input : RGB image

  • Output : Batch of lists of detected classes and their corresponding bounding boxes

Note: We recommend running torchvision>0.6; otherwise, the object_detector default handler will run only on the default GPU device.

For more details, see the examples.
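For reference, the snippet below posts an image to an object detection endpoint and prints the raw JSON response. The model name fasterrcnn and the file street.jpg are assumptions, and the exact structure of each detection entry (class label, bounding-box coordinates, score) depends on the deployed model and handler version.

```python
import requests

# Assumes a model named "fasterrcnn" (hypothetical) is registered with
# the object_detector handler on a local TorchServe instance.
url = "http://localhost:8080/predictions/fasterrcnn"

with open("street.jpg", "rb") as f:
    detections = requests.post(url, data=f).json()

# Each entry is expected to describe one detected object (class label,
# bounding box, and score); print them as returned by the server.
for det in detections:
    print(det)
```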

text_classifier

  • Description : Handles models trained on the AG_NEWS dataset.

  • Input : text file

  • Output : Class of the input text. (Batching is not supported.)

For more details, see the examples.
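As a sketch, the call below sends a piece of text to a text_classifier endpoint; since batching is not supported, each request carries a single text input. The model name my_text_classifier and the sample sentence are hypothetical.

```python
import requests

# Assumes a model named "my_text_classifier" (hypothetical) is registered
# with the text_classifier handler; the request body is plain text.
url = "http://localhost:8080/predictions/my_text_classifier"

sample = "Stocks rallied on Wednesday after the central bank's announcement."
response = requests.post(url, data=sample.encode("utf-8"))

# The handler returns the predicted class for the input text.
print(response.json())
```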

For a more comprehensive list of available handlers, make sure to check out the examples page.

Common features

index_to_name.json

image_classifier, text_classifier, and object_detector can all automatically map from numeric classes (0, 1, 2, …) to friendly strings. To do this, simply include in your model archive a file named index_to_name.json that contains a mapping of class number (as a string) to friendly name (also as a string). You can see some examples of this file in the examples.
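For illustration, a minimal index_to_name.json for a hypothetical three-class model might look like this; both keys and values are strings, and the class names here are made up.

```json
{
    "0": "cat",
    "1": "dog",
    "2": "bird"
}
```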

Contributing

We welcome new contributed handlers. If your use case isn't covered by one of the existing default handlers, please follow the steps below to contribute one.

  1. Write a new class derived from BaseHandler and add it as a separate file in ts/torch_handler/ (a minimal sketch follows this list).

  2. Update model-archiver/model_packaging.py to add your class's name.

  3. Run and update the unit tests in unit_tests. As always, make sure to run torchserve_sanity.py before submitting.
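The sketch below shows the general shape of a custom handler derived from BaseHandler, assuming a model that takes a float tensor and returns class scores. The class name, payload format, and pre/post-processing are illustrative assumptions, not a prescribed implementation.

```python
import torch
from ts.torch_handler.base_handler import BaseHandler


class MyCustomHandler(BaseHandler):
    """Illustrative custom handler (hypothetical); adapt preprocess and
    postprocess to your model's actual input and output formats."""

    def preprocess(self, data):
        # Each request in the batch arrives as a dict with a "data" or "body" key.
        inputs = []
        for row in data:
            payload = row.get("data") or row.get("body")
            # Assumption: the payload is a JSON-like list of floats.
            inputs.append(torch.tensor(payload, dtype=torch.float32))
        return torch.stack(inputs)

    def postprocess(self, inference_output):
        # Return one result per request in the batch,
        # here the predicted class index.
        return inference_output.argmax(dim=1).tolist()
```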
