ts.torch_handler package

Submodules

ts.torch_handler.base_handler module

Base default handler to load TorchScript or eager mode [state_dict] models. Also provides the handle method per the TorchServe custom model specification.

class ts.torch_handler.base_handler.BaseHandler[source]

Bases: ABC

Base default handler to load TorchScript or eager mode [state_dict] models. Also provides the handle method per the TorchServe custom model specification.

describe_handle()[source]

Customized describe handler

Returns:

A dictionary response.

Return type:

dict

explain_handle(data_preprocess, raw_data)[source]

Captum explanations handler

Parameters:
  • data_preprocess (Torch Tensor) – Preprocessed data to be used for captum

  • raw_data (list) – The unprocessed data to get target from the request

Returns:

A dictionary response containing the Captum explanations.

Return type:

dict
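As an illustration of how explain_handle is typically exercised, the sketch below sends a request to the TorchServe explanations endpoint with the Python requests library. The model name my_model, the input file name, and the host/port are assumptions for illustration only.

import requests

# Hypothetical model name and default TorchServe inference address (assumptions).
url = "http://127.0.0.1:8080/explanations/my_model"
with open("sample_input.txt", "rb") as f:   # hypothetical input file
    response = requests.post(url, data=f)
print(response.json())                       # dictionary with the Captum explanations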

get_device()[source]

Get the device on which the model is loaded.

Returns:

The handler's device.

Return type:

string

handle(data, context)[source]
Entry point for the default handler. It takes the data from the input request and returns the predicted outcome for the input.

Parameters:
  • data (list) – The input data on which a prediction is requested.

  • context (Context) – A JSON object containing information pertaining to the model artifact parameters.

Returns:

A list of dictionaries with the predicted response.

Return type:

list
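A minimal sketch of a custom handler built on BaseHandler: handle runs the preprocess, inference, and postprocess pipeline, so a custom handler usually only overrides those hooks. The class name and the payload parsing below are illustrative assumptions (the sketch assumes each request carries a list of numbers under "data" or "body").

import torch
from ts.torch_handler.base_handler import BaseHandler

class MyCustomHandler(BaseHandler):          # hypothetical handler name
    def preprocess(self, data):
        # Each batched request item carries its payload under "data" or "body".
        rows = [row.get("data") or row.get("body") for row in data]
        return torch.as_tensor(rows, dtype=torch.float32)   # shape/dtype are assumptions

    def postprocess(self, data):
        # TorchServe expects one response entry per request in the batch.
        return data.tolist()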

inference(*args, **kwargs)
initialize(context)[source]
Initialize function loads the model.pt file and initializes the model object.

First tries to load a TorchScript model, else loads an eager mode state_dict based model.

Parameters:
  • context (Context) – A JSON object containing information pertaining to the model artifact parameters.

Raises:

RuntimeError – Raised when the model.py file is missing

postprocess(*args, **kwargs)
preprocess(*args, **kwargs)
ts.torch_handler.base_handler.setup_ort_session(model_pt_path, map_location)[source]
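setup_ort_session is used when the model artifact is an ONNX file. The sketch below shows roughly how such a session might be created and queried with onnxruntime; the provider selection, model path, and commented call are assumptions rather than the exact implementation.

import onnxruntime as ort

model_path = "model.onnx"                     # hypothetical ONNX artifact
providers = (
    ["CUDAExecutionProvider", "CPUExecutionProvider"]
    if ort.get_device() == "GPU"
    else ["CPUExecutionProvider"]
)
session = ort.InferenceSession(model_path, providers=providers)
input_name = session.get_inputs()[0].name
# outputs = session.run(None, {input_name: input_array})   # input_array: a NumPy array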

ts.torch_handler.contractions module

Contraction map for text classification models.

ts.torch_handler.densenet_handler module

Module for image classification default handler

class ts.torch_handler.densenet_handler.DenseNetHandler[source]

Bases: object

DenseNetHandler handler class. This handler takes an image and returns the name of the object in that image.

handle(data, context)[source]

Entry point for the default handler

inference(data, *args, **kwargs)[source]

Override to customize the inference.

Parameters:
  • data – Torch tensor, matching the model input shape

Returns:

Prediction output as Torch tensor

initialize(context)[source]

First tries to load a TorchScript model, else loads an eager mode state_dict based model

ts.torch_handler.densenet_handler.list_classes_from_module(module, parent_class=None)[source]

Parse user defined module to get all model service classes in it.

Parameters:
  • module

  • parent_class

Returns:

List of model service class definitions
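A short usage sketch for list_classes_from_module; the user module name "model" and the parent class filter are assumptions.

import importlib
import torch.nn as nn
from ts.torch_handler.densenet_handler import list_classes_from_module

user_module = importlib.import_module("model")   # hypothetical user module
model_classes = list_classes_from_module(user_module, parent_class=nn.Module)
print([cls.__name__ for cls in model_classes])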

ts.torch_handler.image_classifier module

Module for image classification default handler

class ts.torch_handler.image_classifier.ImageClassifier[source]

Bases: VisionHandler

ImageClassifier handler class. This handler takes an image and returns the name of the object in that image.

get_max_result_classes()[source]
image_processing = Compose(
    Resize(size=256, interpolation=bilinear, max_size=None, antialias=True)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
)
postprocess(*args, **kwargs)
set_max_result_classes(topk)[source]
topk = 5
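ImageClassifier returns the five highest-scoring classes by default (topk = 5). To change that limit, a derived handler can override the class attribute; the subclass name below is a hypothetical example.

from ts.torch_handler.image_classifier import ImageClassifier

class Top10ImageClassifier(ImageClassifier):   # hypothetical subclass
    topk = 10                                  # return the ten highest-scoring classes

Alternatively, set_max_result_classes(10) can be called on an initialized handler to change the same limit at runtime.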

ts.torch_handler.image_segmenter module

Module for image segmentation default handler

class ts.torch_handler.image_segmenter.ImageSegmenter[source]

Bases: VisionHandler

ImageSegmenter handler class. This handler takes a batch of images and returns output shape as [N K H W], where N - batch size, K - number of classes, H - height and W - width.

image_processing = Compose(
    Resize(size=256, interpolation=bilinear, max_size=None, antialias=True)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
)
postprocess(data)[source]
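Since ImageSegmenter produces scores shaped [N, K, H, W], a per-pixel class mask can be obtained by taking the argmax over the class dimension. The sketch below uses dummy scores (21 classes, as in the torchvision segmentation models) purely for illustration.

import torch

scores = torch.randn(1, 21, 224, 224)   # dummy [N, K, H, W] output for illustration
mask = scores.argmax(dim=1)             # [N, H, W] tensor of per-pixel class indices
print(mask.shape)                       # torch.Size([1, 224, 224])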

ts.torch_handler.object_detector module

Module for object detection default handler

class ts.torch_handler.object_detector.ObjectDetector[source]

Bases: VisionHandler

ObjectDetector handler class. This handler takes an image and returns a list of detected classes and bounding boxes, respectively.

image_processing = Compose(
    ToTensor()
)
initialize(context)[source]
Initialize function loads the model.pt file and initializes the model object.

First tries to load a TorchScript model, else loads an eager mode state_dict based model.

Parameters:
  • context (Context) – A JSON object containing information pertaining to the model artifact parameters.

Raises:

RuntimeError – Raised when the model.py file is missing

postprocess(data)[source]
threshold = 0.5
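The class attribute threshold controls which detections are kept (the assumption here is that postprocess filters detections by score against this value). A derived handler can raise it to keep only high-confidence boxes; the subclass below is a hypothetical example.

from ts.torch_handler.object_detector import ObjectDetector

class StrictObjectDetector(ObjectDetector):   # hypothetical subclass
    threshold = 0.8                           # keep only detections scoring at least 0.8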

ts.torch_handler.text_classifier module

Module for the text classification default handler. DOES NOT SUPPORT BATCH!

class ts.torch_handler.text_classifier.TextClassifier[source]

Bases: TextHandler

TextClassifier handler class. This handler takes a text (string) as input and returns the classification text based on the model vocabulary.

get_insights(text_preprocess, _, target=0)[source]

Calculates the Captum insights

Parameters:
  • text_preprocess (tensor) – Tensor of the Text Input

  • _ (str) – The raw text data specified in the input request

  • target (int) – Defaults to 0, the user needs to specify the target for the captum explanation.

Returns:

Returns a dictionary of the word token importances

Return type:

(dict)

inference(data, *args, **kwargs)[source]

The Inference Request is made through this function and the user needs to override the inference function to customize it.

Parameters:

data (Torch Tensor) – The data is in the form of a Torch Tensor whose shape should match the model input shape.

Returns:

The predicted response from the model is returned in this function.

Return type:

(Torch Tensor)

ngrams = 2
postprocess(data)[source]
The post process function converts the prediction response into a TorchServe compatible format.

Parameters:
  • data (Torch Tensor) – The data parameter comes from the prediction output

  • output_explain (None) – Defaults to None.

Returns:

Returns the response containing the predictions and explanations (if the explanations endpoint is hit). It takes the form of a list of dictionaries.

Return type:

(list)

preprocess(data)[source]
Normalizes the input text for the PyTorch model using the following basic cleanup operations (illustrated in the sketch after this method):
  • remove html tags

  • lowercase all text

  • expand contractions [like I’d -> I would, don’t -> do not]

  • remove accented characters

  • remove punctuations

Converts the normalized text to tensor using the source_vocab.

Parameters:

data (str) – The input data is in the form of a string

Returns:

Text Tensor is returned after performing the pre-processing operations. (str): The raw input is also returned by this function

Return type:

(Tensor)
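A minimal sketch of the cleanup operations listed under preprocess (remove HTML tags, lowercase, strip accents and punctuation). It is illustrative only, not the handler's exact implementation, and contraction expansion is omitted because it relies on the contraction map from ts.torch_handler.contractions.

import re
import string
import unicodedata

def normalize_text_sketch(text: str) -> str:
    text = re.sub(r"<[^>]+>", " ", text)                   # remove html tags
    text = text.lower()                                    # lowercase all text
    text = unicodedata.normalize("NFKD", text)             # split off accent marks
    text = text.encode("ascii", "ignore").decode("ascii")  # drop accented characters
    text = text.translate(str.maketrans("", "", string.punctuation))   # remove punctuation
    return re.sub(r"\s+", " ", text).strip()

print(normalize_text_sketch("<p>Él did NOT like it!</p>"))  # "el did not like it"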

ts.torch_handler.text_handler module

Base module for all text based default handlers. Contains various text based utility methods.

class ts.torch_handler.text_handler.TextHandler[source]

Bases: BaseHandler, ABC

Base class for all text based default handlers. Contains various text based utility methods.

get_source_vocab_path(ctx)[source]
get_word_token(input_tokens)[source]

Constructs word tokens from text

initialize(context)[source]

Loads the model and initializes the necessary artifacts

summarize_attributions(attributions)[source]

Summarises the attribution across multiple runs
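A sketch of how attributions are commonly summarized in Captum examples (summing over the embedding dimension and L2-normalizing); the exact computation inside summarize_attributions is an assumption, not taken from the handler source.

import torch

def summarize_attributions_sketch(attributions: torch.Tensor) -> torch.Tensor:
    # Sum token attributions across the embedding dimension, then normalize
    # (common Captum pattern; assumed here for illustration).
    summed = attributions.sum(dim=-1).squeeze(0)
    return summed / torch.norm(summed)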

ts.torch_handler.vision_handler module

Base module for all vision handlers

class ts.torch_handler.vision_handler.VisionHandler[source]

Bases: BaseHandler, ABC

Base class for all vision handlers

get_insights(tensor_data, _, target=0)[source]
initialize(context)[source]
Initialize function loads the model.pt file and initializes the model object.

First tries to load a TorchScript model, else loads an eager mode state_dict based model.

Parameters:
  • context (Context) – A JSON object containing information pertaining to the model artifact parameters.

Raises:

RuntimeError – Raised when the model.py file is missing

preprocess(*args, **kwargs)
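Vision handlers apply the class-level image_processing transform during preprocessing, so a custom handler can typically swap in its own pipeline. The subclass name and transform values below are illustrative assumptions.

from torchvision import transforms
from ts.torch_handler.vision_handler import VisionHandler

class GrayscaleVisionHandler(VisionHandler):          # hypothetical subclass
    image_processing = transforms.Compose([
        transforms.Grayscale(num_output_channels=1),  # illustrative transform choice
        transforms.Resize(128),
        transforms.CenterCrop(112),
        transforms.ToTensor(),
    ])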

Module contents
