python.data-structure
=========================
dictionary
^^^^^^^^^^

.. note::

    Tags: :doc:`python.data-structure <python.data-structure>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    import torch
    
    
    
    class Dictionary(torch.nn.Module):
        """
        Dictionary structures are inlined and flattened during tracing.
        """
        def __init__(self):
            super().__init__()
    
        def forward(self, x, y):
            elements = {}
            elements["x2"] = x * x
            y = y * elements["x2"]
            return {"y": y}
    

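A minimal sketch of a matching export call with ``torch.export.export``; the example inputs are assumptions inferred from the placeholder shapes in the result below (one ``f32[3, 2]`` tensor and one scalar ``i64`` tensor):

.. code-block:: python

    import torch

    # Assumed example inputs: a float tensor of shape (3, 2) and a scalar
    # int64 tensor, matching the placeholders shown in the result below.
    example_args = (torch.randn(3, 2), torch.tensor(4))

    ep = torch.export.export(Dictionary(), example_args)
    print(ep)

Note that the dictionary returned by ``forward`` is flattened away, so the exported graph returns a plain tuple.
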
Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: "f32[3, 2]", arg1_1: "i64[]"):
                    mul: "f32[3, 2]" = torch.ops.aten.mul.Tensor(arg0_1, arg0_1);  arg0_1 = None
                
                    mul_1: "f32[3, 2]" = torch.ops.aten.mul.Tensor(arg1_1, mul);  arg1_1 = mul = None
                return (mul_1,)
                
    Graph signature: ExportGraphSignature(input_specs=[InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg0_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg1_1'), target=None, persistent=None)], output_specs=[OutputSpec(kind=<OutputKind.USER_OUTPUT: 1>, arg=TensorArgument(name='mul_1'), target=None)])
    Range constraints: {}
    


fn_with_kwargs
^^^^^^^^^^^^^^

.. note::

    Tags: :doc:`python.data-structure <python.data-structure>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    import torch
    
    
    
    class FnWithKwargs(torch.nn.Module):
        """
        Keyword arguments are flattened into individual graph inputs during export.
        """
        def __init__(self):
            super().__init__()
    
        def forward(self, pos0, tuple0, *myargs, mykw0, **mykwargs):
            out = pos0
            for arg in tuple0:
                out = out * arg
            for arg in myargs:
                out = out * arg
            out = out * mykw0
            out = out * mykwargs["input0"] * mykwargs["input1"]
            return out
    

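A minimal sketch of a matching export call; every input is an assumption (1-D float tensors of size 4, mirroring the ``f32[4]`` placeholders in the result below). Extra positional tensors bind to ``*myargs`` and extra keywords are collected into ``**mykwargs``:

.. code-block:: python

    import torch

    # All example inputs are assumptions: 1-D float tensors of size 4.
    args = (
        torch.randn(4),                    # pos0
        (torch.randn(4), torch.randn(4)),  # tuple0
        torch.randn(4), torch.randn(4),    # captured by *myargs
    )
    kwargs = {
        "mykw0": torch.randn(4),
        "input0": torch.randn(4),          # collected into **mykwargs
        "input1": torch.randn(4),
    }

    ep = torch.export.export(FnWithKwargs(), args, kwargs)
    print(ep)

Every tensor in the nested argument structure is pytree-flattened into its own graph input, which is why the result below has eight ``f32[4]`` placeholders.
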
Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: "f32[4]", arg1_1: "f32[4]", arg2_1: "f32[4]", arg3_1: "f32[4]", arg4_1: "f32[4]", arg5_1: "f32[4]", arg6_1: "f32[4]", arg7_1: "f32[4]"):
                    mul: "f32[4]" = torch.ops.aten.mul.Tensor(arg0_1, arg1_1);  arg0_1 = arg1_1 = None
                mul_1: "f32[4]" = torch.ops.aten.mul.Tensor(mul, arg2_1);  mul = arg2_1 = None
                
                    mul_2: "f32[4]" = torch.ops.aten.mul.Tensor(mul_1, arg3_1);  mul_1 = arg3_1 = None
                mul_3: "f32[4]" = torch.ops.aten.mul.Tensor(mul_2, arg4_1);  mul_2 = arg4_1 = None
                
                    mul_4: "f32[4]" = torch.ops.aten.mul.Tensor(mul_3, arg5_1);  mul_3 = arg5_1 = None
                
                    mul_5: "f32[4]" = torch.ops.aten.mul.Tensor(mul_4, arg6_1);  mul_4 = arg6_1 = None
                mul_6: "f32[4]" = torch.ops.aten.mul.Tensor(mul_5, arg7_1);  mul_5 = arg7_1 = None
                return (mul_6,)
                
    Graph signature: ExportGraphSignature(input_specs=[InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg0_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg1_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg2_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg3_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg4_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg5_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg6_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg7_1'), target=None, persistent=None)], output_specs=[OutputSpec(kind=<OutputKind.USER_OUTPUT: 1>, arg=TensorArgument(name='mul_6'), target=None)])
    Range constraints: {}
    


list_contains
^^^^^^^^^^^^^

.. note::

    Tags: :doc:`torch.dynamic-shape <torch.dynamic-shape>`, :doc:`python.assert <python.assert>`, :doc:`python.data-structure <python.data-structure>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    import torch
    
    
    
    class ListContains(torch.nn.Module):
        """
        List containment checks can be performed on a dynamic shape or on constants.
        """
        def __init__(self):
            super().__init__()
    
        def forward(self, x):
            assert x.size(-1) in [6, 2]
            assert x.size(0) not in [4, 5, 6]
            assert "monkey" not in ["cow", "pig"]
            return x + x
    

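A minimal sketch of a matching export call; the ``(3, 2)`` input shape is an assumption taken from the ``f32[3, 2]`` placeholder in the result below. The assertions are evaluated on concrete values at trace time, so only the tensor addition remains in the graph:

.. code-block:: python

    import torch

    # The (3, 2) shape is an assumption: size(-1) == 2 is in [6, 2] and
    # size(0) == 3 is not in [4, 5, 6], so both assertions pass at trace time.
    ep = torch.export.export(ListContains(), (torch.randn(3, 2),))
    print(ep)
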
Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: "f32[3, 2]"):
                    add: "f32[3, 2]" = torch.ops.aten.add.Tensor(arg0_1, arg0_1);  arg0_1 = None
                return (add,)
                
    Graph signature: ExportGraphSignature(input_specs=[InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg0_1'), target=None, persistent=None)], output_specs=[OutputSpec(kind=<OutputKind.USER_OUTPUT: 1>, arg=TensorArgument(name='add'), target=None)])
    Range constraints: {}
    


list_unpack
^^^^^^^^^^^

.. note::

    Tags: :doc:`python.control-flow <python.control-flow>`, :doc:`python.data-structure <python.data-structure>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    from typing import List
    
    import torch
    
    
    
    class ListUnpack(torch.nn.Module):
        """
        Lists are treated as a static construct, therefore unpacking should be
        erased after tracing.
        """
    
        def __init__(self):
            super().__init__()
    
        def forward(self, args: List[torch.Tensor]):
            """
            Lists are treated as a static construct, therefore unpacking should be
            erased after tracing.
            """
            x, *y = args
            return x + y[0]
    

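A minimal sketch of a matching export call; the list contents are assumptions inferred from the placeholders in the result below (one ``f32[3, 2]`` tensor followed by two scalar ``i64`` tensors):

.. code-block:: python

    import torch

    # Assumed list contents, matching the placeholders in the result below.
    example_list = [torch.randn(3, 2), torch.tensor(1), torch.tensor(2)]

    ep = torch.export.export(ListUnpack(), (example_list,))
    print(ep)

Each list element is pytree-flattened into its own graph input, and the unpacking itself leaves no trace in the exported graph.
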
Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: "f32[3, 2]", arg1_1: "i64[]", arg2_1: "i64[]"):
                    add: "f32[3, 2]" = torch.ops.aten.add.Tensor(arg0_1, arg1_1);  arg0_1 = arg1_1 = None
                return (add,)
                
    Graph signature: ExportGraphSignature(input_specs=[InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg0_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg1_1'), target=None, persistent=None), InputSpec(kind=<InputKind.USER_INPUT: 1>, arg=TensorArgument(name='arg2_1'), target=None, persistent=None)], output_specs=[OutputSpec(kind=<OutputKind.USER_OUTPUT: 1>, arg=TensorArgument(name='add'), target=None)])
    Range constraints: {}