python.closure
==================
cond_closed_over_variable
^^^^^^^^^^^^^^^^^^^^^^^^^

.. note::

    Tags: :doc:`python.closure <python.closure>`, :doc:`torch.cond <torch.cond>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    import torch
    
    from functorch.experimental.control_flow import cond
    
    
    class CondClosedOverVariable(torch.nn.Module):
        """
        torch.cond() supports branches closed over arbitrary variables.
        """
    
        def forward(self, pred, x):
            def true_fn(val):
                return x * 2
    
            def false_fn(val):
                return x - 2
    
            return cond(pred, true_fn, false_fn, [x + 1])
    

Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: b8[], arg1_1: f32[3, 2]):
                # 
                add: f32[3, 2] = torch.ops.aten.add.Tensor(arg1_1, 1)
                submodule_0 = self.submodule_0
                submodule_1 = self.submodule_1
                cond: f32[3, 2] = torch.ops.higher_order.cond(arg0_1, submodule_0, submodule_1, [add, arg1_1, arg1_1]);  arg0_1 = submodule_0 = submodule_1 = add = arg1_1 = None
                return (cond,)
                
            class GraphModule(torch.nn.Module):
                def forward(self, arg0_1: f32[3, 2], arg1_1: f32[3, 2], arg2_1: f32[3, 2]):
                    mul: f32[3, 2] = torch.ops.aten.mul.Tensor(arg2_1, 2);  arg2_1 = None
                    return mul
                    
            class GraphModule(torch.nn.Module):
                def forward(self, arg0_1: f32[3, 2], arg1_1: f32[3, 2], arg2_1: f32[3, 2]):
                    sub: f32[3, 2] = torch.ops.aten.sub.Tensor(arg2_1, 2);  arg2_1 = None
                    return sub
                    
    Graph Signature: ExportGraphSignature(parameters=[], buffers=[], user_inputs=['arg0_1', 'arg1_1'], user_outputs=['cond'], inputs_to_parameters={}, inputs_to_buffers={}, buffers_to_mutate={}, backward_signature=None, assertion_dep_token=None)
    Symbol to range: {}
    

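A minimal sketch of how the result above might be reproduced and checked; the ``torch.export.export`` call and the example inputs (a boolean scalar predicate and a ``[3, 2]`` float tensor, chosen to match the shapes shown in the result) are assumptions, since the document does not show the export invocation.

.. code-block:: python

    import torch
    from functorch.experimental.control_flow import cond


    class CondClosedOverVariable(torch.nn.Module):
        def forward(self, pred, x):
            def true_fn(val):
                # `x` is closed over from forward(), not read from `val`
                return x * 2

            def false_fn(val):
                return x - 2

            return cond(pred, true_fn, false_fn, [x + 1])


    # Assumed example inputs, matching the b8[] and f32[3, 2] shapes above
    pred = torch.tensor(True)
    x = torch.randn(3, 2)

    ep = torch.export.export(CondClosedOverVariable(), (pred, x))

    # Both branches should agree with eager execution of the module
    torch.testing.assert_close(ep.module()(torch.tensor(True), x), x * 2)
    torch.testing.assert_close(ep.module()(torch.tensor(False), x), x - 2)

Note how the closed-over ``x`` appears in the exported graph as extra operands passed to ``torch.ops.higher_order.cond``, so both branch submodules receive it explicitly.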

nested_function
^^^^^^^^^^^^^^^

.. note::

    Tags: :doc:`python.closure <python.closure>`

    Support Level: SUPPORTED

Original source code:

.. code-block:: python

    import torch
    
    
    
    def nested_function(a, b):
        """
        Nested functions are traced through. Side effects on global captures
        are not supported though.
        """
        x = a + b
        z = a - b
    
        def closure(y):
            nonlocal x
            x += 1
            return x * y + z
    
        return closure(x)
    

Result:

.. code-block::

    ExportedProgram:
        class GraphModule(torch.nn.Module):
            def forward(self, arg0_1: f32[3, 2], arg1_1: f32[2]):
                # 
                add: f32[3, 2] = torch.ops.aten.add.Tensor(arg0_1, arg1_1)
                sub: f32[3, 2] = torch.ops.aten.sub.Tensor(arg0_1, arg1_1);  arg0_1 = arg1_1 = None
                add_1: f32[3, 2] = torch.ops.aten.add.Tensor(add, 1);  add = None
                mul: f32[3, 2] = torch.ops.aten.mul.Tensor(add_1, add_1);  add_1 = None
                add_2: f32[3, 2] = torch.ops.aten.add.Tensor(mul, sub);  mul = sub = None
                return (add_2,)
                
    Graph Signature: ExportGraphSignature(parameters=[], buffers=[], user_inputs=['arg0_1', 'arg1_1'], user_outputs=['add_2'], inputs_to_parameters={}, inputs_to_buffers={}, buffers_to_mutate={}, backward_signature=None, assertion_dep_token=None)
    Symbol to range: {}
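
A minimal sketch of reproducing the result above; the ``torch.nn.Module`` wrapper, the ``torch.export.export`` call, and the example input shapes (``f32[3, 2]`` and ``f32[2]``, taken from the result) are assumptions, since the document does not show the export invocation.

.. code-block:: python

    import torch


    def nested_function(a, b):
        x = a + b
        z = a - b

        def closure(y):
            nonlocal x
            x += 1
            return x * y + z

        return closure(x)


    class NestedFunctionModule(torch.nn.Module):
        # hypothetical wrapper: torch.export.export expects an nn.Module
        def forward(self, a, b):
            return nested_function(a, b)


    # Assumed example inputs, matching the f32[3, 2] and f32[2] shapes above
    a, b = torch.randn(3, 2), torch.randn(2)
    ep = torch.export.export(NestedFunctionModule(), (a, b))

    # The nonlocal mutation inside `closure` is flattened into the graph,
    # so the exported module agrees with eager execution
    torch.testing.assert_close(ep.module()(a, b), nested_function(a, b))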