
functorch.compile (experimental)

AOT Autograd is an experimental feature that allows ahead-of-time capture of forward and backward graphs, and easy integration with compilers. This creates an easy-to-hack Python-based development environment for speeding up the training of PyTorch models. AOT Autograd currently lives in the functorch.compile namespace.


Warning: AOT Autograd is experimental and the APIs are likely to change. We are looking for feedback. If you are interested in using AOT Autograd and need help or have suggestions, please feel free to open an issue. We will be happy to help.

Compilation APIs (experimental)


aot_function
Traces the forward and backward graphs of fn using the torch dispatch mechanism, and then compiles the generated forward and backward graphs through fw_compiler and bw_compiler.
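As a minimal sketch of the flow (the function fn, the printing compiler, and the input shape below are illustrative, not part of the API):

```python
import torch
from functorch.compile import aot_function

def fn(x):
    return x.sin().cos()

# A "compiler" here is any callable that takes an FX GraphModule plus
# example inputs and returns a callable; this one just prints the graph.
def print_compile(fx_module, example_inputs):
    print(fx_module.code)
    return fx_module

aot_fn = aot_function(fn, fw_compiler=print_compile, bw_compiler=print_compile)
x = torch.randn(4, requires_grad=True)
out = aot_fn(x)       # the captured forward graph is printed
out.sum().backward()  # the captured backward graph is printed
```

Because the compiler is just a Python callable, swapping in a real backend (or a quick debugging hook, as above) is a one-line change.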


aot_module
Traces the forward and backward graph of mod using the torch dispatch tracing mechanism.
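A sketch with an nn.Module (the module, helper compiler, and shapes are illustrative):

```python
import torch
from functorch.compile import aot_module

# An identity "compiler" that returns the traced graph module unchanged.
def noop_compile(fx_module, example_inputs):
    return fx_module

mod = torch.nn.Linear(4, 2)
aot_mod = aot_module(mod, fw_compiler=noop_compile)

x = torch.randn(3, 4)
out = aot_mod(x)
out.sum().backward()  # gradients land on mod's parameters as usual
```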


memory_efficient_fusion
Wrapper over aot_function() and aot_module() that performs memory-efficient fusion.
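A hedged usage sketch (fn and the shapes are illustrative; this wrapper is tuned for fuser backends such as nvFuser, so the speedup shows up mainly on GPU, and its min-cut partitioner relies on networkx being installed):

```python
import torch
from functorch.compile import memory_efficient_fusion

def fn(a, b):
    return (a + b).sin().sum()

# Returns a drop-in callable with the same signature as fn.
fused_fn = memory_efficient_fusion(fn)

a = torch.randn(8, requires_grad=True)
b = torch.randn(8, requires_grad=True)
loss = fused_fn(a, b)
loss.backward()
```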

Partitioners (experimental)


default_partition
Partitions the joint_module so that the split mirrors the behavior of the callable's original .forward() and .backward(): the resulting forward graph contains exactly those operators that are executed in the original .forward() callable passed to aot_function().
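A sketch of selecting this partitioner explicitly (fn and noop_compile are illustrative):

```python
import torch
from functorch.compile import aot_function, default_partition

def fn(x):
    return (x * x).sin()

def noop_compile(fx_module, example_inputs):
    return fx_module

# default_partition is already the default value of partition_fn;
# passing it explicitly just makes the choice of partitioner visible.
aot_fn = aot_function(
    fn,
    fw_compiler=noop_compile,
    bw_compiler=noop_compile,
    partition_fn=default_partition,
)
x = torch.randn(5, requires_grad=True)
out = aot_fn(x)
out.sum().backward()
```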


min_cut_rematerialization_partition
Partitions the joint graph such that the backward pass recomputes parts of the forward pass instead of saving them, trading extra compute for reduced memory.
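A sketch of passing this partitioner to aot_function() (fn and noop_compile are illustrative; this partitioner requires networkx to solve the underlying min-cut problem):

```python
import torch
from functorch.compile import aot_function, min_cut_rematerialization_partition

def fn(x):
    return x.cos().cos()

def noop_compile(fx_module, example_inputs):
    return fx_module

aot_fn = aot_function(
    fn,
    fw_compiler=noop_compile,
    bw_compiler=noop_compile,
    partition_fn=min_cut_rematerialization_partition,
)
x = torch.randn(4, requires_grad=True)
out = aot_fn(x)
out.sum().backward()
```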

Compilers (experimental)


nop
Returns the fx_g FX graph module unchanged.


ts_compile
Compiles fx_g with the TorchScript compiler.
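Both ready-made compilers plug directly into aot_function(); a sketch (fn and the input shape are illustrative):

```python
import torch
from functorch.compile import aot_function, nop, ts_compile

def fn(x):
    return x * 2 + 1

# nop returns the traced FX graph unchanged (useful for isolating
# AOT Autograd's own overhead); ts_compile lowers it through TorchScript.
aot_nop = aot_function(fn, fw_compiler=nop)
aot_ts = aot_function(fn, fw_compiler=ts_compile)

x = torch.randn(3, requires_grad=True)
out_nop = aot_nop(x)
out_ts = aot_ts(x)
```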