
torch::deploy [Beta]

torch::deploy is a system that lets you load multiple Python interpreters, each executing PyTorch models, within a single C++ process. In effect, it allows you to run PyTorch models from multiple threads without contending on a single global interpreter lock. For more information on how torch::deploy works, please see the related arXiv paper. We plan to further generalize torch::deploy into a more generic system, multipy::runtime, which is suitable for arbitrary Python programs rather than just PyTorch applications.
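As a rough sketch, the C++ API centers on an InterpreterManager that owns a pool of interpreters, and a ReplicatedObj that dispatches calls onto a free interpreter. The snippet below is a minimal illustration, assuming a model has already been saved with torch.package as "model.pt" containing a pickle "model/model.pkl"; the path and package contents are hypothetical placeholders, and the exact headers and signatures may differ by MultiPy version.

```cpp
#include <multipy/runtime/deploy.h>
#include <torch/torch.h>

#include <iostream>
#include <vector>

int main() {
  // Create a pool of 4 Python interpreters inside this single C++ process.
  torch::deploy::InterpreterManager manager(4);

  try {
    // Load a torch.package archive; path and pickle names are placeholders.
    torch::deploy::Package package = manager.loadPackage("model.pt");
    torch::deploy::ReplicatedObj model =
        package.loadPickle("model", "model.pkl");

    // Each call acquires a free interpreter from the pool, so calls from
    // different C++ threads can execute Python code concurrently.
    std::vector<torch::jit::IValue> inputs{torch::ones({1, 10})};
    at::Tensor output = model(inputs).toTensor();
    std::cout << "output shape: " << output.sizes() << std::endl;
  } catch (const std::exception& e) {
    std::cerr << "failed to run model: " << e.what() << std::endl;
    return 1;
  }
  return 0;
}
```

Because ReplicatedObj hands each invocation to whichever interpreter is free, the same object can be shared across C++ threads, which is what makes multithreaded serving of Python-defined models possible.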

Documentation

Acknowledgements

This documentation website for the MultiPy C++ API has been enabled by the Exhale project and a generous investment of time and effort by its maintainer, svenevs. We thank Stephen for his work and for his help with both the PyTorch and MultiPy C++ documentation.
