July 25, 2023
Announcing CPP-based S3 IO DataPipes
Training large deep learning models requires large datasets. Amazon Simple Storage Service (Amazon S3) is a scalable cloud object storage service used for storing large training datasets. Machine learning (ML) practitioners need an efficient data pipe that can download data from Amazon S3, transform the data, and feed the data to GPUs for training models with high throughput and low latency. In this post, we introduce the new S3 IO DataPipes for PyTorch: S3FileLister and S3FileLoader. For memo...
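As a rough illustration of how these DataPipes compose, here is a minimal sketch using torchdata's S3FileLister and S3FileLoader; the bucket and prefix are hypothetical, and it assumes a torchdata build with the S3 (AWSSDK) extension enabled and AWS credentials/region configured in the environment.

```python
from torchdata.datapipes.iter import IterableWrapper, S3FileLister, S3FileLoader

# Hypothetical S3 prefix; replace with your own bucket and path.
s3_prefixes = IterableWrapper(["s3://my-bucket/training-data/"])

# List object URLs under the prefix, then stream each object's bytes.
s3_urls = S3FileLister(s3_prefixes)
s3_files = S3FileLoader(s3_urls)  # yields (url, stream) pairs

for url, stream in s3_files:
    raw_bytes = stream.read()
    # decode / transform raw_bytes and feed into the training loop
```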
June 28, 2023
The Path to Achieve Ultra-Low Inference Latency With LLaMA 65B on PyTorch/XLA
Background & State of the Art
June 22, 2023
Optimized PyTorch 2.0 Inference with AWS Graviton processors
New generations of CPUs offer significant performance improvements in machine learning (ML) inference thanks to specialized built-in instructions. Combined with their flexibility, fast development cycles, and low operating cost, these general-purpose processors offer a viable alternative to existing specialized hardware for ML inference.
June 16, 2023
🎉 PyTorch Docathon H1 2023 Wrap-up 🎉
Thank you to all who participated in our first-ever PyTorch Docathon; the results have been nothing short of amazing! We want to extend our sincerest gratitude to all the participants who made this event a resounding success. Your passion, talent, and hard work have left an indelible mark on the PyTorch documentation.
June 07, 2023
Join the PyTorch Foundation: Membership Now Open
In September 2022, we welcomed PyTorch into the Linux Foundation from Meta, forming the PyTorch Foundation with founding members AMD, Amazon Web Services (AWS), Google, Meta, Microsoft, and NVIDIA.