Recipes Overview

Recipes are the primary entry points for torchtune users. They can be thought of as hackable, singularly focused scripts for interacting with LLMs, including fine-tuning, inference, evaluation, and quantization.

Each recipe consists of three components:

  • Configurable parameters, specified through yaml configs and command-line overrides

  • Recipe script, the entry point that ties everything together, including parsing and validating configs, setting up the environment, and correctly using the recipe class

  • Recipe class, the core logic needed for fine-tuning, exposed through a set of APIs (see the sketch after this list)
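
To make the three components concrete, here is a minimal sketch of how a recipe script and recipe class fit together. The class name FullFinetuneRecipe and its setup/train/cleanup methods are illustrative of the pattern rather than a copy of any built-in recipe; config.parse is torchtune's utility for combining a YAML config with command-line overrides.

```python
# A minimal sketch of how the three pieces fit together. The class name
# FullFinetuneRecipe and its setup/train/cleanup methods illustrate the
# pattern; they are not a copy of any built-in recipe.
from omegaconf import DictConfig

from torchtune import config


class FullFinetuneRecipe:
    """Recipe class: the core fine-tuning logic behind a small API."""

    def __init__(self, cfg: DictConfig) -> None:
        self.cfg = cfg

    def setup(self) -> None:
        # Instantiate the tokenizer, model, dataset, optimizer, etc. from the config.
        ...

    def train(self) -> None:
        # Run the training loop, logging metrics and saving checkpoints.
        ...

    def cleanup(self) -> None:
        # Release resources (e.g. tear down the process group when distributed).
        ...


# Recipe script: the entry point that parses the YAML config plus any
# command-line overrides, then drives the recipe class.
@config.parse
def recipe_main(cfg: DictConfig) -> None:
    recipe = FullFinetuneRecipe(cfg=cfg)
    recipe.setup()
    recipe.train()
    recipe.cleanup()


if __name__ == "__main__":
    recipe_main()
```

For the built-in recipes, a script following this shape is what the tune CLI invokes, with config overrides appended as key=value pairs on the command line.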

Note

To learn more about the concept of “recipes”, check out our technical deep-dive: What Are Recipes?.

Supervised Finetuning

torchtune provides built-in recipes for fine-tuning on a single device and on multiple devices with FSDP, with a variety of memory optimization features. Our fine-tuning recipes support all of our models and all of our dataset types. This includes continued pre-training and various supervised fine-tuning paradigms, which can be customized through our datasets, as sketched below. Check out our dataset tutorial for more information.
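
The same recipe can serve different paradigms by swapping the dataset it is configured with. The sketch below is a hedged example of building one instruction-tuning dataset and one plain-text dataset for continued pre-training in Python; in a recipe you would typically point the YAML config's dataset component at the equivalent builder instead. The builder names come from torchtune, but the paths and arguments shown are placeholder assumptions.

```python
# A hedged sketch: the builder names come from torchtune, but the file paths
# and dataset arguments below are placeholders - check the dataset tutorial
# for the authoritative signatures.
from torchtune.datasets import alpaca_dataset, text_completion_dataset
from torchtune.models.llama3 import llama3_tokenizer

tokenizer = llama3_tokenizer("/path/to/tokenizer.model")  # placeholder path

# Instruction-style data for supervised fine-tuning.
sft_ds = alpaca_dataset(tokenizer=tokenizer)

# Unstructured text for continued pre-training (assumes a local plain-text file).
cpt_ds = text_completion_dataset(
    tokenizer=tokenizer,
    source="text",
    data_files="/path/to/my_corpus.txt",  # placeholder path
    column="text",
    split="train",
)
```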

Our supervised fine-tuning recipes include:

Note

Want to learn more about a certain recipe, but can’t find the documentation here? Not to worry! Our recipe documentation is currently under construction - come back soon to see documentation of your favorite fine-tuning techniques. We’d love to support your contributions if you’re interested in helping out here. Check out our tracker issue here.
