Class Library¶
Defined in File library.h
Class Documentation¶
-
class Library¶
This object provides the API for defining operators and providing implementations at dispatch keys.
Typically, a torch::Library is not allocated directly; instead it is created by the TORCH_LIBRARY() or TORCH_LIBRARY_IMPL() macros.
Most methods on torch::Library return a reference to itself, supporting method chaining.
// Examples:
TORCH_LIBRARY(torchvision, m) {
   // m is a torch::Library
   m.def("roi_align", ...);
   ...
}

TORCH_LIBRARY_IMPL(aten, XLA, m) {
   // m is a torch::Library
   m.impl("add", ...);
   ...
}
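Because registration methods return the library by reference, calls can be chained. A minimal sketch, using a hypothetical myops namespace and operator schemas:

TORCH_LIBRARY(myops, m) {
  m.def("foo(Tensor self) -> Tensor")
   .def("bar(Tensor self, Tensor other) -> Tensor");
}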
Public Functions
-
~Library() = default¶
-
template<typename Schema>
inline Library &def(Schema &&raw_schema, const std::vector<at::Tag> &tags = {}, _RegisterOrVerify rv = _RegisterOrVerify::REGISTER) &¶
Declare an operator with a schema, but don’t provide any implementations for it.
You’re expected to then provide implementations using the impl() method. All template arguments are inferred.
// Example:
TORCH_LIBRARY(myops, m) {
  m.def("add(Tensor self, Tensor other) -> Tensor");
}
- Parameters
raw_schema – The schema of the operator to be defined. Typically, this is a const char* string literal, but any type accepted by torch::schema() is accepted here.
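A minimal sketch of passing a torch::schema() value instead of a plain string, which lets you attach an alias analysis kind to the declaration (the operator name is hypothetical):

TORCH_LIBRARY(myops, m) {
  m.def(torch::schema(
      "mul(Tensor self, Tensor other) -> Tensor",
      c10::AliasAnalysisKind::PURE_FUNCTION));
}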
-
inline Library &set_python_module(const char *pymodule, const char *context = "")¶
Declares that for all operators that are subsequently def’ed, their fake impls may be found in the given Python module (pymodule).
This registers some help text that is used if the fake impl cannot be found.
Args:
pymodule: the Python module
context: We may include this in the error message.
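A minimal sketch of how this might be used; the library and Python module names are hypothetical:

TORCH_LIBRARY(myops, m) {
  // Fake impls for the operators def'ed below are expected to be
  // importable from the Python module "myops._fake_impls".
  m.set_python_module("myops._fake_impls");
  m.def("add(Tensor self, Tensor other) -> Tensor");
}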
-
inline Library &impl_abstract_pystub(const char *pymodule, const char *context = "")¶
Deprecated; use set_python_module instead.
-
template<typename NameOrSchema, typename Func>
inline Library &def(NameOrSchema &&raw_name_or_schema, Func &&raw_f, const std::vector<at::Tag> &tags = {}) &¶
Define an operator for a schema and then register an implementation for it.
This is typically what you would use if you aren’t planning on making use of the dispatcher to structure your operator implementation. It’s roughly equivalent to calling def() and then impl(), but if you omit the schema of the operator, we will infer it from the type of your C++ function. All template arguments are inferred.
// Example:
TORCH_LIBRARY(myops, m) {
  m.def("add", add_fn);
}
- Parameters
raw_name_or_schema – The schema of the operator to be defined, or just the name of the operator if the schema is to be inferred from raw_f. Typically a const char* literal.
raw_f – The C++ function that implements this operator. Any valid constructor of torch::CppFunction is accepted here; typically you provide a function pointer or lambda.
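A minimal sketch of schema inference from a C++ function; the function and operator names are hypothetical:

at::Tensor myadd(const at::Tensor& self, const at::Tensor& other) {
  return self + other;
}

TORCH_LIBRARY(myops, m) {
  // Only the name is given; the schema is inferred from the C++
  // signature of myadd (two Tensors in, one Tensor out).
  m.def("add", myadd);
}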
-
template<typename Name, typename Func>
inline Library &impl(Name name, Func &&raw_f, _RegisterOrVerify rv = _RegisterOrVerify::REGISTER) &¶
Register an implementation for an operator.
You may register multiple implementations for a single operator at different dispatch keys (see torch::dispatch()). Implementations must have a corresponding declaration (from def()), otherwise they are invalid. If you plan to register multiple implementations, DO NOT provide a function implementation when you def() the operator.
// Example:
TORCH_LIBRARY_IMPL(myops, CUDA, m) {
  m.impl("add", add_cuda);
}
- Parameters
name – The name of the operator to implement. Do NOT provide schema here.
raw_f – The C++ function that implements this operator. Any valid constructor of torch::CppFunction is accepted here; typically you provide a function pointer or lambda.
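A minimal sketch of registering kernels for the same operator at two dispatch keys; the kernel functions are hypothetical:

at::Tensor add_cpu(const at::Tensor& self, const at::Tensor& other);   // hypothetical CPU kernel
at::Tensor add_cuda(const at::Tensor& self, const at::Tensor& other);  // hypothetical CUDA kernel

TORCH_LIBRARY(myops, m) {
  // Declaration only: no function is passed to def(), because
  // implementations are registered per dispatch key below.
  m.def("add(Tensor self, Tensor other) -> Tensor");
}

TORCH_LIBRARY_IMPL(myops, CPU, m) {
  m.impl("add", add_cpu);
}

TORCH_LIBRARY_IMPL(myops, CUDA, m) {
  m.impl("add", add_cuda);
}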
-
c10::OperatorName _resolve(const char *name) const¶
-
inline Library &def(detail::SelectiveStr<false>, const std::vector<at::Tag> &tags [[maybe_unused]] = {}) &¶
-
inline Library &def(detail::SelectiveStr<true> raw_schema, const std::vector<at::Tag> &tags = {}) &¶
-
template<typename Func>
inline Library &def(detail::SelectiveStr<false>, Func&&, const std::vector<at::Tag> &tags [[maybe_unused]] = {}) &¶
-
template<typename Func>
inline Library &def(detail::SelectiveStr<true> raw_name_or_schema, Func &&raw_f, const std::vector<at::Tag> &tags = {}) &¶
-
template<typename Func>
inline Library &impl(detail::SelectiveStr<false>, Func&&) &¶
-
template<typename Dispatch, typename Func>
inline Library &impl(detail::SelectiveStr<false>, Dispatch&&, Func&&) &¶
-
template<typename Func>
inline Library &impl_UNBOXED(detail::SelectiveStr<false>, Func*) &¶
-
template<typename Func>
inline Library &impl(detail::SelectiveStr<true> name, Func &&raw_f) &¶
-
template<typename Dispatch, typename Func>
inline Library &impl(detail::SelectiveStr<true> name, Dispatch &&key, Func &&raw_f) &¶
-
template<typename Func>
inline Library &impl_UNBOXED(detail::SelectiveStr<true>, Func*) &¶
-
template<typename Func>
inline Library &fallback(Func &&raw_f) &¶
Register a fallback implementation for all operators which will be used if there is not a specific implementation for an operator available.
There MUST be a DispatchKey associated with a fallback; e.g., only call this from TORCH_LIBRARY_IMPL() with namespace _.
// Example:
TORCH_LIBRARY_IMPL(_, AutogradXLA, m) {
  // If there is not a kernel explicitly registered
  // for AutogradXLA, fallthrough to the next
  // available kernel
  m.fallback(torch::CppFunction::makeFallthrough());
}
// See aten/src/ATen/core/dispatch/backend_fallback_test.cpp
// for a full example of boxed fallback
- Parameters
raw_f – The function that implements the fallback. Unboxed functions typically do not work as fallback functions, as fallback functions must work for every operator (even though they have varying type signatures). Typical arguments are CppFunction::makeFallthrough() or CppFunction::makeFromBoxedFunction().
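A hedged sketch of a boxed fallback, loosely modeled on the referenced test file; logging_fallback is a hypothetical function:

void logging_fallback(const c10::OperatorHandle& op, torch::jit::Stack* stack) {
  // Inspect the operator being called, then redispatch to the next
  // available kernel with AutogradXLA excluded.
  TORCH_WARN("falling back for ", op.schema().name());
  c10::impl::ExcludeDispatchKeyGuard guard(c10::DispatchKey::AutogradXLA);
  op.callBoxed(stack);
}

TORCH_LIBRARY_IMPL(_, AutogradXLA, m) {
  m.fallback(torch::CppFunction::makeFromBoxedFunction<&logging_fallback>());
}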
-
template<class CurClass>
inline torch::class_<CurClass> class_(detail::SelectiveStr<true> className)¶
-
template<class CurClass>
inline detail::ClassNotSelected class_(detail::SelectiveStr<false> className)¶
-
void reset()¶
Friends
- friend class detail::TorchLibraryInit