torch.pca_lowrank

torch.pca_lowrank(A, q=None, center=True, niter=2)

Performs linear Principal Component Analysis (PCA) on a low-rank matrix, batches of such matrices, or a sparse matrix.

This function returns a namedtuple (U, S, V) which is the nearly optimal approximation of a singular value decomposition of a centered matrix A such that A = U diag(S) V^T.
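
A minimal sketch of how the returned factorization relates to the (centered) input; the matrix sizes, seed, and tolerance below are arbitrary illustration choices, not part of the API:

    import torch

    torch.manual_seed(0)

    # Toy data: an exactly rank-6 matrix with 100 samples and 20 features.
    A = torch.randn(100, 6) @ torch.randn(6, 20)

    U, S, V = torch.pca_lowrank(A, q=6)

    # With center=True (the default), the factorization approximates the centered matrix.
    A_centered = A - A.mean(dim=0, keepdim=True)
    approx = U @ torch.diag(S) @ V.T

    print(torch.allclose(approx, A_centered, atol=1e-3))  # expected: True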

Note

The relation of (U, S, V) to PCA is as follows:

  • A is a data matrix with m samples and n features

  • the columns of V represent the principal directions

  • S**2 / (m - 1) contains the eigenvalues of A^T A / (m - 1), which is the covariance of A when center=True is provided.

  • matmul(A, V[:, :k]) projects data to the first k principal components
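
A sketch of the projection and eigenvalue relations above; the data shape and noise level are arbitrary choices for illustration:

    import torch

    torch.manual_seed(0)
    m, n, k = 500, 10, 2

    # Data whose variance is concentrated in a few directions, so the default
    # rank-q approximation captures the leading eigenvalues well.
    A = torch.randn(m, 3) @ torch.randn(3, n) + 0.01 * torch.randn(m, n)

    U, S, V = torch.pca_lowrank(A)                 # default q = min(6, m, n) = 6

    # Project the data onto the first k principal components.
    projected = torch.matmul(A, V[:, :k])          # shape (m, k)

    # S**2 / (m - 1) approximates the leading eigenvalues of the covariance of A.
    explained_variance = S**2 / (m - 1)
    cov_eigvals = torch.linalg.eigvalsh(torch.cov(A.T)).flip(0)  # descending order
    print(explained_variance[:k])
    print(cov_eigvals[:k])                         # should roughly agree with the line above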

Note

Different from the standard SVD, the sizes of the returned matrices depend on the specified rank q as follows:

  • U is an m x q matrix

  • S is a q-vector

  • V is an n x q matrix
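
For example, the shapes can be checked as follows (the sizes below are arbitrary):

    import torch

    A = torch.randn(100, 20)              # m = 100, n = 20
    q = 5
    U, S, V = torch.pca_lowrank(A, q=q)

    print(U.shape)   # torch.Size([100, 5])  -> m x q
    print(S.shape)   # torch.Size([5])       -> q
    print(V.shape)   # torch.Size([20, 5])   -> n x q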

Note

To obtain repeatable results, reset the seed of the pseudorandom number generator (for example, with torch.manual_seed) before each call.
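
For example (a sketch assuming a deterministic backend such as CPU):

    import torch

    A = torch.randn(100, 20)

    torch.manual_seed(42)
    U1, S1, V1 = torch.pca_lowrank(A)

    torch.manual_seed(42)                 # resetting the seed repeats the random projection
    U2, S2, V2 = torch.pca_lowrank(A)

    print(torch.equal(S1, S2))            # expected: True on a deterministic backend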

Parameters
  • A (Tensor) – the input tensor of size (*, m, n)

  • q (int, optional) – a slightly overestimated rank of A. By default, q = min(6, m, n).

  • center (bool, optional) – if True, center the input tensor; otherwise, assume that the input is already centered.

  • niter (int, optional) – the number of subspace iterations to conduct; niter must be a nonnegative integer, and defaults to 2.
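
Putting the parameters together, a sketch of a batched call on pre-centered input; the batch size, matrix sizes, q, and niter below are arbitrary illustration values:

    import torch

    # A batch of 8 matrices of size (1000, 50) that are centered up front.
    A = torch.randn(8, 1000, 50)
    A = A - A.mean(dim=-2, keepdim=True)

    # Since the input is already centered, pass center=False; a larger niter
    # spends extra passes over the data for a more accurate subspace estimate.
    U, S, V = torch.pca_lowrank(A, q=10, center=False, niter=4)

    print(U.shape, S.shape, V.shape)
    # torch.Size([8, 1000, 10]) torch.Size([8, 10]) torch.Size([8, 50, 10])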

References:

- Nathan Halko, Per-Gunnar Martinsson, and Joel Tropp, Finding
  structure with randomness: probabilistic algorithms for
  constructing approximate matrix decompositions,
  arXiv:0909.4061 [math.NA; math.PR], 2009 (available at
  `arXiv <http://arxiv.org/abs/0909.4061>`_).
