
torchml.decomposition

Classes

torchml.decomposition.PCA

Description

Linear dimensionality reduction using Singular Value Decomposition of the data to project it to a lower dimensional space. The input data is centered but not scaled for each feature before applying the SVD.
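The description above can be sketched in a few lines of plain PyTorch: center each feature, take the SVD of the centered data, and project onto the leading right singular vectors. This is an illustration only; the helper name pca_via_svd is hypothetical, not part of the torchml API.

```python
import torch

def pca_via_svd(X, n_components):
    # Center each feature; PCA centers but does not scale the data.
    X_centered = X - X.mean(dim=0)
    # Thin SVD of the centered data matrix.
    U, S, Vh = torch.linalg.svd(X_centered, full_matrices=False)
    # Project onto the top n_components right singular vectors.
    return X_centered @ Vh[:n_components].T

X = torch.tensor([[-1., -1.], [-2., -1.], [-3., -2.],
                  [1., 1.], [2., 1.], [3., 2.]])
X_reduced = pca_via_svd(X, 2)
```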

References
  1. The scikit-learn [documentation page](https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html)
  2. Nathan Halko, Per-Gunnar Martinsson, and Joel Tropp, Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions, arXiv:0909.4061 [math.NA; math.PR], 2009.
Arguments
  • n_components : int, default=None - Number of components to keep. If n_components is not set, all components are kept: n_components == min(n_samples, n_features)
  • svd_solver : {'auto', 'full', 'arpack', 'randomized'}, default='auto' - The algorithm used to compute the SVD. If 'auto', the solver is selected by a default policy based on X.shape and n_components: if the input data is larger than 500x500 and the number of components to extract is lower than 80% of the smallest dimension of the data, the more efficient 'randomized' method is used; otherwise the exact full SVD is computed.
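The 'auto' policy above can be sketched as a small decision function. The helper choose_svd_solver is hypothetical (not part of the torchml API) and simply restates the policy described in the argument list.

```python
def choose_svd_solver(n_samples, n_features, n_components):
    # Hypothetical sketch of the 'auto' policy; not part of torchml.
    # Use the randomized solver only when the data is larger than
    # 500x500 and relatively few components are requested.
    if (n_samples > 500 and n_features > 500
            and n_components < 0.8 * min(n_samples, n_features)):
        return "randomized"
    return "full"

print(choose_svd_solver(1000, 1000, 50))   # randomized
print(choose_svd_solver(100, 20, 10))      # full
```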
Example
import torch
from torchml.decomposition import PCA
X = torch.tensor([[-1., -1.], [-2., -1.], [-3., -2.], [1., 1.], [2., 1.], [3., 2.]])
pca = PCA(n_components=2)
pca.fit(X)

fit(self, X)

Description

Fit the model with X.

Arguments
  • X (Tensor) - Input data of shape (n_samples, n_features).
Example
pca = PCA()
pca.fit(X)

fit_transform(self, X)

Description

Fit the model with X and apply the dimensionality reduction on X.

Arguments
  • X (Tensor) - Input data of shape (n_samples, n_features).
Example
pca = PCA()
X_reduced = pca.fit_transform(X)

transform(self, X)

Description

Apply dimensionality reduction to X.

Arguments
  • X (Tensor) - Input data of shape (n_samples, n_features).
Example
pca = PCA()
X_reduced = pca.fit(X).transform(X)
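To make the fit/transform split concrete, here is a minimal PCA-like class in plain PyTorch: fit learns the per-feature means and principal axes, transform reuses them on (possibly new) data, and fit_transform is the composition. This is an illustrative sketch only; TinyPCA is not the torchml implementation.

```python
import torch

class TinyPCA:
    # Illustrative sketch only; torchml's PCA may differ internally.
    def __init__(self, n_components):
        self.n_components = n_components

    def fit(self, X):
        # Learn the per-feature mean and the top right singular vectors.
        self.mean_ = X.mean(dim=0)
        _, _, Vh = torch.linalg.svd(X - self.mean_, full_matrices=False)
        self.components_ = Vh[: self.n_components]
        return self

    def transform(self, X):
        # Center with the mean learned at fit time, then project.
        return (X - self.mean_) @ self.components_.T

    def fit_transform(self, X):
        return self.fit(X).transform(X)

X = torch.tensor([[-1., -1.], [-2., -1.], [-3., -2.],
                  [1., 1.], [2., 1.], [3., 2.]])
```

Because transform only reapplies the stored mean_ and components_, pca.fit(X).transform(X) and pca.fit_transform(X) give the same result on the training data.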