API and Modules#
This page provides a complete reference of all TorchDR classes and functions. For conceptual background, see the User Guide.
Dimensionality Reduction Methods#
TorchDR provides sklearn-compatible estimators that work seamlessly with both NumPy arrays and PyTorch tensors. All methods support GPU acceleration via device="cuda" and can scale to large datasets using backend="faiss" or backend="keops".
Neighbor Embedding#
- UMAP, introduced in [McInnes et al., 2018] and further studied in [Damrich and Hamprecht, 2021].
- t-distributed Stochastic Neighbor Embedding (t-SNE), introduced in [Van der Maaten and Hinton, 2008].
- InfoTSNE algorithm, introduced in [Damrich et al., 2022].
- LargeVis algorithm, introduced in [Tang et al., 2016].
- Stochastic Neighbor Embedding (SNE), introduced in [Hinton and Roweis, 2002].
- TSNEkhorn algorithm, introduced in [Van Assel et al., 2024].
- CO-Stochastic Neighbor Embedding (CO-SNE), introduced in [Guo et al., 2022].
- PACMAP algorithm, introduced in [Wang et al., 2021].
Spectral Methods#
- Principal Component Analysis (PCA) module.
- Incremental Principal Component Analysis (IPCA), leveraging PyTorch for GPU acceleration.
- Exact Incremental Principal Component Analysis.
- Kernel Principal Component Analysis module.
- PHATE, introduced in [Moon et al., 2019].
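The core computation behind the PCA modules can be sketched in a few lines of NumPy (a conceptual illustration, not TorchDR's implementation):

```python
import numpy as np

def pca_sketch(X, n_components=2):
    """Project X onto its top principal components via SVD."""
    X_centered = X - X.mean(axis=0)          # center each feature
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T  # scores in the reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
Z = pca_sketch(X, n_components=2)
```

Since the singular values are sorted in decreasing order, the first returned component carries at least as much variance as the second.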
Affinities#
Affinities are the building blocks for constructing the input similarity matrix \(\mathbf{P}\). See the User Guide for details on how affinities are used in DR methods.
Adaptive Affinities#
Affinities that adapt bandwidth based on local neighborhood structure.
- Solve the directed entropic affinity problem introduced in [Hinton and Roweis, 2002].
- Compute the symmetric entropic affinity (SEA) introduced in [Van Assel et al., 2024].
- Compute the input affinity used in UMAP [McInnes et al., 2018].
- Compute the input affinity used in PACMAP [Wang et al., 2021].
- Self-tuning affinity introduced in [Zelnik-Manor and Perona, 2004].
- Compute the MAGIC affinity with alpha-decay kernel introduced in [Van Dijk et al., 2018].
- Compute the potential affinity used in PHATE [Moon et al., 2019].
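The adaptive idea these affinities share can be illustrated with the entropic case: each point gets its own bandwidth, calibrated (here by bisection) so that the conditional distribution over its neighbors reaches a target perplexity. A self-contained NumPy sketch, not TorchDR's implementation:

```python
import numpy as np

def entropic_affinity_row(sq_dists, perplexity=5.0, tol=1e-5, max_iter=100):
    """Calibrate a per-point bandwidth beta by bisection so that the
    row's perplexity (exp of its Shannon entropy) hits the target."""
    lo, hi = 1e-10, 1e10
    target = np.log(perplexity)
    shifted = sq_dists - sq_dists.min()   # avoid underflow for large beta
    for _ in range(max_iter):
        beta = (lo + hi) / 2
        p = np.exp(-beta * shifted)
        p /= p.sum()
        entropy = -(p * np.log(p + 1e-12)).sum()
        if abs(entropy - target) < tol:
            break
        if entropy > target:              # too flat: sharpen with larger beta
            lo = beta
        else:
            hi = beta
    return p

rng = np.random.default_rng(0)
sq_dists = rng.random(20)                 # squared distances to 20 neighbors
p = entropic_affinity_row(sq_dists, perplexity=5.0)
```

Bisection works here because the entropy is monotone decreasing in the bandwidth beta.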
Other Normalized Affinities#
Other normalized affinity kernels.
- Compute the Gaussian affinity matrix, which can be normalized along a dimension.
- Compute the Student affinity matrix, which can be normalized along a dimension.
- Compute the symmetric doubly stochastic affinity matrix.
- Compute the symmetric doubly stochastic affinity.
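A doubly stochastic affinity can be obtained from a Gaussian kernel with a few Sinkhorn scaling iterations; a minimal NumPy sketch (illustrative only, not TorchDR's implementation):

```python
import numpy as np

def doubly_stochastic(K, n_iter=500):
    """Alternating Sinkhorn scalings: returns diag(r) K diag(c) with
    row and column sums (approximately) equal to one."""
    r = np.ones(K.shape[0])
    c = np.ones(K.shape[1])
    for _ in range(n_iter):
        r = 1.0 / (K @ c)        # rescale to fix the row sums
        c = 1.0 / (K.T @ r)      # rescale to fix the column sums
    return r[:, None] * K * c[None, :]

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / np.median(sq_dists))   # Gaussian kernel, median bandwidth
P = doubly_stochastic(K)
```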
Base Classes#
These classes provide the foundation for implementing custom DR methods.
Core Base Classes#
- Base class for DR methods.
- Perform dimensionality reduction by matching two affinity matrices.
Neighbor Embedding Base Classes#
Base classes for neighbor embedding methods. SparseNeighborEmbedding leverages input affinity sparsity for efficient attractive term computation. NegativeSamplingNeighborEmbedding adds approximate repulsive term computation via negative sampling.
- Solves the neighbor embedding problem.
- Solves the neighbor embedding problem with a sparse input affinity matrix.
- Solves the neighbor embedding problem with both sparsity and negative sampling.
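The split these base classes formalize can be sketched as a loss with an exact attractive term over the sparse neighbor pairs and a repulsive term estimated by negative sampling (a LargeVis-style illustration, not TorchDR's implementation):

```python
import numpy as np

def ne_loss_estimate(Z, neighbor_pairs, n_negatives=5, rng=None):
    """Attraction computed exactly over the sparse neighbor pairs;
    repulsion estimated over randomly sampled pairs."""
    rng = rng or np.random.default_rng(0)
    n = Z.shape[0]
    i, j = neighbor_pairs[:, 0], neighbor_pairs[:, 1]
    d2 = ((Z[i] - Z[j]) ** 2).sum(-1)
    q = 1.0 / (1.0 + d2)                     # Student kernel in embedding space
    attractive = -np.log(q + 1e-12).sum()    # pull neighbors together
    # Negative sampling: random pairs stand in for the full repulsive sum.
    neg_i = rng.integers(0, n, size=len(i) * n_negatives)
    neg_j = rng.integers(0, n, size=len(i) * n_negatives)
    d2_neg = ((Z[neg_i] - Z[neg_j]) ** 2).sum(-1)
    q_neg = 1.0 / (1.0 + d2_neg)
    repulsive = -np.log(1.0 - q_neg + 1e-12).sum()  # push sampled pairs apart
    return attractive + repulsive

rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 2))
pairs = np.stack([np.arange(100), np.roll(np.arange(100), 1)], axis=1)
loss = ne_loss_estimate(Z, pairs)
```

The attractive term touches only the stored sparse pairs, while the repulsive term costs a constant number of samples per pair, which is what makes these formulations scale.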
Evaluation Metrics#
- Compute the silhouette score as the mean of silhouette coefficients.
- Compute the silhouette coefficients for each data sample.
- Compute k-NN label accuracy to evaluate class structure preservation.
- Compute K-ary neighborhood preservation between input data and embeddings.
- Perform K-means clustering and compute the Adjusted Rand Index.
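K-ary neighborhood preservation, for instance, measures the average overlap between each point's K nearest neighbors in the input space and in the embedding; a plain NumPy sketch (illustrative, not TorchDR's implementation):

```python
import numpy as np

def knn_indices(X, k):
    """Indices of the k nearest neighbors of each row (self excluded)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    return np.argsort(d2, axis=1)[:, :k]

def neighborhood_preservation(X, Z, k=10):
    """Mean fraction of shared k-NN between input X and embedding Z."""
    nn_x, nn_z = knn_indices(X, k), knn_indices(Z, k)
    overlap = [len(set(a) & set(b)) for a, b in zip(nn_x, nn_z)]
    return np.mean(overlap) / k

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
score = neighborhood_preservation(X, X[:, :2], k=10)  # crude "embedding"
```

A perfect embedding scores 1.0; a random one scores close to k divided by the number of points.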
Utils#
- Compute pairwise distances between two tensors or from a DataLoader.
- Batched binary search root finding.
- Batched false-position root finding.
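A batched binary search root finder of this kind can be sketched in NumPy: each batch element bisects its own bracket independently, and all elements advance in lockstep (illustrative, not TorchDR's implementation):

```python
import numpy as np

def batched_binary_search(f, lo, hi, n_iter=60):
    """Find roots of a monotone-increasing f for a whole batch at once.
    lo and hi are arrays bracketing one root per batch element."""
    lo, hi = lo.astype(float).copy(), hi.astype(float).copy()
    for _ in range(n_iter):
        mid = (lo + hi) / 2
        positive = f(mid) > 0          # root lies below where f is positive
        hi = np.where(positive, mid, hi)
        lo = np.where(positive, lo, mid)
    return (lo + hi) / 2

# Solve x**2 = c for several values of c simultaneously.
c = np.array([2.0, 3.0, 10.0])
roots = batched_binary_search(lambda x: x**2 - c, np.zeros(3), np.full(3, 10.0))
```

This vectorized pattern is what makes per-point calibrations (such as the entropic affinity bandwidth) cheap on GPU: one tensor operation per bisection step serves the entire batch.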