COSNE#

class torchdr.COSNE(perplexity: float = 30, lambda1: float = 1, gamma: float = 2, n_components: int = 2, lr: float | str = 'auto', optimizer_kwargs: Dict | str = {}, scheduler: str | Type[LRScheduler] | None = None, scheduler_kwargs: Dict | None = None, init: str = 'hyperbolic', init_scaling: float = 0.5, min_grad_norm: float = 1e-07, max_iter: int = 2000, device: str | None = None, backend: str | None = None, verbose: bool = False, random_state: float | None = None, early_exaggeration_coeff: float = 12.0, early_exaggeration_iter: int | None = 250, tol_affinity: float = 0.001, max_iter_affinity: int = 100, metric_in: str = 'sqeuclidean', sparsity: bool = True, check_interval: int = 50)[source]#

Bases: SparseNeighborEmbedding

Implementation of the CO-Stochastic Neighbor Embedding (CO-SNE) introduced in [Guo et al., 2022].

This algorithm is a variant of SNE that embeds the data in hyperbolic space instead of Euclidean space.
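Hyperbolic embeddings such as CO-SNE's live inside the Poincaré ball, where distances grow rapidly near the boundary. A minimal sketch of the standard Poincaré-ball distance (the textbook formula, not torchdr's internal implementation):

```python
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points inside the unit Poincare ball.

    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    """
    diff2 = sum((a - b) ** 2 for a, b in zip(u, v))   # squared Euclidean distance
    nu2 = sum(a * a for a in u)                       # squared norm of u
    nv2 = sum(b * b for b in v)                       # squared norm of v
    return math.acosh(1.0 + 2.0 * diff2 / ((1.0 - nu2) * (1.0 - nv2)))

# Points near the boundary are much farther apart than their Euclidean
# distance suggests: d(0, 0.9) >> d(0, 0.5) along the same ray.
near = poincare_distance((0.0, 0.0), (0.5, 0.0))
far = poincare_distance((0.0, 0.0), (0.9, 0.0))
```

This boundary-expanding geometry is what lets hyperbolic methods represent tree-like hierarchies in few dimensions.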

Parameters:
  • perplexity (float) – Number of ‘effective’ nearest neighbors. Consider selecting a value between 2 and the number of samples. Different values can yield significantly different embeddings.

  • lambda1 (float) – Coefficient for the loss enforcing equal norms between input samples and embedded samples.

  • gamma (float) – Gamma parameter of the Cauchy distribution used for affinity, by default 2.

  • n_components (int, optional) – Dimension of the embedding space.

  • lr (float or 'auto', optional) – Learning rate for the algorithm, by default 'auto'.

  • optimizer_kwargs (dict, optional) – Additional arguments for the optimizer, by default {}.

  • scheduler (str, torch.optim.lr_scheduler.LRScheduler class, or None, optional) – Learning rate scheduler, by default None.

  • scheduler_kwargs (dict, optional) – Arguments for the scheduler, by default None.

  • init ({'hyperbolic'} or torch.Tensor of shape (n_samples, output_dim), optional) – Initialization for the embedding Z, default ‘hyperbolic’.

  • init_scaling (float, optional) – Scaling factor for the initialization, by default 0.5.

  • min_grad_norm (float, optional) – Gradient norm threshold below which the algorithm stops, by default 1e-7.

  • max_iter (int, optional) – Number of maximum iterations for the descent algorithm, by default 2000.

  • device (str, optional) – Device to use, by default None.

  • backend ({"keops", "faiss", None}, optional) – Which backend to use for handling sparsity and memory efficiency. Default is None.

  • verbose (bool, optional) – Verbosity, by default False.

  • random_state (float, optional) – Random seed for reproducibility, by default None.

  • early_exaggeration_coeff (float, optional) – Coefficient for the attraction term during the early exaggeration phase, by default 12.0.

  • early_exaggeration_iter (int, optional) – Number of iterations for early exaggeration, by default 250.

  • tol_affinity (float, optional) – Precision threshold for the entropic affinity root search, by default 1e-3.

  • max_iter_affinity (int, optional) – Maximum number of iterations for the entropic affinity root search, by default 100.

  • metric_in ({'sqeuclidean', 'manhattan'}, optional) – Metric to use for the input affinity, by default ‘sqeuclidean’.

  • sparsity (bool, optional) – Whether to use sparsity mode for the input affinity. Default is True.

  • check_interval (int, optional) – Number of iterations between checks for convergence, by default 50.

Examples using COSNE:#

TSNE vs COSNE : Euclidean vs Hyperbolic
