TSNEkhorn

class torchdr.TSNEkhorn(perplexity: float = 30, n_components: int = 2, lr: float = 1.0, optimizer: str = 'Adam', optimizer_kwargs: dict | None = None, scheduler: str = 'constant', scheduler_kwargs: dict | None = None, init: str = 'pca', init_scaling: float = 0.0001, tol: float = 0.0001, max_iter: int = 2000, tolog: bool = False, device: str | None = None, keops: bool = False, verbose: bool = False, random_state: float = 0, early_exaggeration: float = 10.0, coeff_repulsion: float = 1.0, early_exaggeration_iter: int = 250, lr_affinity_in: float = 0.1, eps_square_affinity_in: bool = True, tol_affinity_in: float = 0.001, max_iter_affinity_in: int = 100, metric_in: str = 'sqeuclidean', metric_out: str = 'sqeuclidean', unrolling: bool = False, symmetric_affinity: bool = True)[source]

Bases: NeighborEmbedding

Implementation of the TSNEkhorn algorithm introduced in [V23].

It involves selecting a SymmetricEntropicAffinity as input affinity \(\mathbf{P}\) and a SinkhornAffinity as output affinity \(\mathbf{Q}\).

The loss function is defined as:

\[-\sum_{ij} P_{ij} \log Q_{ij} + \sum_{ij} Q_{ij} \:.\]

The above loss is the gap objective for inverse symmetric optimal transport.
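
To make the objective concrete, below is a minimal PyTorch sketch of the gap loss above, assuming dense (n, n) affinity matrices P (input) and Q (output) given as torch tensors; the function name and the eps clamp are illustrative and not part of the torchdr API.

import torch

def snekhorn_gap_loss(P: torch.Tensor, Q: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    # -sum_ij P_ij log Q_ij  +  sum_ij Q_ij
    cross_entropy = -(P * torch.log(Q + eps)).sum()
    total_mass = Q.sum()
    return cross_entropy + total_mass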

Note

The SymmetricEntropicAffinity computation requires a careful choice of learning rate (parameter lr_affinity_in). If the optimization is too unstable, one can fall back to EntropicAffinity by setting symmetric_affinity to False, as illustrated in the sketch below.
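
A hedged illustration of this fallback (parameter values are arbitrary): either lower lr_affinity_in or disable the symmetric affinity.

from torchdr import TSNEkhorn

# Smaller dual learning rate for the symmetric entropic affinity
tsnekhorn_stable = TSNEkhorn(perplexity=30, lr_affinity_in=1e-2)

# Fall back to EntropicAffinity as input affinity
tsnekhorn_ea = TSNEkhorn(perplexity=30, symmetric_affinity=False)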

Parameters:
  • perplexity (float) – Number of ‘effective’ nearest neighbors. Consider selecting a value between 2 and the number of samples. Different values can yield significantly different embeddings.

  • n_components (int, optional) – Dimension of the embedding space, by default 2.

  • lr (float, optional) – Learning rate for the algorithm, by default 1e0.

  • optimizer ({'SGD', 'Adam', 'NAdam'}, optional) – Which PyTorch optimizer to use, by default ‘Adam’.

  • optimizer_kwargs (dict, optional) – Arguments for the optimizer, by default None.

  • scheduler ({'constant', 'linear'}, optional) – Learning rate scheduler, by default ‘constant’.

  • scheduler_kwargs (dict, optional) – Arguments for the scheduler, by default None.

  • init ({'normal', 'pca'} or torch.Tensor of shape (n_samples, output_dim), optional) – Initialization for the embedding Z, default ‘pca’.

  • init_scaling (float, optional) – Scaling factor for the initialization, by default 1e-4.

  • tol (float, optional) – Precision threshold at which the algorithm stops, by default 1e-4.

  • max_iter (int, optional) – Maximum number of iterations for the descent algorithm, by default 2000.

  • tolog (bool, optional) – Whether to store intermediate results in a dictionary, by default False.

  • device (str, optional) – Device to use, by default None (the device is selected automatically).

  • keops (bool, optional) – Whether to use KeOps, by default False.

  • verbose (bool, optional) – Verbosity, by default False.

  • random_state (float, optional) – Random seed for reproducibility, by default 0.

  • early_exaggeration (float, optional) – Coefficient for the attraction term during the early exaggeration phase, by default 10.0.

  • coeff_repulsion (float, optional) – Coefficient for the repulsion term, by default 1.0.

  • early_exaggeration_iter (int, optional) – Number of iterations for early exaggeration, by default 250.

  • lr_affinity_in (float, optional) – Learning rate used to update dual variables for the symmetric entropic affinity computation, by default 1e-1.

  • eps_square_affinity_in (bool, optional) – When computing the symmetric entropic affinity, whether to optimize on the square of the dual variables. May be more stable in practice.

  • tol_affinity_in (float, optional) – Precision threshold for the symmetric entropic affinity computation, by default 1e-3.

  • max_iter_affinity_in (int, optional) – Maximum number of iterations for the symmetric entropic affinity computation, by default 100.

  • metric_in ({'sqeuclidean', 'manhattan'}, optional) – Metric to use for the input affinity, by default ‘sqeuclidean’.

  • metric_out ({'sqeuclidean', 'manhattan'}, optional) – Metric to use for the output affinity, by default ‘sqeuclidean’.

  • unrolling (bool, optional) – Whether to use unrolling for solving inverse OT. If False, uses the gap objective. Default is False.

  • symmetric_affinity (bool, optional) – Whether to use symmetric entropic affinity. If False, uses entropic affinity. Default is True.
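
A minimal end-to-end usage sketch, assuming the scikit-learn style fit_transform interface exposed by torchdr estimators; the dataset and parameter values are illustrative.

import torch
from torchdr import TSNEkhorn

X = torch.randn(500, 50)  # toy data: 500 samples in 50 dimensions

model = TSNEkhorn(perplexity=30, n_components=2, max_iter=2000, verbose=False)
Z = model.fit_transform(X)  # embedding of shape (500, 2)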

References

[V23]

SNEkhorn: Dimension Reduction with Symmetric Entropic Affinities, Hugues Van Assel, Titouan Vayer, Rémi Flamary, Nicolas Courty. Advances in Neural Information Processing Systems 36 (NeurIPS 2023).