EncoderTOTALVI

class scvi.models.modules.EncoderTOTALVI(n_input, n_output, n_cat_list=None, n_layers=2, n_hidden=256, dropout_rate=0.1, distribution='ln')[source]

Bases: torch.nn.modules.module.Module

Encodes data of n_input dimensions into a latent space of n_output dimensions using a fully-connected neural network with n_layers hidden layers of n_hidden nodes each.

Parameters
  • n_input (int) – The dimensionality of the input (data space)

  • n_output (int) – The dimensionality of the output (latent space)

  • n_cat_list (Optional[Iterable[int]]) – A list containing the number of categories for each category of interest. Each category will be included using a one-hot encoding

  • n_layers (int) – The number of fully-connected hidden layers

  • n_hidden (int) – The number of nodes per hidden layer

  • dropout_rate (float) – Dropout rate to apply to each of the hidden layers

  • distribution (str) –

    Distribution of the latent space, one of

    • 'normal' - Normal distribution

    • 'ln' - Logistic normal
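A minimal construction sketch, assuming the import path shown in the signature above; the gene/protein counts and the single two-category batch covariate are illustrative, not prescribed by the source:

    import torch
    from scvi.models.modules import EncoderTOTALVI

    # Encoder for joint RNA + protein data: the input dimension is the
    # number of genes plus the number of proteins (counts are illustrative)
    n_genes, n_proteins = 1000, 14
    encoder = EncoderTOTALVI(
        n_input=n_genes + n_proteins,
        n_output=20,        # dimensionality of the latent space
        n_cat_list=[2],     # e.g. one batch covariate with two categories
        n_layers=2,
        n_hidden=256,
        dropout_rate=0.1,
        distribution="ln",  # logistic-normal latent distribution
    )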

Methods Summary

forward(data, *cat_list)

The forward computation for a single sample.

reparameterize_transformation(mu, var)

Methods Documentation

forward(data, *cat_list)[source]

The forward computation for a single sample.

  1. Encodes the data into latent space using the encoder network

  2. Generates a mean (q_m) and variance (q_v)

  3. Samples a new value from an i.i.d. latent distribution

The dictionary latent contains the samples of the latent variables, while untran_latent contains the untransformed versions of these latent variables. For example, the library size is log-normally distributed, so untran_latent["l"] gives the normal sample that is later exponentiated to become latent["l"]. Likewise, a logistic-normal sample is obtained by applying softmax to a normal sample.
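The relationship between the two dictionaries can be illustrated with standalone torch code (not part of the library; the key names "z" and "l" follow the example above):

    import torch

    # A logistic-normal sample is the softmax of a normal sample, and a
    # log-normal library size is the exponential of a normal sample.
    untran_z = torch.randn(20)           # untran_latent["z"]: normal sample
    z = torch.softmax(untran_z, dim=-1)  # latent["z"]: non-negative, sums to 1
    untran_l = torch.randn(1)            # untran_latent["l"]: normal sample
    l = torch.exp(untran_l)              # latent["l"]: log-normal library size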

Parameters
  • data (Tensor) – tensor with shape (n_input,)

  • cat_list (int) – list of category membership(s) for this sample

Returns

6-tuple. The first four elements are torch.Tensor means and variances of shape (n_latent,) for the latent variables; the last two are the dicts latent and untran_latent mapping each latent variable to its torch.Tensor sample
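A hypothetical call that unpacks the 6-tuple in the order described above. The variable names qz_m, qz_v, ql_m, ql_v and the dict key "z" are assumptions, and a minibatch dimension is added as is typical in practice:

    import torch
    from scvi.models.modules import EncoderTOTALVI

    encoder = EncoderTOTALVI(n_input=1014, n_output=20, n_cat_list=[2])
    data = torch.rand(128, 1014)                         # minibatch of inputs
    batch_index = torch.zeros(128, 1, dtype=torch.long)  # category memberships
    qz_m, qz_v, ql_m, ql_v, latent, untran_latent = encoder(data, batch_index)
    z = latent["z"]  # assumed key: the transformed latent sample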

reparameterize_transformation(mu, var)[source]
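The method carries no docstring; a minimal sketch of what a reparameterized draw followed by the latent-space transformation could look like (an assumption, not taken from the implementation):

    import torch
    from torch.distributions import Normal

    def reparameterize_transformation(mu, var):
        # reparameterization trick: differentiable sample from N(mu, var)
        untran_z = Normal(mu, var.sqrt()).rsample()
        # apply the latent transformation (softmax for the 'ln' distribution)
        z = torch.softmax(untran_z, dim=-1)
        return z, untran_z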