EncoderTOTALVI¶

class scvi.models.modules.EncoderTOTALVI(n_input, n_output, n_cat_list=None, n_layers=2, n_hidden=256, dropout_rate=0.1, distribution='ln')[source]¶

Bases: torch.nn.modules.module.Module
Encodes data of n_input dimensions into a latent space of n_output dimensions using a fully-connected neural network of n_hidden layers.

- Parameters

  - n_input (int) – The dimensionality of the input (data space)
  - n_output (int) – The dimensionality of the output (latent space)
  - n_cat_list (Optional[Iterable[int]]) – A list containing the number of categories for each category of interest. Each category will be included using a one-hot encoding
  - n_layers (int) – The number of fully-connected hidden layers
  - dropout_rate (float) – Dropout rate to apply to each of the hidden layers
  - distribution (str) – Distribution of the latent space, one of
    - 'normal' – Normal distribution
    - 'ln' – Logistic normal
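The one-hot handling of n_cat_list can be illustrated with a minimal, dependency-free sketch. The helpers below (one_hot, encode_covariates) are hypothetical illustrations, not part of the scvi API:

```python
def one_hot(index, n_cat):
    """Encode a category index as a one-hot list of length n_cat."""
    vec = [0.0] * n_cat
    vec[index] = 1.0
    return vec

def encode_covariates(cat_values, n_cat_list):
    """Concatenate one-hot encodings for each categorical covariate,
    mirroring how the encoder injects *cat_list into its layers."""
    out = []
    for value, n_cat in zip(cat_values, n_cat_list):
        out.extend(one_hot(value, n_cat))
    return out

# Two covariates, e.g. batch (3 categories) and label (2 categories):
print(encode_covariates([1, 0], [3, 2]))  # [0.0, 1.0, 0.0, 1.0, 0.0]
```

In the real module, these one-hot vectors are concatenated to the input of the fully-connected layers, one block per entry of n_cat_list.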
Methods Summary

- forward(data, *cat_list) – The forward computation for a single sample.
- reparameterize_transformation(mu, var)

Methods Documentation
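reparameterize_transformation draws a differentiable sample from the latent distribution via the standard reparameterization trick. A minimal scalar sketch, assuming mu and var parameterize a Normal distribution (the function name and eps argument are illustrative, not the scvi signature):

```python
import math
import random

def reparameterize(mu, var, eps=None):
    """Sample z = mu + sqrt(var) * eps with eps ~ N(0, 1).

    Drawing eps independently of the parameters keeps the sample
    differentiable with respect to mu and var."""
    if eps is None:
        eps = random.gauss(0.0, 1.0)
    return mu + math.sqrt(var) * eps

# With eps fixed to 0 the sample collapses to the mean.
print(reparameterize(1.5, 0.25, eps=0.0))  # 1.5
```

The actual method operates on torch.Tensor batches, but the arithmetic per element is the same.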
forward(data, *cat_list)[source]¶

The forward computation for a single sample:

1. Encodes the data into the latent space using the encoder network
2. Generates a mean (q_m) and variance (q_v)
3. Samples a new value from an i.i.d. latent distribution

The dictionary latent contains the samples of the latent variables, while untran_latent contains the untransformed versions of these latent variables. For example, the library size is log-normally distributed, so untran_latent["l"] gives the normal sample that was later exponentiated to become latent["l"]. The logistic normal distribution is equivalent to applying softmax to a normal sample.
- Returns

  6-tuple. The first 4 elements are of type torch.Tensor, the next 2 are dicts of torch.Tensor: tensors of shape (n_latent,) for mean and var, and sample.