scvi.module.VAEC

class scvi.module.VAEC(n_input, n_batch=0, n_labels=0, n_hidden=128, n_latent=5, n_layers=2, log_variational=True, ct_weight=None, dropout_rate=0.05, encode_covariates=False, extra_encoder_kwargs=None, extra_decoder_kwargs=None)

Bases: BaseModuleClass

Conditional variational autoencoder model.

This is an implementation of the CondSCVI model.

Parameters:
  • n_input (int) – Number of input genes.

  • n_batch (int (default: 0)) – Number of batches. If 0, no batch correction is performed.

  • n_labels (int (default: 0)) – Number of labels (cell types).

  • n_hidden (int (default: 128)) – Number of nodes per hidden layer.

  • n_latent (int (default: 5)) – Dimensionality of the latent space.

  • n_layers (int (default: 2)) – Number of hidden layers used for the encoder and decoder NNs.

  • log_variational (bool (default: True)) – If True, apply log(data + 1) prior to encoding for numerical stability. This is not normalization.

  • ct_weight (ndarray | None (default: None)) – Multiplicative weight for the cell-type-specific latent space.

  • dropout_rate (float (default: 0.05)) – Dropout rate for the encoder and decoder neural networks.

  • encode_covariates (bool (default: False)) – If True, covariates are concatenated to gene expression prior to passing through the encoder(s). Otherwise, only gene expression is used.

  • extra_encoder_kwargs (dict | None (default: None)) – Keyword arguments passed into Encoder.

  • extra_decoder_kwargs (dict | None (default: None)) – Keyword arguments passed into FCLayers.
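
A minimal construction sketch (the gene and label counts below are illustrative, not defaults):

```python
from scvi.module import VAEC

# Hypothetical dataset sizes: 2,000 genes, 10 cell types.
module = VAEC(n_input=2000, n_labels=10, n_latent=5, n_layers=2)
```

In typical use this module is built internally by the CondSCVI model; direct construction as above is mainly useful for inspection or experimentation.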

Attributes table

training

Methods table

generative(z, library, y[, batch_index, ...])

Runs the generative model.

inference(x, y[, batch_index, n_samples])

High-level inference method.

loss(tensors, inference_outputs, ...[, ...])

Loss computation.

sample(tensors[, n_samples])

Generate observation samples from the posterior predictive distribution.

Attributes

VAEC.training: bool

Methods

VAEC.generative(z, library, y, batch_index=None, transform_batch=None)

Runs the generative model.

Return type:

dict[str, Distribution]
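
A hedged sketch of calling the generative model directly; the latent draws, library sizes, and labels below are fabricated for illustration, and the "px" output key is an assumption based on scvi-tools conventions:

```python
import torch
from scvi.module import VAEC

module = VAEC(n_input=2000, n_labels=10)  # illustrative sizes

z = torch.randn(8, 5)                     # latent draws; n_latent defaults to 5
library = torch.full((8, 1), 1000.0)      # per-cell library sizes (assumed shape)
y = torch.randint(0, 10, (8, 1))          # integer cell-type labels

generative_outputs = module.generative(z, library, y)
# Each value is a torch.distributions.Distribution over gene counts;
# the "px" key name is an assumption based on scvi-tools conventions.
px = generative_outputs["px"]
```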

VAEC.inference(x, y, batch_index=None, n_samples=1)

High-level inference method.

Runs the inference (encoder) model.

Return type:

dict[str, Tensor | Distribution]
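
A sketch of running the encoder on a dummy minibatch; the tensor shapes and the output key names ("z", "qz") follow common scvi-tools conventions and are assumptions here:

```python
import torch
from scvi.module import VAEC

module = VAEC(n_input=2000, n_labels=10)     # illustrative sizes

x = torch.randint(0, 50, (8, 2000)).float()  # raw counts for 8 dummy cells
y = torch.randint(0, 10, (8, 1))             # cell-type labels

inference_outputs = module.inference(x, y)
z = inference_outputs["z"]    # sampled latent representation (key name assumed)
qz = inference_outputs["qz"]  # variational posterior distribution (key name assumed)
```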

VAEC.loss(tensors, inference_outputs, generative_outputs, kl_weight=1.0)

Loss computation.
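
A sketch of how inference and generative outputs chain into the loss, following the BaseModuleClass convention; the registry key names ("X", "labels", "batch") and the inference output keys ("z", "library") are assumptions based on scvi-tools conventions:

```python
import torch
from scvi.module import VAEC

module = VAEC(n_input=2000, n_labels=10)     # illustrative sizes
x = torch.randint(0, 50, (8, 2000)).float()  # dummy raw counts
y = torch.randint(0, 10, (8, 1))             # cell-type labels

# Registry-keyed minibatch; in practice this comes from an scvi data loader,
# and the key names here are assumptions.
tensors = {"X": x, "labels": y, "batch": torch.zeros(8, 1)}

inference_outputs = module.inference(x, y)
generative_outputs = module.generative(
    inference_outputs["z"], inference_outputs["library"], y
)
# kl_weight anneals the KL divergence term (e.g. warm-up during training).
loss_output = module.loss(tensors, inference_outputs, generative_outputs, kl_weight=1.0)
```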

VAEC.sample(tensors, n_samples=1)

Generate observation samples from the posterior predictive distribution.

The posterior predictive distribution is written as \(p(\hat{x} \mid x)\).

Parameters:
  • tensors (dict[str, Tensor]) – Dictionary of input tensors.

  • n_samples (int (default: 1)) – Number of samples to generate for each cell.

Return type:

Tensor

Returns:

x_new (torch.Tensor) – Tensor with shape (n_cells, n_genes, n_samples).
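
A minimal posterior-predictive sketch; the registry key names in the tensors dict ("X", "labels", "batch") are assumptions based on scvi-tools conventions:

```python
import torch
from scvi.module import VAEC

module = VAEC(n_input=2000, n_labels=10)     # illustrative sizes
x = torch.randint(0, 50, (8, 2000)).float()  # dummy raw counts
y = torch.randint(0, 10, (8, 1))             # cell-type labels

tensors = {"X": x, "labels": y, "batch": torch.zeros(8, 1)}
x_new = module.sample(tensors, n_samples=3)
print(x_new.shape)  # expected per the docs above: (8, 2000, 3)
```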