scvi.module.VAEC#

class scvi.module.VAEC(n_input, n_labels=0, n_hidden=128, n_latent=5, n_layers=2, log_variational=True, ct_weight=None, dropout_rate=0.05, **module_kwargs)[source]#

Bases: BaseModuleClass

Conditional variational autoencoder model.

This is an implementation of the CondSCVI model.

Parameters:
  • n_input (int) – Number of input genes

  • n_labels (int) – Number of labels

  • n_hidden (Tunable[int]) – Number of nodes per hidden layer

  • n_latent (Tunable[int]) – Dimensionality of the latent space

  • n_layers (Tunable[int]) – Number of hidden layers used for the encoder and decoder NNs

  • dropout_rate (Tunable[float]) – Dropout rate for the encoder and decoder neural networks

  • log_variational (bool) – If True, apply log(data + 1) before encoding, for numerical stability (this is not normalization)

  • ct_weight (ndarray) – Per-cell-type weights applied to each cell's term in the loss; if None, uniform weights are used
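Example (a minimal construction sketch; the gene and label counts below are illustrative placeholders, not defaults):

    from scvi.module import VAEC

    # 2000 genes and 10 cell-type labels are placeholder sizes
    module = VAEC(n_input=2000, n_labels=10, n_hidden=128, n_latent=5)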

Attributes table#

training

Methods table#

generative(z, library, y)

Runs the generative model.

inference(x, y[, n_samples])

High-level inference method.

loss(tensors, inference_outputs, generative_outputs[, kl_weight])

Loss computation.

sample(tensors[, n_samples])

Generate observation samples from the posterior predictive distribution.

Attributes#

training

VAEC.training: bool#

Methods#

generative

VAEC.generative(z, library, y)[source]#

Runs the generative model.
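Example (a hedged sketch: per the signature above, inference() is assumed to expose its outputs under the keys "z" and "library", which generative() consumes; tensor sizes are illustrative):

    import torch
    from scvi.module import VAEC

    module = VAEC(n_input=100, n_labels=3)
    x = torch.randint(0, 20, (8, 100)).float()  # raw counts for 8 cells
    y = torch.randint(0, 3, (8, 1))             # integer cell-type labels

    inference_outputs = module.inference(x=x, y=y)
    generative_outputs = module.generative(
        z=inference_outputs["z"],              # assumed key
        library=inference_outputs["library"],  # assumed key
        y=y,
    )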

inference

VAEC.inference(x, y, n_samples=1)[source]#

High-level inference method.

Runs the inference (encoder) model.
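Example (a sketch of a direct call; in scvi-tools modules, n_samples > 1 conventionally prepends a Monte Carlo sample dimension to the latent draws):

    import torch
    from scvi.module import VAEC

    module = VAEC(n_input=100, n_labels=3)
    x = torch.randint(0, 20, (4, 100)).float()  # raw counts for 4 cells
    y = torch.randint(0, 3, (4, 1))             # integer cell-type labels

    outputs = module.inference(x=x, y=y, n_samples=5)  # 5 draws per cell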

loss

VAEC.loss(tensors, inference_outputs, generative_outputs, kl_weight=1.0)[source]#

Loss computation.

Parameters:
  • tensors – Tensors dict

  • inference_outputs – Outputs of inference()

  • generative_outputs – Outputs of generative()

  • kl_weight (float) – Multiplicative weight applied to the KL divergence term of the loss
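Example (a hedged sketch: the tensor-dict keys "X" and "labels" are assumed to follow the scvi-tools registry conventions, and the inference-output keys are assumed as in the generative example above):

    import torch
    from scvi.module import VAEC

    module = VAEC(n_input=100, n_labels=3)
    tensors = {
        "X": torch.randint(0, 20, (8, 100)).float(),  # counts; key assumed
        "labels": torch.randint(0, 3, (8, 1)),        # labels; key assumed
    }

    inference_outputs = module.inference(x=tensors["X"], y=tensors["labels"])
    generative_outputs = module.generative(
        z=inference_outputs["z"],
        library=inference_outputs["library"],
        y=tensors["labels"],
    )
    # kl_weight < 1 down-weights the KL term, as in KL annealing during training
    loss_output = module.loss(
        tensors, inference_outputs, generative_outputs, kl_weight=0.5
    )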

sample

VAEC.sample(tensors, n_samples=1)[source]#

Generate observation samples from the posterior predictive distribution.

The posterior predictive distribution is written as \(p(\hat{x} \mid x)\).

Parameters:
  • tensors – Tensors dict

  • n_samples – Number of required samples for each cell

Returns:

x_new – Posterior predictive samples with shape (n_cells, n_genes, n_samples)

Return type:

ndarray
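Example (a hedged sketch; the tensor-dict keys are assumed as in the loss example above):

    import torch
    from scvi.module import VAEC

    module = VAEC(n_input=100, n_labels=3)
    tensors = {
        "X": torch.randint(0, 20, (8, 100)).float(),  # counts; key assumed
        "labels": torch.randint(0, 3, (8, 1)),        # labels; key assumed
    }

    # per the docstring, shape is (n_cells, n_genes, n_samples) = (8, 100, 25)
    x_new = module.sample(tensors, n_samples=25)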