scvi.nn.FCLayers#
- class scvi.nn.FCLayers(n_in, n_out, n_cat_list=None, n_cont=0, n_layers=1, n_hidden=128, dropout_rate=0.1, use_batch_norm=True, use_layer_norm=False, use_activation=True, bias=True, inject_covariates=True, activation_fn=<class 'torch.nn.modules.activation.ReLU'>)[source]#
Bases: Module
A helper class to build fully-connected layers for a neural network.
- Parameters:
  - n_in (int) – The dimensionality of the input
  - n_out (int) – The dimensionality of the output
  - n_cat_list (Iterable[int] (default: None)) – A list containing, for each category of interest, the number of categories. Each category will be included using a one-hot encoding.
  - n_cont (int (default: 0)) – The dimensionality of the continuous covariates
  - n_layers (int (default: 1)) – The number of fully-connected hidden layers
  - n_hidden (int (default: 128)) – The number of nodes per hidden layer
  - dropout_rate (float (default: 0.1)) – Dropout rate to apply to each of the hidden layers
  - use_batch_norm (bool (default: True)) – Whether to have BatchNorm layers or not
  - use_layer_norm (bool (default: False)) – Whether to have LayerNorm layers or not
  - use_activation (bool (default: True)) – Whether to have layer activation or not
  - bias (bool (default: True)) – Whether to learn bias in linear layers or not
  - inject_covariates (bool (default: True)) – Whether to inject covariates in each layer, or just the first (default)
  - activation_fn (Module (default: torch.nn.ReLU)) – Which activation function to use
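To illustrate how the n_cat_list covariates enter the network, the standalone sketch below one-hot encodes each categorical covariate and concatenates it to a sample's features, which is how FCLayers augments a layer's input. It is a minimal pure-Python sketch independent of scvi-tools; the helper names are illustrative, not part of the library API.

```python
def one_hot(index, n_cat):
    """Return a one-hot list of length n_cat with a 1.0 at position index."""
    vec = [0.0] * n_cat
    vec[index] = 1.0
    return vec

def augment_input(x, cat_indices, n_cat_list):
    """Concatenate one-hot encodings of each categorical covariate to x.

    x: a single sample's features (list of floats)
    cat_indices: one integer category index per covariate in n_cat_list
    n_cat_list: number of categories for each covariate
    """
    out = list(x)
    for idx, n_cat in zip(cat_indices, n_cat_list):
        out.extend(one_hot(idx, n_cat))
    return out

# A sample with 3 features and two covariates (e.g. 2 batches, 3 labels):
# the augmented input has 3 + 2 + 3 = 8 entries.
augmented = augment_input([0.5, 1.2, -0.3], [1, 2], n_cat_list=[2, 3])
```

With inject_covariates=True this augmentation happens at every hidden layer; with inject_covariates=False only the first layer's input is augmented.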
Attributes table#
Methods table#
| forward | Forward computation on x. |
| inject_into_layer | Helper to determine if covariates should be injected. |
| set_online_update_hooks | Set online update hooks. |
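The covariate-injection decision described above is simple enough to sketch directly. The real helper is a method on FCLayers; this standalone function mirrors its documented behavior (the free-function form is illustrative, assuming the rule "first layer always, deeper layers only when inject_covariates is set"):

```python
def inject_into_layer(layer_num, inject_covariates=True):
    """Decide whether covariates are concatenated to this layer's input.

    The first layer (layer_num == 0) always receives the covariates;
    deeper layers receive them only when inject_covariates is True.
    """
    return layer_num == 0 or inject_covariates
```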
Attributes#
- FCLayers.training: bool#