scvi.autotune.AutotuneExperiment#
- class scvi.autotune.AutotuneExperiment(model_cls, data, metrics, mode, search_space, num_samples, scheduler='asha', searcher='hyperopt', seed=None, resources=None, name=None, logging_dir=None, scheduler_kwargs=None, searcher_kwargs=None)[source]#
BETA
Track hyperparameter tuning experiments.

Parameters:
- **model_cls** (`BaseModelClass`) – Model class on which to tune hyperparameters. Must implement a constructor and a `train` method.
- **data** (`AnnData` | `MuData` | `LightningDataModule`) – `AnnData` or `MuData` that has been set up with `model_cls`, or a `LightningDataModule` (EXPERIMENTAL).
- **metrics** (`str` | `list[str]`) – Either a single metric or a list of metrics to track during the experiment. If a list is provided, the primary metric will be the first element in the list.
- **mode** (`Literal['min', 'max']`) – Optimization mode for the primary metric. One of `"min"` or `"max"`.
- **search_space** (`dict[str, dict[Literal['model_params', 'train_params'], dict[str, Any]]]`) – Dictionary of hyperparameter names and their respective search spaces. See the API for available search specifications. Must only contain the following top-level keys:
  - `"model_params"`: parameters to pass to the model constructor.
  - `"train_params"`: parameters to pass to the model's `train` method.

  Passed into `Tuner` as `param_space`. See the usage sketch after this parameter list.
- **num_samples** (`int`) – Total number of hyperparameter configurations to sample from the search space. Passed into `TuneConfig`.
- **scheduler** (`Literal['asha', 'hyperband', 'median', 'fifo']`, default: `'asha'`) – Ray Tune scheduler to use. One of the following:
  - `"asha"`: `AsyncHyperBandScheduler`
  - `"hyperband"`: `HyperBandScheduler`
  - `"median"`: `MedianStoppingRule`
  - `"fifo"`: `FIFOScheduler`

  Configured with reasonable defaults, which can be overridden with `scheduler_kwargs`.
- **searcher** (`Literal['hyperopt', 'random']`, default: `'hyperopt'`) – Ray Tune search algorithm to use. One of the following:
  - `"hyperopt"`: `HyperOptSearch`
  - `"random"`: `BasicVariantGenerator`

  Configured with reasonable defaults, which can be overridden with `searcher_kwargs`.
- **seed** (`int` | `None`, default: `None`) – Random seed to use for the experiment. Propagated to `scvi.settings.seed` and the search algorithms. If not provided, defaults to `scvi.settings.seed`.
- **resources** (`dict[Literal['cpu', 'gpu', 'memory'], float]` | `None`, default: `None`) – Dictionary of resources to allocate per trial in the experiment. Available keys include:
  - `"cpu"`: number of CPU cores
  - `"gpu"`: number of GPUs
  - `"memory"`: amount of memory

  Passed into `with_resources()`.
- **name** (`str` | `None`, default: `None`) – Name of the experiment, used for logging purposes. Defaults to a unique ID.
- **logging_dir** (`str` | `None`, default: `None`) – Base directory to store experiment logs. Defaults to `scvi.settings.logging_dir`.
- **scheduler_kwargs** (`dict` | `None`, default: `None`) – Additional keyword arguments to pass to the scheduler.
- **searcher_kwargs** (`dict` | `None`, default: `None`) – Additional keyword arguments to pass to the search algorithm.
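A minimal usage sketch, assuming `scvi.model.SCVI` as the model and the autotune extras installed (Ray Tune and hyperopt). The metric name `"validation_loss"`, the search ranges, and the resource counts below are illustrative assumptions, not fixed requirements:

```python
import scvi
from ray import tune

# Small synthetic dataset, registered with the model class.
adata = scvi.data.synthetic_iid()
scvi.model.SCVI.setup_anndata(adata)

# Top-level keys are "model_params" (constructor kwargs) and/or
# "train_params" (kwargs for SCVI.train); values use Ray Tune sample spaces.
search_space = {
    "model_params": {
        "n_hidden": tune.choice([64, 128, 256]),
        "n_latent": tune.randint(5, 30),
    },
    "train_params": {
        "max_epochs": 20,
        "plan_kwargs": {"lr": tune.loguniform(1e-4, 1e-2)},
    },
}

experiment = scvi.autotune.AutotuneExperiment(
    model_cls=scvi.model.SCVI,
    data=adata,
    metrics="validation_loss",  # assumed metric name; use one your model logs
    mode="min",
    search_space=search_space,
    num_samples=10,
    scheduler="asha",
    searcher="hyperopt",
    seed=0,
    resources={"cpu": 4.0, "gpu": 1.0},  # illustrative per-trial allocation
)
```

In practice, `scvi.autotune.run_autotune` is the usual entry point; it builds and runs an experiment like the one above and returns it with results attached.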
Notes
Lifecycle: beta
Attributes table#

| Attribute | Description |
|---|---|
| `data` | Data on which to tune hyperparameters. |
| `id` | Unique identifier for the experiment. |
| `logging_dir` | Base directory to store experiment logs. |
| `metrics` | Metrics to track during the experiment. |
| `mode` | Optimization mode for the primary metric. |
| `model_cls` | Model class on which to tune hyperparameters. |
| `name` | Name of the experiment. |
| `num_samples` | Total number of hyperparameter configurations to sample. |
| `resources` | Resources to allocate per trial in the experiment. |
| `result_grid` | Result grid for the experiment. |
| `scheduler` | Ray Tune scheduler to use. |
| `scheduler_kwargs` | Additional keyword arguments to pass to the scheduler. |
| `search_space` | Search space for hyperparameters. |
| `searcher` | Ray Tune search algorithm to use. |
| `searcher_kwargs` | Additional keyword arguments to pass to the search algorithm. |
| `seed` | Random seed to use for the experiment. |
| `setup_method_args` | Keyword arguments for the setup method. |
| `setup_method_name` | Either `setup_anndata` or `setup_mudata`. |
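Once the experiment has been run, results are exposed through `result_grid`, which in Ray Tune is a `ResultGrid`. A minimal sketch of inspecting it, assuming the experiment above was executed and tracked `"validation_loss"`:

```python
# Assumes `experiment` has already been run, so result_grid is populated.
grid = experiment.result_grid

# Pick the best trial by the primary metric; metric and mode are passed
# explicitly here rather than relying on Ray Tune's stored defaults.
best = grid.get_best_result(metric="validation_loss", mode="min")
print(best.config)   # hyperparameter configuration of the best trial
print(best.metrics)  # final reported metrics for that trial
```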
Methods table#

| Method | Description |
|---|---|
| `get_logger` | Configure TensorBoard logger for a trial in this experiment. |
| `get_tuner` | Configure a `Tuner` from this experiment. |