scvi.autotune.AutotuneExperiment#

class scvi.autotune.AutotuneExperiment(model_cls, data, metrics, mode, search_space, num_samples, scheduler='asha', searcher='hyperopt', seed=None, resources=None, name=None, logging_dir=None, scheduler_kwargs=None, searcher_kwargs=None)[source]#

BETA Track hyperparameter tuning experiments.

Parameters:
  • model_cls (BaseModelClass) – Model class on which to tune hyperparameters. Must implement a constructor and a train method.

  • data (Union[AnnData, MuData, LightningDataModule]) – AnnData or MuData that has been setup with model_cls or a LightningDataModule (EXPERIMENTAL).

  • metrics (str | list[str]) – Either a single metric or a list of metrics to track during the experiment. If a list is provided, the primary metric will be the first element in the list.

  • mode (Literal['min', 'max']) – Optimization mode for the primary metric. One of "min" or "max".

  • search_space (dict[str, dict[Literal['model_args', 'train_args'], dict[str, Any]]]) –

    Dictionary of hyperparameter names and their respective search spaces. See the Ray Tune search space API for available search specifications. Must only contain the following top-level keys:

    • "model_args": parameters to pass to the model constructor.

    • "train_args": parameters to pass to the model’s train method.

    Passed into Tuner as param_space.

  • num_samples (int) – Total number of hyperparameter configurations to sample from the search space. Passed into TuneConfig.

  • scheduler (Literal['asha', 'hyperband', 'median', 'fifo'] (default: 'asha')) –

    Ray Tune scheduler to use. One of the following:

    • "asha": AsyncHyperBandScheduler

    • "hyperband": HyperBandScheduler

    • "median": MedianStoppingRule

    • "fifo": FIFOScheduler

    Configured with reasonable defaults, which can be overridden with scheduler_kwargs.

  • searcher (Literal['hyperopt', 'random'] (default: 'hyperopt')) –

    Ray Tune search algorithm to use. One of the following:

    • "hyperopt": HyperOptSearch

    • "random": BasicVariantGenerator

    Configured with reasonable defaults, which can be overridden with searcher_kwargs.

  • seed (int | None (default: None)) – Random seed to use for the experiment. Propagated to scvi.settings.seed and to search algorithms. If not provided, defaults to scvi.settings.seed.

  • resources (dict[Literal['cpu', 'gpu', 'memory'], float] | None (default: None)) –

    Dictionary of resources to allocate per trial in the experiment. Available keys include:

    • "cpu": number of CPU cores

    • "gpu": number of GPUs

    • "memory": amount of memory

    Passed into with_resources().

  • name (str | None (default: None)) – Name of the experiment, used for logging purposes. Defaults to a unique ID.

  • logging_dir (str | None (default: None)) – Base directory to store experiment logs. Defaults to scvi.settings.logging_dir.

  • scheduler_kwargs (dict | None (default: None)) – Additional keyword arguments to pass to the scheduler.

  • searcher_kwargs (dict | None (default: None)) – Additional keyword arguments to pass to the search algorithm.

Notes

Lifecycle: beta

See also

run_autotune()
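For orientation, a minimal sketch of how an experiment might be constructed. The search_space dict below follows the shape documented above (top-level "model_args" / "train_args" keys only); in practice the leaf values would be Ray Tune search specifications such as tune.choice or tune.loguniform, shown here as plain placeholders, and the model and data names (SCVI, adata) are assumptions for illustration:

```python
# Hypothetical search space. Top-level keys are restricted to
# "model_args" (passed to the model constructor) and "train_args"
# (passed to the model's train method). Leaf values are placeholders;
# real code would use Ray Tune specs, e.g. tune.choice([64, 128, 256]).
search_space = {
    "model_args": {
        "n_hidden": [64, 128, 256],  # in practice: tune.choice([64, 128, 256])
        "n_latent": (5, 30),         # in practice: tune.randint(5, 30)
    },
    "train_args": {
        "max_epochs": 100,           # fixed values are also allowed
    },
}

# The experiment would then be built along these lines (commented out
# because it requires scvi-tools, Ray, and data set up with the model):
# experiment = scvi.autotune.AutotuneExperiment(
#     model_cls=scvi.model.SCVI,
#     data=adata,                  # AnnData set up via SCVI.setup_anndata
#     metrics="validation_loss",   # first metric is the primary one
#     mode="min",
#     search_space=search_space,
#     num_samples=20,
# )
```

Note that the two top-level keys are optional individually, but no other keys are accepted.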

Attributes table#

data

Data on which to tune hyperparameters.

id

Unique identifier for the experiment.

logging_dir

Base directory to store experiment logs.

metrics

Metrics to track during the experiment.

metrics_callback

mode

Optimization mode for the primary metric.

model_cls

Model class on which to tune hyperparameters.

name

Name of the experiment.

num_samples

Total number of hyperparameter configurations to sample.

resources

Resources to allocate per trial in the experiment.

result_grid

Result grid for the experiment.

scheduler

Ray Tune scheduler to use.

scheduler_kwargs

Additional keyword arguments to pass to the scheduler.

search_space

Search space for hyperparameters.

searcher

Ray Tune search algorithm to use.

searcher_kwargs

Additional keyword arguments to pass to the search algorithm.

seed

Random seed to use for the experiment.

setup_method_args

Keyword arguments for the setup method.

setup_method_name

Either "setup_anndata" or "setup_mudata".

Methods table#

get_logger(trial_name)

Configure TensorBoard logger for a trial in this experiment.

get_tuner()

Configure a Tuner from this experiment.

Attributes#

AutotuneExperiment.data[source]#

Data on which to tune hyperparameters.

AutotuneExperiment.id[source]#

Unique identifier for the experiment.

AutotuneExperiment.logging_dir[source]#

Base directory to store experiment logs.

AutotuneExperiment.metrics[source]#

Metrics to track during the experiment.

AutotuneExperiment.metrics_callback[source]#
AutotuneExperiment.mode[source]#

Optimization mode for the primary metric.

AutotuneExperiment.model_cls[source]#

Model class on which to tune hyperparameters.

AutotuneExperiment.name[source]#

Name of the experiment.

AutotuneExperiment.num_samples[source]#

Total number of hyperparameter configurations to sample.

AutotuneExperiment.resources[source]#

Resources to allocate per trial in the experiment.

AutotuneExperiment.result_grid[source]#

Result grid for the experiment.

AutotuneExperiment.scheduler[source]#

Ray Tune scheduler to use.

AutotuneExperiment.scheduler_kwargs[source]#

Additional keyword arguments to pass to the scheduler.

AutotuneExperiment.search_space[source]#

Search space for hyperparameters.

AutotuneExperiment.searcher[source]#

Ray Tune search algorithm to use.

AutotuneExperiment.searcher_kwargs[source]#

Additional keyword arguments to pass to the search algorithm.

AutotuneExperiment.seed[source]#

Random seed to use for the experiment.

AutotuneExperiment.setup_method_args[source]#

Keyword arguments for the setup method.

AutotuneExperiment.setup_method_name[source]#

Either "setup_anndata" or "setup_mudata".

Methods#

AutotuneExperiment.get_logger(trial_name)[source]#

Configure TensorBoard logger for a trial in this experiment.

Return type:

TensorBoardLogger

AutotuneExperiment.get_tuner()[source]#

Configure a Tuner from this experiment.

Return type:

Tuner
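A hedged sketch of driving the Tuner returned by get_tuner(). The resources dict follows the shape of the resources parameter documented above (keys limited to "cpu", "gpu", "memory"); the Ray calls are commented out because they require scvi-tools and a running Ray session:

```python
# Hypothetical per-trial resource allocation, matching the `resources`
# parameter: each trial would receive 4 CPU cores and 1 GPU.
resources = {"cpu": 4.0, "gpu": 1.0}

# With a configured experiment, tuning would run along these lines:
# tuner = experiment.get_tuner()  # returns a configured ray.tune.Tuner
# result_grid = tuner.fit()       # runs num_samples trials
# (the experiment's result_grid attribute holds results after tuning)
```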