scvi.autotune.run_autotune

scvi.autotune.run_autotune(model_cls, data, metrics, mode, search_space, num_samples, scheduler='asha', searcher='hyperopt', seed=None, resources=None, experiment_name=None, logging_dir=None, scheduler_kwargs=None, searcher_kwargs=None, scib_stage='train_end', scib_subsample_rows=5000, scib_indices_list=None, log_to_driver=False, local_mode=False, ignore_reinit_error=False)

BETA: Run a hyperparameter sweep.

Parameters:
  • model_cls (BaseModelClass) – Model class on which to tune hyperparameters.

  • data (AnnData | MuData | LightningDataModule) – AnnData or MuData that has been setup with model_cls or a LightningDataModule (EXPERIMENTAL).

  • metrics (str | list[str]) – Either a single metric or a list of metrics to track during the experiment. If a list is provided, the primary metric will be the first element in the list.

  • mode (Literal['min', 'max']) – Optimization mode for the primary metric. One of "min" or "max".

  • search_space (dict[str, dict[Literal['model_params', 'train_params'], dict[str, Any]]]) –

    Dictionary of hyperparameter names and their respective search spaces. See the API for available search specifications. Must only contain the following top-level keys:

    • "model_params": parameters to pass to the model constructor.

    • "train_params": parameters to pass to the model’s train method.

    Passed into Tuner as param_space.
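As a sketch, a search_space dictionary for a hypothetical model might look as follows. The hyperparameter names are illustrative and depend on the model class being tuned; in practice the leaf values would be Ray Tune sampling functions such as tune.choice or tune.loguniform, for which plain placeholder values stand in here so the snippet runs without Ray installed:

```python
# Sketch of the expected search_space structure. Only the two top-level
# keys are allowed: "model_params" and "train_params".
search_space = {
    "model_params": {
        # Passed to the model constructor; with Ray installed these would
        # be e.g. tune.choice([64, 128, 256]).
        "n_hidden": [64, 128, 256],
        "n_latent": [10, 20, 30],
    },
    "train_params": {
        # Passed to the model's train method.
        "max_epochs": 100,
    },
}

# The dictionary must contain only the two permitted top-level keys.
assert set(search_space) <= {"model_params", "train_params"}
```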

  • num_samples (int) – Total number of hyperparameter configurations to sample from the search space. Passed into TuneConfig.

  • scheduler (Literal['asha', 'hyperband', 'median', 'fifo'] (default: 'asha')) –

    Ray Tune scheduler to use. One of the following:

    • "asha": AsyncHyperBandScheduler

    • "hyperband": HyperBandScheduler

    • "median": MedianStoppingRule

    • "fifo": FIFOScheduler

    Configured with reasonable defaults, which can be overridden with scheduler_kwargs.

  • searcher (Literal['hyperopt', 'random'] (default: 'hyperopt')) –

    Ray Tune search algorithm to use. One of the following:

    • "hyperopt": HyperOptSearch

    • "random": BasicVariantGenerator

    Configured with reasonable defaults, which can be overridden with searcher_kwargs.

  • seed (int | None (default: None)) – Random seed to use for the experiment. Propagated to scvi.settings.seed and to the search algorithms. If not provided, defaults to scvi.settings.seed.

  • resources (dict[Literal['cpu', 'gpu', 'memory'], float] | None (default: None)) –

    Dictionary of resources to allocate per trial in the experiment. Available keys include:

    • "cpu": number of CPU cores

    • "gpu": number of GPUs

    • "memory": amount of memory

    Passed into with_resources().
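For example, a per-trial allocation of four CPU cores and one GPU might be written as follows (values are illustrative; the assumption here, following Ray's convention, is that "memory" is given in bytes):

```python
# Illustrative per-trial resource allocation for run_autotune's
# `resources` argument: four CPU cores, one GPU, and 8 GiB of memory
# (assumed to be in bytes, per Ray's resource convention).
resources = {
    "cpu": 4.0,
    "gpu": 1.0,
    "memory": 8 * 1024**3,
}
```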

  • experiment_name (str | None (default: None)) – Name of the experiment, used for logging purposes. Defaults to a unique ID concatenated to the model class name.

  • logging_dir (str | None (default: None)) – Base directory to store experiment logs. Defaults to scvi.settings.logging_dir.

  • scheduler_kwargs (dict | None (default: None)) – Additional keyword arguments to pass to the scheduler.

  • searcher_kwargs (dict | None (default: None)) – Additional keyword arguments to pass to the search algorithm.

  • scib_stage (str | None (default: 'train_end')) – When tuning with scib-metrics, selects whether the metrics are computed at "validation_end" or "train_end" (default).

  • scib_subsample_rows (int | None (default: 5000)) – When tuning with scib-metrics, the number of rows to subsample (default 5000). Subsampling keeps computation time manageable.

  • scib_indices_list (list | None (default: None)) – If provided, the scib metrics are computed on these indices; otherwise, scib_subsample_rows indices are selected at random.

  • log_to_driver (bool (default: False)) – If true, the output from all worker processes on all nodes is directed to the driver.

  • local_mode (bool (default: False)) – Deprecated: consider using the Ray Debugger instead.

  • ignore_reinit_error (bool (default: False)) – If true, Ray suppresses errors from calling ray.init() a second time; Ray will not be restarted.

Return type:

AutotuneExperiment

Returns:

AutotuneExperiment object containing the results of the hyperparameter sweep in result_grid.

Notes

Lifecycle: beta
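Putting the parameters together, a minimal sweep might look like the following sketch. It assumes scvi-tools and Ray are installed (imports are kept local to the function for that reason); the model class, metric name, and search space are illustrative:

```python
def tune_scvi(adata):
    """Sketch: run a small hyperparameter sweep over an SCVI model.

    Assumes scvi-tools and ray are installed; the metric name
    ("validation_loss"), the hyperparameters, and the trial budget
    are illustrative choices, not prescriptions.
    """
    import scvi
    from ray import tune

    # The AnnData must be set up with the model class before tuning.
    scvi.model.SCVI.setup_anndata(adata)

    experiment = scvi.autotune.run_autotune(
        model_cls=scvi.model.SCVI,
        data=adata,
        metrics=["validation_loss"],  # first element is the primary metric
        mode="min",
        search_space={
            "model_params": {"n_latent": tune.choice([10, 20, 30])},
            "train_params": {"max_epochs": 20},
        },
        num_samples=5,              # total configurations to sample
        scheduler="asha",
        searcher="hyperopt",
        seed=0,
        resources={"cpu": 4.0, "gpu": 1.0},
    )
    # Per-trial results live on the returned AutotuneExperiment.
    return experiment.result_grid
```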