:orphan:

.. _train-tune-deprecated-api:

Hyperparameter Optimization
===========================

.. important::

    This user guide covers the deprecated Train + Tune integration. See :ref:`train-tune` for the new API user guide.
    Please see :ref:`here <train-tune-deprecation>` for information about the deprecation and migration.

Hyperparameter tuning with :ref:`Ray Tune <tune-main>` is natively supported with Ray Train.

.. https://docs.google.com/drawings/d/1yMd12iMkyo6DGrFoET1TIlKfFnXX9dfh2u3GSdTz6W4/edit

.. figure:: ../images/train-tuner.svg
    :align: center

    The `Tuner` will take in a `Trainer` and execute multiple training runs, each with different hyperparameter configurations.

There are a number of key concepts when doing hyperparameter optimization with a :class:`~ray.tune.Tuner`.
The `Tuner` takes in a search space of hyperparameters, launches one trial per sampled configuration, and returns its results in a :class:`~ray.tune.ResultGrid`.

.. note::

    Tuners can also be used to launch hyperparameter tuning without using Ray Train. See
    :ref:`the Ray Tune documentation <tune-main>` for more guides and examples.

You can take an existing :class:`Trainer <ray.train.base_trainer.BaseTrainer>` and simply
pass it into a :class:`~ray.tune.Tuner`.

.. literalinclude:: ../doc_code/tuner.py
    :language: python
    :start-after: basic_start
    :end-before: basic_end

There are two main configuration objects that can be passed into a Tuner: the :class:`TuneConfig <ray.tune.TuneConfig>` and the :class:`~ray.tune.RunConfig`.

The :class:`TuneConfig <ray.tune.TuneConfig>` contains tuning specific settings, including:

- the tuning algorithm to use
- the metric and mode to rank results
- the amount of parallelism to use

Here are some common configurations for ``TuneConfig``:

.. literalinclude:: ../doc_code/tuner.py
    :language: python
    :start-after: tune_config_start
    :end-before: tune_config_end

See the :class:`TuneConfig API reference <ray.tune.TuneConfig>` for more details.

The :class:`~ray.tune.RunConfig` contains configurations that are more generic than tuning specific settings.
This includes:

- the experiment name and storage path for results
- stopping conditions
- checkpoint configurations
- failure/retry configurations

Below we showcase some common configurations of :class:`~ray.tune.RunConfig`.

.. literalinclude:: ../doc_code/tuner.py
    :language: python
    :start-after: run_config_start
    :end-before: run_config_end

A Tuner takes in a ``param_space`` argument where you can define the search space
from which hyperparameter configurations will be sampled.
Depending on the model and dataset, you may want to tune:

- the training batch size
- the learning rate (e.g., for SGD-based training such as image classification)
- the maximum depth (e.g., for tree-based models such as XGBoost)

You can use a Tuner to tune most arguments and configurations for Ray Train, including but not limited to:

- :class:`Datasets <ray.data.Dataset>`
- scaling configurations (:class:`~ray.train.ScalingConfig`)
- and other hyperparameters

Read more about :ref:`Tune search spaces here <tune-search-space-tutorial>`.

There are a couple of gotchas about parameter specification when using Tuners with Trainers:

- Parameters set by the Trainer will be overwritten by the values specified in ``param_space``.
- **Exception:** all arguments of :class:`~ray.tune.RunConfig` and :class:`~ray.tune.TuneConfig` are inherently un-tunable.

See :doc:`/tune/tutorials/tune_get_data_in_and_out` for an example.

Tuners also offer the ability to tune over different data preprocessing steps and different training/validation datasets, as shown in the following snippet.

.. literalinclude:: ../doc_code/tuner.py
    :language: python
    :start-after: tune_dataset_start
    :end-before: tune_dataset_end