.. _tune-examples-ref:

.. _tune-recipes:

=================
Ray Tune Examples
=================
.. tip::

    See :ref:`tune-main` to learn more about Tune features.
Below are examples for using Ray Tune for a variety of use cases, sorted by category:
- :ref:`ML frameworks <ml-frameworks>`
- :ref:`Experiment tracking tools <experiment-tracking-tools>`
- :ref:`Hyperparameter optimization frameworks <hyperparameter-optimization-frameworks>`
- :ref:`Others <tune-examples-others>`
- :ref:`Exercises <tune-examples-exercises>`

.. _ml-frameworks:

ML frameworks
-------------
.. toctree::
    :hidden:

    PyTorch Example <tune-pytorch-cifar>
    PyTorch Lightning Example <tune-pytorch-lightning>
    XGBoost Example <tune-xgboost>
    LightGBM Example <lightgbm_example>
    Hugging Face Transformers Example <pbt_transformers>
    Ray RLlib Example <pbt_ppo_example>
    Keras Example <tune_mnist_keras>
    PyTorch with ASHA <tune_pytorch_asha/content/tune_pytorch_asha>
Ray Tune integrates with many popular machine learning frameworks. The following practical examples show how to tune your models in each of them. At the end of these guides, you will often find links to even more examples.
.. list-table::

    * - :doc:`How to use Tune with Keras and TensorFlow models <tune_mnist_keras>`
    * - :doc:`How to use Tune with PyTorch models <tune-pytorch-cifar>`
    * - :doc:`How to tune PyTorch Lightning models <tune-pytorch-lightning>`
    * - :doc:`Tuning RL experiments with Ray Tune and Ray Serve <pbt_ppo_example>`
    * - :doc:`Tuning XGBoost parameters with Tune <tune-xgboost>`
    * - :doc:`Tuning LightGBM parameters with Tune <lightgbm_example>`
    * - :doc:`Tuning Hugging Face Transformers with Tune <pbt_transformers>`
    * - :doc:`Hyperparameter tuning with PyTorch and ASHA <tune_pytorch_asha/content/tune_pytorch_asha>`

.. _experiment-tracking-tools:

Experiment tracking tools
-------------------------
.. toctree::
    :hidden:

    Weights & Biases Example <tune-wandb>
    MLflow Example <tune-mlflow>
    Aim Example <tune-aim>
    Comet Example <tune-comet>
Ray Tune integrates with popular experiment tracking and management tools,
such as CometML or Weights & Biases. For how
to use Ray Tune with TensorBoard, see
:ref:`Guide to logging and outputs <tune-logging>`.
.. list-table::

    * - :doc:`Using Aim with Ray Tune for experiment management <tune-aim>`
    * - :doc:`Using Comet with Ray Tune for experiment management <tune-comet>`
    * - :doc:`Tracking your experiment process with Weights & Biases <tune-wandb>`
    * - :doc:`Using MLflow tracking and auto logging with Tune <tune-mlflow>`

.. _hyperparameter-optimization-frameworks:

Hyperparameter optimization frameworks
--------------------------------------
.. toctree::
    :hidden:

    Ax Example <ax_example>
    HyperOpt Example <hyperopt_example>
    Bayesopt Example <bayesopt_example>
    BOHB Example <bohb_example>
    Nevergrad Example <nevergrad_example>
    Optuna Example <optuna_example>
Tune integrates with a wide variety of hyperparameter optimization frameworks and their respective search algorithms. See the following detailed examples for each integration:
.. list-table::

    * - :doc:`ax_example`
    * - :doc:`hyperopt_example`
    * - :doc:`bayesopt_example`
    * - :doc:`bohb_example`
    * - :doc:`nevergrad_example`
    * - :doc:`optuna_example`

.. _tune-examples-others:

Others
------
.. list-table::

    * - :doc:`Simple example for doing a basic random and grid search <includes/tune_basic_example>`
    * - :doc:`Example of using a simple tuning function with AsyncHyperBandScheduler <includes/async_hyperband_example>`
    * - :doc:`Example of using a trainable function with HyperBandScheduler and the AsyncHyperBandScheduler <includes/hyperband_function_example>`
    * - :doc:`Configuring and running (synchronous) PBT and understanding the underlying algorithm behavior with a simple example <pbt_visualization/pbt_visualization>`
    * - :doc:`includes/pbt_function`
    * - :doc:`includes/pb2_example`
    * - :doc:`includes/logging_example`

.. _tune-examples-exercises:

Exercises
---------
Learn how to use Tune in your browser with the following Colab-based exercises.
.. list-table::
    :widths: 50 30 20
    :header-rows: 1
Tutorial source files are `on GitHub <https://github.com/ray-project/tutorial>`_.