.. _tune-examples-ref:

.. _tune-recipes:

Ray Tune Examples
=================

.. tip:: See :ref:`tune-main` to learn more about Tune features.

Below are examples of using Ray Tune for a variety of use cases, sorted by category:

- :ref:`ML frameworks <ml-frameworks>`
- :ref:`Experiment tracking tools <experiment-tracking-tools>`
- :ref:`Hyperparameter optimization frameworks <hyperparameter-optimization-frameworks>`
- :ref:`Others <tune-examples-others>`
- :ref:`Exercises <tune-examples-exercises>`

.. _ml-frameworks:

ML frameworks
-------------

.. toctree::
    :hidden:

    PyTorch Example <tune-pytorch-cifar>
    PyTorch Lightning Example <tune-pytorch-lightning>
    XGBoost Example <tune-xgboost>
    LightGBM Example <lightgbm_example>
    Hugging Face Transformers Example <pbt_transformers>
    Ray RLlib Example <pbt_ppo_example>
    Keras Example <tune_mnist_keras>
    PyTorch with ASHA <tune_pytorch_asha/content/tune_pytorch_asha>

Ray Tune integrates with many popular machine learning frameworks. The following practical examples show how to tune models built with these frameworks. At the end of these guides, you will often find links to even more examples.

.. list-table::

    * - :doc:`How to use Tune with Keras and TensorFlow models <tune_mnist_keras>`
    * - :doc:`How to use Tune with PyTorch models <tune-pytorch-cifar>`
    * - :doc:`How to tune PyTorch Lightning models <tune-pytorch-lightning>`
    * - :doc:`Tuning RL experiments with Ray Tune and Ray Serve <pbt_ppo_example>`
    * - :doc:`Tuning XGBoost parameters with Tune <tune-xgboost>`
    * - :doc:`Tuning LightGBM parameters with Tune <lightgbm_example>`
    * - :doc:`Tuning Hugging Face Transformers with Tune <pbt_transformers>`
    * - :doc:`Hyperparameter tuning with PyTorch and ASHA <tune_pytorch_asha/content/tune_pytorch_asha>`

.. _experiment-tracking-tools:

Experiment tracking tools
-------------------------

.. toctree::
    :hidden:

    Weights & Biases Example <tune-wandb>
    MLflow Example <tune-mlflow>
    Aim Example <tune-aim>
    Comet Example <tune-comet>

Ray Tune integrates with popular experiment tracking and management tools, such as Comet and Weights & Biases. To use Ray Tune with TensorBoard, see the :ref:`Guide to logging and outputs <tune-logging>`.

.. list-table::

    * - :doc:`Using Aim with Ray Tune for experiment management <tune-aim>`
    * - :doc:`Using Comet with Ray Tune for experiment management <tune-comet>`
    * - :doc:`Tracking your experiment with Weights & Biases <tune-wandb>`
    * - :doc:`Using MLflow tracking and auto logging with Tune <tune-mlflow>`

.. _hyperparameter-optimization-frameworks:

Hyperparameter optimization frameworks
--------------------------------------

.. toctree::
    :hidden:

    Ax Example <ax_example>
    HyperOpt Example <hyperopt_example>
    Bayesopt Example <bayesopt_example>
    BOHB Example <bohb_example>
    Nevergrad Example <nevergrad_example>
    Optuna Example <optuna_example>

Tune integrates with a wide variety of hyperparameter optimization frameworks and their respective search algorithms. See the following detailed examples for each integration:

.. list-table::

    * - :doc:`ax_example`
    * - :doc:`hyperopt_example`
    * - :doc:`bayesopt_example`
    * - :doc:`bohb_example`
    * - :doc:`nevergrad_example`
    * - :doc:`optuna_example`

.. _tune-examples-others:

Others
------

.. list-table::

    * - :doc:`Simple example for doing a basic random and grid search <includes/tune_basic_example>`
    * - :doc:`Example of using a simple tuning function with AsyncHyperBandScheduler <includes/async_hyperband_example>`
    * - :doc:`Example of using a trainable function with HyperBandScheduler and the AsyncHyperBandScheduler <includes/hyperband_function_example>`
    * - :doc:`Configuring and running (synchronous) PBT and understanding the underlying algorithm behavior with a simple example <pbt_visualization/pbt_visualization>`
    * - :doc:`includes/pbt_function`
    * - :doc:`includes/pb2_example`
    * - :doc:`includes/logging_example`

.. _tune-examples-exercises:

Exercises
---------

Learn how to use Tune in your browser with the following Colab-based exercises.


`Tutorial source files are on GitHub <https://github.com/ray-project/tutorial>`_.