---
title: Optuna
---
[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../fundamentals/hpo.md) framework
that supports a variety of samplers, including grid search, random search, Bayesian optimization, and evolutionary
algorithms. You can use Optuna as the search strategy in ClearML's automated hyperparameter optimization.
The [HyperParameterOptimizer](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class contains ClearML's
hyperparameter optimization modules. Its modular design supports different optimizers, including existing software
frameworks like Optuna, enabling simple, accurate, and fast hyperparameter optimization. The Optuna optimizer
([`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md)) lets you efficiently
optimize many hyperparameters simultaneously by relying on early stopping (pruning) and smart resource allocation.
To use Optuna with ClearML's hyperparameter optimization, first install the `optuna` package (e.g. `pip install optuna`).
Then, when you instantiate `HyperParameterOptimizer`, pass `OptimizerOptuna` as the `optimizer_class` argument:
```python
from clearml.automation import (
    DiscreteParameterRange, HyperParameterOptimizer, UniformIntegerParameterRange
)
from clearml.automation.optuna import OptimizerOptuna

an_optimizer = HyperParameterOptimizer(
    # This is the base experiment we want to optimize
    base_task_id=args['template_task_id'],
    # Hyperparameters to sample, and their value ranges
    hyper_parameters=[
        UniformIntegerParameterRange('layer_1', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('batch_size', values=[96, 128, 160]),
        DiscreteParameterRange('epochs', values=[30]),
    ],
    # The objective metric to maximize
    objective_metric_title='validation',
    objective_metric_series='accuracy',
    objective_metric_sign='max',
    max_number_of_concurrent_tasks=2,
    optimizer_class=OptimizerOptuna,  # use Optuna as the search strategy
    execution_queue='1xGPU',
    total_max_jobs=10,
)
```
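Once configured, the optimizer can be launched programmatically. The following is a minimal sketch, assuming a reachable ClearML server, an agent listening on the `1xGPU` queue, and the `an_optimizer` object created above; the project and task names are hypothetical:

```python
from clearml import Task

# Create a controller task that logs the optimization process itself
# (project/task names here are placeholders for illustration)
task = Task.init(
    project_name='Hyperparameter Optimization',
    task_name='Optuna HPO controller',
    task_type=Task.TaskTypes.optimizer,
)

# Report progress roughly every 10 minutes, and cap the total runtime
an_optimizer.set_report_period(10)
an_optimizer.set_time_limit(in_minutes=120.0)

# Launch the optimization and block until it finishes
an_optimizer.start()
an_optimizer.wait()

# Retrieve the best-performing experiments, then stop any still running
top_experiments = an_optimizer.get_top_experiments(top_k=3)
an_optimizer.stop()
```

Because `OptimizerOptuna` manages pruning internally, underperforming trials are stopped early without any extra configuration in this script.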
See the Hyperparameter Optimization [tutorial](../guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md).