---
title: Hyperparameter Optimization
---
You can automate and boost hyperparameter optimization (HPO) with ClearML's
[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class, which takes care of the entire optimization process
with a simple interface.

ClearML's approach to hyperparameter optimization is scalable and easy to set up and manage, and it makes comparing
results straightforward.

### Workflow

2023-08-09 10:28:25 +00:00
The preceding diagram demonstrates the typical flow of hyperparameter optimization where the parameters of a base task are optimized:
2021-05-13 23:48:51 +00:00
2024-05-21 08:30:46 +00:00
1. Configure an Optimization Task with a base task whose parameters will be optimized, optimization targets, and a set of parameter values to test
1. Clone the base task. The parameters of each clone are overridden with values set by the optimization task
1. Enqueue each clone for execution by a ClearML Agent
1. The Optimization Task records and monitors the cloned tasks' configuration and execution details, and returns a
   summary of the optimization results in tabular and parallel coordinate formats, and in a scalar plot.
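
The `HyperParameterOptimizer` automates these steps for you. As a rough, illustrative sketch of what happens under the
hood in steps 2 and 3, the snippet below clones a base task, overrides one of its hyperparameters, and enqueues the
clone. The task ID, parameter name, and queue name are placeholders.

```python
from clearml import Task

# NOTE: illustrative sketch only - HyperParameterOptimizer performs these steps automatically.
# 'BASE_TASK_ID', the parameter name, and the queue name below are placeholders.
base_task = Task.get_task(task_id='BASE_TASK_ID')

# clone the base task and override one of its hyperparameters
cloned_task = Task.clone(source_task=base_task, name='HPO trial: batch_size=8')
cloned_task.set_parameters({'General/batch_size': 8})

# enqueue the clone so a ClearML Agent picks it up and executes it
Task.enqueue(cloned_task, queue_name='default')
```

In practice you do not write this loop yourself; the optimizer clones, overrides, and enqueues trials according to the
configuration described in the example further below.
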
<Collapsible title="Parallel coordinate and scalar plots" type="screenshot">

</Collapsible>

### Supported Optimizers
The `HyperParameterOptimizer` class contains ClearML's hyperparameter optimization modules. Its modular design lets you
plug in different optimizers, including existing software frameworks, for simple, accurate, and fast hyperparameter
optimization. The import path for each optimizer class is shown in the snippet after the following list:

* **Optuna** - [`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md). Optuna is the default optimizer in ClearML. It makes use of
  different samplers such as grid search, random, Bayesian, and evolutionary algorithms.
  For more information, see the [Optuna](https://optuna.readthedocs.io/en/latest/) documentation.
* **BOHB** - [`automation.hpbandster.OptimizerBOHB`](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md). BOHB performs robust and efficient hyperparameter optimization
  at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
  For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
  documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
* **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
* **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/clearml/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
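
You select a strategy by passing the class as the `optimizer_class` argument of `HyperParameterOptimizer`. For
reference, the snippet below shows the import path for each of the classes above; note that Optuna and BOHB also
require their respective backend packages to be installed (e.g. `pip install optuna` / `pip install hpbandster`).

```python
# built-in search strategy classes and their import paths
from clearml.automation import GridSearch, RandomSearch
from clearml.automation.optuna import OptimizerOptuna    # requires the `optuna` package
from clearml.automation.hpbandster import OptimizerBOHB  # requires the `hpbandster` package

# the chosen class is passed to the controller, e.g.:
#   HyperParameterOptimizer(..., optimizer_class=OptimizerOptuna)
```
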
## Defining a Hyperparameter Optimization Search Example
1. Import ClearML's automation modules:
   ```python
   from clearml.automation import UniformParameterRange, UniformIntegerParameterRange
   from clearml.automation import HyperParameterOptimizer
   from clearml.automation.optuna import OptimizerOptuna
   ```
1. Initialize the Task, which will be stored in ClearML Server when the code runs. After the code runs at least once,
   it can be reproduced, and the parameters can be tuned:
   ```python
   from clearml import Task

   task = Task.init(
       project_name='Hyper-Parameter Optimization',
       task_name='Automatic Hyper-Parameter Optimization',
       task_type=Task.TaskTypes.optimizer,
       reuse_last_task_id=False
   )
   ```
1. Define the optimization configuration and resources budget:
   ```python
   optimizer = HyperParameterOptimizer(
       # specifying the task to be optimized, task must be in system already so it can be cloned
       base_task_id=TEMPLATE_TASK_ID,
       # setting the hyperparameters to optimize
       hyper_parameters=[
           UniformIntegerParameterRange('number_of_epochs', min_value=2, max_value=12, step_size=2),
           UniformIntegerParameterRange('batch_size', min_value=2, max_value=16, step_size=2),
           UniformParameterRange('dropout', min_value=0, max_value=0.5, step_size=0.05),
           UniformParameterRange('base_lr', min_value=0.00025, max_value=0.01, step_size=0.00025),
       ],
       # setting the objective metric we want to maximize/minimize
       objective_metric_title='accuracy',
       objective_metric_series='total',
       objective_metric_sign='max',
       # setting optimizer
       optimizer_class=OptimizerOptuna,
       # configuring optimization parameters
       execution_queue='default',
       max_number_of_concurrent_tasks=2,
       optimization_time_limit=60.,
       compute_time_limit=120,
       total_max_jobs=20,
       min_iteration_per_job=15000,
       max_iteration_per_job=150000,
   )
   ```

   :::tip Locating Task ID
   To locate the base task's ID, go to the task's info panel in the [WebApp](../webapp/webapp_overview.md). The ID appears
   in the task header.
   :::

   :::tip Multi-objective Optimization
   If you are using the Optuna framework (see [Supported Optimizers](#supported-optimizers)), you can list multiple optimization objectives.
   When doing so, make sure the `objective_metric_title`, `objective_metric_series`, and `objective_metric_sign` lists
   are the same length. Each title will be matched to its respective series and sign.

   For example, the code below sets two objectives: to minimize the `validation/loss` metric and to maximize the `validation/accuracy` metric.
   ```python
   objective_metric_title=["validation", "validation"]
   objective_metric_series=["loss", "accuracy"]
   objective_metric_sign=["min", "max"]
   ```
   :::
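
   The hyperparameter names passed to the parameter ranges above must match the names under which the base task
   logged them; depending on how the base task connected its parameters, they may carry a section prefix (for
   example `Args/...` or `General/...`). As a minimal sketch, you can fetch the base task and print its parameters
   to confirm the exact names:

   ```python
   from clearml import Task

   # print the base task's hyperparameters to confirm the exact names to optimize
   # (TEMPLATE_TASK_ID is the same base task ID used in the optimizer configuration above)
   base_task = Task.get_task(task_id=TEMPLATE_TASK_ID)
   print(base_task.get_parameters())
   ```
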
## Optimizer Execution Options
The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](../fundamentals/agents_and_queues.md#what-is-a-queue).

Start a `HyperParameterOptimizer` instance using either [`HyperParameterOptimizer.start()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start)
or [`HyperParameterOptimizer.start_locally()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start_locally).
Both methods run the optimizer controller locally. `start()` launches the base task clones through a queue
specified when instantiating the controller, while `start_locally()` runs the tasks locally.
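
For example, a typical sequence (a hedged sketch based on the configuration above; the report period and `top_k`
values are arbitrary) sets a reporting period, starts the optimization, waits for it to finish, and then retrieves the
best performing tasks:

```python
# report progress to the optimizer task every 2 minutes (value is arbitrary)
optimizer.set_report_period(2.0)

# launch the optimization - base task clones are enqueued to the execution queue
optimizer.start()

# block until the optimization time limit is reached or all jobs complete
optimizer.wait()

# retrieve the best performing tasks found so far, then stop background threads
top_tasks = optimizer.get_top_experiments(top_k=3)
print([t.id for t in top_tasks])
optimizer.stop()
```
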
:::tip Remote Execution
You can also launch the optimizer controller through a queue by using [`Task.execute_remotely()`](../references/sdk/task.md#execute_remotely)
before starting the optimizer.
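
For instance (a minimal sketch; the queue name is a placeholder):

```python
# enqueue the optimizer controller itself so an agent runs the whole optimization
task.execute_remotely(queue_name='services', exit_process=True)
```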
:::
## Tutorial
Check out the [Hyperparameter Optimization tutorial](../guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md) for a step-by-step guide.

## SDK Reference
For detailed information, see the complete [HyperParameterOptimizer SDK reference page](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).