The [hyper_parameter_optimizer.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
example script demonstrates hyperparameter optimization, which is automated using **ClearML**.
BOHB performs robust and efficient hyperparameter optimization at scale by combining the speed of Hyperband searches
with the guidance and convergence guarantees of Bayesian Optimization.
**ClearML** implements BOHB for automation with HpBandSter's [bohb.py](https://github.com/automl/HpBandSter/blob/master/hpbandster/optimizers/bohb.py).
For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
documentation.
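
If HpBandSter is available, the BOHB wrapper can be imported directly (a minimal sketch; it assumes the `hpbandster` package is installed alongside `clearml`):

```python
# Assumes the hpbandster package is installed in addition to clearml
from clearml.automation.hpbandster import OptimizerBOHB
```

The example script below performs this import automatically, falling back to other strategies when the package is missing.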
The following search strategies can be used (see the import sketch after this list):
* Random uniform sampling of hyperparameters - [automation.optimization.RandomSearch](../../../references/sdk/hpo_optimization_randomsearch.md)
* Full grid sampling of every hyperparameter combination - [automation.optimization.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md)
* Custom - Use a custom class that inherits from the **ClearML** automation base strategy class, `automation.optimization.SearchStrategy`
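
As a quick illustration, the built-in strategy classes can be imported and handed over as-is (a minimal sketch; note that the class itself, not an instance, is what gets passed):

```python
from clearml.automation import GridSearch, RandomSearch

# Pass the strategy class itself; HyperParameterOptimizer instantiates it internally
optimizer_class = RandomSearch  # or GridSearch
```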
The chosen search strategy class will later be passed to the [automation.optimization.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object.
The example code attempts to import `OptimizerOptuna` for the search strategy. If the `optuna` package backing
`clearml.automation.optuna` is not installed, it attempts to import `OptimizerBOHB`. If the `hpbandster` package backing
`clearml.automation.hpbandster` is not installed either, it falls back to `RandomSearch` as the search strategy.
```python
import logging

# RandomSearch serves as the fallback strategy if neither Optuna nor HpBandSter is available
from clearml.automation import RandomSearch

aSearchStrategy = None

if not aSearchStrategy:
    try:
        from clearml.automation.optuna import OptimizerOptuna
        aSearchStrategy = OptimizerOptuna
    except ImportError as ex:
        pass

if not aSearchStrategy:
    try:
        from clearml.automation.hpbandster import OptimizerBOHB
        aSearchStrategy = OptimizerBOHB
    except ImportError as ex:
        pass

if not aSearchStrategy:
    logging.getLogger().warning(
        'Apologies, it seems you do not have \'optuna\' or \'hpbandster\' installed, '
        'we will be using RandomSearch strategy instead')
    aSearchStrategy = RandomSearch
```
Instantiate an [automation.optimization.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object, setting the optimization parameters, beginning with the ID of the experiment to optimize.
```python
an_optimizer = HyperParameterOptimizer(
    # This is the experiment we want to optimize
    base_task_id=args['template_task_id'],
```
Set the hyperparameter ranges to sample, instantiating them as **ClearML** automation objects using [automation.parameters.UniformIntegerParameterRange](../../../references/sdk/hpo_parameters_uniformintegerparameterrange.md)
and [automation.parameters.DiscreteParameterRange](../../../references/sdk/hpo_parameters_discreteparameterrange.md).
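
A sketch of what these ranges can look like (the hyperparameter names and values below are illustrative; they must match the parameters of the base experiment):

```python
    hyper_parameters=[
        UniformIntegerParameterRange('layer_1', min_value=128, max_value=512, step_size=128),
        UniformIntegerParameterRange('layer_2', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('batch_size', values=[96, 128, 160]),
        DiscreteParameterRange('epochs', values=[30]),
    ],
```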
Set the metric to optimize and the optimization objective.
```python
    objective_metric_title='val_acc',
    objective_metric_series='val_acc',
    objective_metric_sign='max',
```
Set the number of concurrent Tasks.
```python
    max_number_of_concurrent_tasks=2,
```
Set the optimization strategy (see [Set the search strategy for optimization](#set-the-search-strategy-for-optimization)).
```python
    optimizer_class=aSearchStrategy,
```
Specify the queue to use for remote execution. This is overridden if the optimizer runs as a service.
```python
    execution_queue='1xGPU',
```
Specify the remaining parameters, including the time limit per Task (minutes), the period for checking the optimization (minutes), the maximum number of jobs to launch, and the minimum and maximum number of iterations for each Task.
```python
    # Optional: Limit the execution time of a single experiment, in minutes.
    # (this is optional, and if using OptimizerBOHB, it is ignored)
    time_limit_per_job=10.,
    # Checking the experiments every 6 seconds is way too often; we should probably
    # set it to 5 min, assuming a single experiment usually takes hours...
    pool_period_min=0.1,
    # Set the maximum number of jobs to launch for the optimization, default (None) unlimited.
    # If OptimizerBOHB is used, it defines the maximum budget in terms of full jobs,
    # meaning the cumulative number of iterations will not exceed total_max_jobs * max_iteration_per_job
    total_max_jobs=10,
    # This is only applicable for OptimizerBOHB and is ignored by the rest.
    # Set the minimum number of iterations for an experiment, before early stopping.
    min_iteration_per_job=10,
    # Set the maximum number of iterations for an experiment to execute
    # (this is optional, unless using OptimizerBOHB, where it is required)
```
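
Once the object is fully instantiated, the same example script launches the optimization, waits for it to complete, and then stops the background process. A minimal sketch of that flow:

```python
# Start the optimization process (runs in the background)
an_optimizer.start()
# Block until the optimization completes
an_optimizer.wait()
# Make sure background optimization stopped
an_optimizer.stop()
```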