Add hyperparam optimization diagram + minor corrections (#22)
* add hpo diagram
* edit around hpo diagram
* small edits hpo, put tutorial under own heading
* fix incorrect wording
parent 94525b12c9 · commit a8802194b9

@@ -6,26 +6,39 @@ title: Hyperparameter Optimization

Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant
effect on the performance of the resulting machine learning models. Finding the hyperparameter values that yield the
best performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials
can be slow and tedious. Luckily, hyperparameter optimization can be automated and boosted using ClearML's
**`HyperParameterOptimizer`** class.

## ClearML's HyperParameter Optimization

ClearML provides the `HyperParameterOptimizer` class, which takes care of the entire optimization process for users
with a simple interface.

The `HyperParameterOptimizer` class does the following:
* Clones a base experiment that needs to be optimized.
* Changes arguments based on the specified optimizer strategy.
* Tries to minimize/maximize the defined objectives.

ClearML's approach to hyperparameter optimization is scalable, easy to set up and manage, and makes it easy to
compare results.

### Workflow

![Hyperparameter optimization diagram](../img/hpo_diagram.png)

The diagram above demonstrates the typical flow of hyperparameter optimization, in which the parameters of a base task
are optimized (a minimal code sketch of these steps follows the list):

1. Configure an Optimization Task with a base task whose parameters will be optimized, and a set of parameter values to
   test.
1. Clone the base task. Each clone's parameters are overridden with values from the optimization task.
1. Enqueue each clone for execution by a ClearML Agent.
1. The Optimization Task records and monitors the cloned tasks' configurations and execution details, and returns a
   summary of the optimization results.
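
The following is a minimal sketch, using only the ClearML `Task` API, of what the optimizer automates in the clone and
enqueue steps above. The template task ID, the parameter name `General/batch_size`, and the `default` queue are
placeholder assumptions:

```python
from clearml import Task

# placeholder ID of the base (template) task to optimize
TEMPLATE_TASK_ID = 'replace_with_base_task_id'

for batch_size in (32, 64, 128):
    # clone the base task and override one of its parameters
    cloned_task = Task.clone(source_task=TEMPLATE_TASK_ID,
                             name='clone batch_size={}'.format(batch_size))
    cloned_task.set_parameter('General/batch_size', batch_size)
    # enqueue the clone for execution by a ClearML Agent
    Task.enqueue(cloned_task, queue_name='default')
```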

![Optimization results summary chart](../img/fundamentals_hpo_summary.png)

### Supported Optimizers

The `HyperParameterOptimizer` class contains ClearML's hyperparameter optimization modules. Its modular design makes it
possible to plug in different optimizers, including existing software frameworks, for simple, accurate, and fast
hyperparameter optimization.

* **Optuna** - `automation.optuna.optuna.OptimizerOptuna`. Optuna is the default optimizer in ClearML. It makes use of
  different samplers, such as grid search, random search, Bayesian, and evolutionary algorithms.
  For more information, see the [Optuna](https://optuna.readthedocs.io/en/latest/)

@@ -34,29 +47,10 @@ optimization.

  at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
  For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
  documentation.
* **Random** uniform sampling of hyperparameters - `automation.optimization.RandomSearch`.
* **Full grid** sampling strategy of every hyperparameter combination - `automation.optimization.GridSearch`.
* **Custom** - `automation.optimization.SearchStrategy` - Use a custom class that inherits from the ClearML automation base strategy class.
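
The strategy is selected by passing one of these classes as the `optimizer_class` argument of
`HyperParameterOptimizer`. A short sketch of the imports involved, assuming here that Optuna and HpBandSter are
optional packages installed separately:

```python
# pass one of these classes as optimizer_class to select the search strategy,
# e.g. HyperParameterOptimizer(..., optimizer_class=RandomSearch)
from clearml.automation.optimization import GridSearch, RandomSearch  # built-in strategies
from clearml.automation.optuna import OptimizerOptuna      # assumes: pip install optuna
# from clearml.automation.hpbandster import OptimizerBOHB  # assumes: pip install hpbandster
```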

## Defining a Hyperparameter Optimization Search Example

@@ -68,7 +62,7 @@ Make use of **ClearML**'s hyperparameter optimization capabilities by:

   ```python
   from clearml.automation.optuna import OptimizerOptuna
   ```
1. Initialize the Task, which will be stored in ClearML Server when the code runs. After the code runs at least once,
   it can be reproduced, and the parameters can be tuned:
   ```python
   from clearml import Task
   ```
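
The rest of this step is cut off by the diff. Purely as an illustrative sketch, an optimizer Task is typically
initialized with `Task.init` and the `optimization` task type; the project and task names below are hypothetical:

```python
from clearml import Task

# hypothetical initialization of the optimizer (controller) task
task = Task.init(project_name='Hyperparameter Optimization',
                 task_name='Automatic HPO',
                 task_type=Task.TaskTypes.optimization,
                 reuse_last_task_id=False)
```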

@@ -82,7 +76,7 @@ Make use of **ClearML**'s hyperparameter optimization capabilities by:

1. Define the optimization configuration and resources budget:
   ```python
   optimizer = HyperParameterOptimizer(
       # specify the task to be optimized; it must already be in the system so it can be cloned
       base_task_id=TEMPLATE_TASK_ID,
       # set the hyperparameters to optimize
       hyper_parameters=[

@@ -111,6 +105,9 @@ Make use of **ClearML**'s hyperparameter optimization capabilities by:

   ```

<br/><br/>
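
Because the diff elides the middle of the configuration block above, here is a self-contained sketch of a complete
search definition. The parameter names (`General/batch_size`, `General/learning_rate`), the metric names, the queue
name, and the budget values are all assumptions that depend on the base task:

```python
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer, UniformIntegerParameterRange, UniformParameterRange)
from clearml.automation.optuna import OptimizerOptuna

# the controller task that records and monitors the optimization process
task = Task.init(project_name='examples',
                 task_name='HPO sketch',
                 task_type=Task.TaskTypes.optimization)

optimizer = HyperParameterOptimizer(
    base_task_id='replace_with_base_task_id',  # placeholder for the template task
    # parameter ranges to search over (names depend on the base task)
    hyper_parameters=[
        UniformIntegerParameterRange('General/batch_size', min_value=32, max_value=128, step_size=32),
        UniformParameterRange('General/learning_rate', min_value=1e-4, max_value=1e-1),
    ],
    objective_metric_title='validation',   # placeholder metric title
    objective_metric_series='accuracy',    # placeholder metric series
    objective_metric_sign='max',           # maximize the objective
    optimizer_class=OptimizerOptuna,
    execution_queue='default',             # queue served by a ClearML Agent
    # resources budget
    max_number_of_concurrent_tasks=2,
    total_max_jobs=10,
    time_limit_per_job=30.0,               # minutes
)

optimizer.start()   # begin cloning and enqueuing experiments
optimizer.wait()    # block until the optimization completes
optimizer.stop()    # make sure all background jobs are stopped
```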
## Tutorial
Check out the [Hyperparameter Optimization](../guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md) tutorial for a step-by-step guide.
For further information about the `HyperParameterOptimizer` arguments, see the [Automation module reference](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).

BIN docs/img/hpo_diagram.png (new file; binary not shown; 57 KiB)

@@ -14,7 +14,7 @@ module.exports = {

```js
{'MLOps': ['getting_started/mlops/mlops_first_steps','getting_started/mlops/mlops_second_steps','getting_started/mlops/mlops_best_practices']}]
}, 'getting_started/architecture']},
{'ClearML Fundamentals': ['fundamentals/task', 'fundamentals/hyperparameters', 'fundamentals/artifacts', 'fundamentals/logger', 'fundamentals/agents_and_queues',
'fundamentals/hpo', 'fundamentals/pipelines']},
'clearml_sdk',
'clearml_agent',
'clearml_data',
```