Edit references and small edits (#81)

pollfly 2021-10-06 15:50:00 +03:00 committed by GitHub
parent 834c04d739
commit 867bb471ec
3 changed files with 19 additions and 23 deletions

View File

@ -7,7 +7,7 @@ Hyperparameters are variables that directly control the behaviors of training al
the performance of the resulting machine learning models. Finding the hyperparameter values that yield the best
performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials can be
slow and tedious. Luckily, hyperparameter optimization can be automated and boosted using ClearML's
[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class.

## ClearML's HyperParameter Optimization
@ -50,17 +50,17 @@ The `HyperParameterOptimizer` class contains **ClearML**'s hyperparameter opti
using different optimizers, including existing software frameworks, enabling simple, accurate, and fast hyperparameter
optimization.

* **Optuna** - [`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md). Optuna is the default optimizer in ClearML. It makes use of
  different samplers such as grid search, random, Bayesian, and evolutionary algorithms.
  For more information, see the [Optuna](https://optuna.readthedocs.io/en/latest/) documentation.
* **BOHB** - [`automation.hpbandster.OptimizerBOHB`](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md). BOHB performs robust and efficient hyperparameter optimization
  at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
  For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
  documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
* **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
* **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - use a custom class that inherits from the ClearML automation base strategy class.
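The difference between the random and full-grid strategies above can be illustrated without ClearML at all. The sketch below is plain Python with made-up hyperparameter names (not a ClearML API): grid search enumerates every combination, while random search draws a fixed budget of independent samples.

```python
import itertools
import random

# Hypothetical search space; the parameter names are illustrative only
space = {
    "batch_size": [96, 128, 160],
    "epoch_div_factor": [2, 4, 8],
}

# Grid search: every combination, 3 x 3 = 9 trials
grid_trials = [
    dict(zip(space, values))
    for values in itertools.product(*space.values())
]

# Random search: a fixed budget of independently sampled trials
rng = random.Random(0)
random_trials = [
    {name: rng.choice(choices) for name, choices in space.items()}
    for _ in range(5)
]

print(len(grid_trials))  # 9
```

This is why grid search cost grows multiplicatively with each added hyperparameter, while random search cost is set directly by the trial budget.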
## Defining a Hyperparameter Optimization Search Example
@ -114,7 +114,7 @@ optimization.
    max_iteration_per_job=150000,
)
```

<br/>

For more information about `HyperParameterOptimizer` and supported optimization modules, see the [HyperParameterOptimizer class reference](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).

View File

@ -5,17 +5,15 @@ title: Hyperparameter Optimization
The [hyper_parameter_optimizer.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
example script demonstrates hyperparameter optimization, which is automated by using **ClearML**
## Set the Search Strategy for Optimization

A search strategy is required for the optimization, as well as a search strategy optimizer class to implement that strategy.

The following search strategies can be used:

* Optuna hyperparameter optimization - [automation.optuna.OptimizerOptuna](../../../references/sdk/hpo_optuna_optuna_optimizeroptuna.md).
  For more information about Optuna, see the [Optuna](https://optuna.org/) documentation.
* BOHB - [automation.hpbandster.OptimizerBOHB](../../../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md).
  BOHB performs robust and efficient hyperparameter optimization at scale by combining the speed of Hyperband searches
  with the guidance and guarantees of convergence of Bayesian Optimization.
@ -24,11 +22,11 @@ The following search strategies can be used:
  For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
  documentation.
* Random uniform sampling of hyperparameter strategy - [automation.RandomSearch](../../../references/sdk/hpo_optimization_randomsearch.md)
* Full grid sampling strategy of every hyperparameter combination - Grid search [automation.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md).
* Custom - Use a custom class and inherit from the **ClearML** automation base strategy class, automation.optimization.SearchStrategy.

The search strategy class that is chosen will be passed to the [automation.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object later.
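Note that what gets passed is the strategy *class*, not an instance; the optimizer constructs the strategy itself. That pattern can be sketched in plain Python (all names below are illustrative stand-ins, not the ClearML API):

```python
import random

class RandomStrategy:
    """Stand-in search strategy: samples each parameter uniformly."""
    def __init__(self, space):
        self.space = space

    def suggest(self):
        return {name: random.choice(values) for name, values in self.space.items()}

class Optimizer:
    """Stand-in optimizer: receives a strategy class and instantiates it itself."""
    def __init__(self, space, strategy_class=RandomStrategy):
        # the caller hands over a class; construction happens here
        self.strategy = strategy_class(space)

    def next_trial(self):
        return self.strategy.suggest()

opt = Optimizer({"layer_1": [128, 256, 512]})
trial = opt.next_trial()
```

Passing the class keeps the optimizer in charge of the strategy's lifecycle, which is why the import-fallback code below only needs to pick a class.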
The example code attempts to import `OptimizerOptuna` for the search strategy. If `clearml.automation.optuna` is not
@ -40,14 +38,14 @@ the `RandomSearch` for the search strategy.
if not aSearchStrategy:
    try:
        # prefer Optuna, if the clearml.automation.optuna module is installed
        from clearml.automation.optuna import OptimizerOptuna
        aSearchStrategy = OptimizerOptuna
    except ImportError as ex:
        pass

if not aSearchStrategy:
    try:
        # otherwise, fall back to BOHB (HpBandSter)
        from clearml.automation.hpbandster import OptimizerBOHB
        aSearchStrategy = OptimizerBOHB
    except ImportError as ex:
        pass
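The try/except cascade above is a general graceful-degradation pattern: walk a preference-ordered list of imports and settle for a default. A generic, standard-library-only version (the helper name `first_available` is my own, not ClearML's):

```python
import importlib

def first_available(candidates, default):
    """Return the first importable attribute from a preference-ordered
    list of (module_name, attribute_name) pairs, else `default`."""
    for module_name, attr in candidates:
        try:
            return getattr(importlib.import_module(module_name), attr)
        except (ImportError, AttributeError):
            continue
    return default

# Demo with standard-library stand-ins: the first module does not exist,
# so the second candidate is selected.
strategy = first_available(
    [("not_installed_module", "Missing"), ("json", "dumps")],
    default=repr,
)
```

Catching `AttributeError` as well as `ImportError` covers the case where the module imports but the expected attribute is missing.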
@ -104,7 +102,7 @@ In this example, an experiment named **Keras HP optimization base** is being opt
least once so that it is stored in **ClearML Server**, and, therefore, can be cloned.

Since the arguments dictionary is connected to the Task, after the code runs once, the `template_task_id` can be changed
to optimize a different experiment.
```python
# experiment template to optimize in the hyper-parameter optimization
@ -122,7 +120,7 @@ to optimize a different experiment, see [tuning experiments](../../../webapp/web
## Creating the Optimizer Object

Initialize an [automation.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object, setting the optimization parameters, beginning with the ID of the experiment to optimize.

```python
@ -131,8 +129,8 @@ object, setting the optimization parameters, beginning with the ID of the experi
    base_task_id=args['template_task_id'],
```

Set the hyperparameter ranges to sample, instantiating them as **ClearML** automation objects using [automation.UniformIntegerParameterRange](../../../references/sdk/hpo_parameters_uniformintegerparameterrange.md)
and [automation.DiscreteParameterRange](../../../references/sdk/hpo_parameters_discreteparameterrange.md).

```python
    hyper_parameters=[
@ -186,8 +184,6 @@ Specify the remaining parameters, including the time limit per Task (minutes), p
```

## Running as a Service

The optimization can run as a service if the `run_as_service` argument is set to `true`. For more information about

View File

@ -18,7 +18,7 @@ Artifacts can be uploaded and dynamically tracked, or uploaded without tracking.
<a name="configure_artifact_storage" class="tr_top_negative"></a>

Configure **ClearML** for uploading artifacts to any of the supported types of storage, which include local and shared folders,
S3 buckets, Google Cloud Storage, and Azure Storage ([debug sample storage](../../references/sdk/logger.md#set_default_upload_destination)
is different). Configure **ClearML** in any of the following ways:

* In the configuration file, set [default_output_uri](../../configs/clearml_conf.md#sdkdevelopment).