Mirror of https://github.com/clearml/clearml-docs
Synced 2025-02-07 13:21:46 +00:00

Edit references and small edits (#81)

This commit is contained in:
parent 834c04d739
commit 867bb471ec
@@ -7,7 +7,7 @@ Hyperparameters are variables that directly control the behaviors of training al
 the performance of the resulting machine learning models. Finding the hyperparameter values that yield the best
 performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials can be
 slow and tedious. Luckily, hyperparameter optimization can be automated and boosted using ClearML's
-**`HyperParameterOptimizer`** class.
+[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class.
 
 ## ClearML's HyperParameter Optimization
 
@@ -50,17 +50,17 @@ The `HyperParameterOptimizer` class contains **ClearML**’s hyperparameter opti
 using different optimizers, including existing software frameworks, enabling simple, accurate, and fast hyperparameter
 optimization.
 
-* **Optuna** - `automation.optuna.optuna.OptimizerOptuna`. Optuna is the default optimizer in ClearML. It makes use of
+* **Optuna** - [`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md). Optuna is the default optimizer in ClearML. It makes use of
   different samplers such as grid search, random, bayesian, and evolutionary algorithms.
   For more information, see the [Optuna](https://optuna.readthedocs.io/en/latest/)
   documentation.
-* **BOHB** - `automation.hpbandster.bandster.OptimizerBOHB`. BOHB performs robust and efficient hyperparameter optimization
+* **BOHB** - [`automation.hpbandster.OptimizerBOHB`](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md). BOHB performs robust and efficient hyperparameter optimization
   at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
   For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
   documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
-* **Random** uniform sampling of hyperparameters - `automation.optimization.RandomSearch`.
-* **Full grid** sampling strategy of every hyperparameter combination - `Grid search automation.optimization.GridSearch`.
-* **Custom** - `automation.optimization.SearchStrategy` - Use a custom class and inherit from the ClearML automation base strategy class
+* **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
+* **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
+* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class
 
 
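The contrast between the full-grid and random strategies listed above can be sketched in plain Python (the parameter names and values here are illustrative, not taken from ClearML):

```python
import itertools
import random

# Hypothetical search space; names and values are made up for illustration.
space = {
    "layer_1": [128, 256, 384, 512],
    "batch_size": [96, 128, 160],
}

def grid_search(space):
    """Full grid: every combination of every hyperparameter value."""
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

def random_search(space, trials, seed=0):
    """Random: uniform sampling of the same space, a fixed number of trials."""
    rng = random.Random(seed)
    for _ in range(trials):
        yield {k: rng.choice(v) for k, v in space.items()}

grid = list(grid_search(space))      # 4 * 3 = 12 combinations
sample = list(random_search(space, trials=5))
```

A full grid is exhaustive but grows multiplicatively with each added hyperparameter, which is why the sampling-based strategies scale better on large spaces.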
## Defining a Hyperparameter Optimization Search Example
@@ -114,7 +114,7 @@ optimization.
     max_iteration_per_job=150000,
 )
 ```
-<br/><br/>
+<br/>
 
 For more information about `HyperParameterOptimizer` and supported optimization modules, see the [HyperParameterOptimizer class reference](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).
 
@@ -5,17 +5,15 @@ title: Hyperparameter Optimization
 The [hyper_parameter_optimizer.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
 example script demonstrates hyperparameter optimization, which is automated by using **ClearML**
-
-<a class="tr_top_negative" name="strategy"></a>
 
 ## Set the Search Strategy for Optimization
 
 A search strategy is required for the optimization, as well as a search strategy optimizer class to implement that strategy.
 
 The following search strategies can be used:
 
-* Optuna hyperparameter optimization - [automation.optuna.optuna.OptimizerOptuna](../../../references/sdk/hpo_optuna_optuna_optimizeroptuna.md).
+* Optuna hyperparameter optimization - [automation.optuna.OptimizerOptuna](../../../references/sdk/hpo_optuna_optuna_optimizeroptuna.md).
   For more information about Optuna, see the [Optuna](https://optuna.org/) documentation.
-* BOHB - [automation.hpbandster.bandster.OptimizerBOHB](../../../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md).
+* BOHB - [automation.hpbandster.OptimizerBOHB](../../../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md).
 
 BOHB performs robust and efficient hyperparameter optimization at scale by combining the speed of Hyperband searches
 with the guidance and guarantees of convergence of Bayesian Optimization.
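The Hyperband speed-up that BOHB builds on comes from successive halving; a toy sketch of that loop follows (the candidate values and evaluation function are invented stand-ins for real training runs):

```python
def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """Evaluate all candidates on a small budget, keep the best 1/eta,
    and repeat with an eta-times larger budget until one remains."""
    budget = min_budget
    while len(configs) > 1:
        ranked = sorted(configs, key=lambda cfg: evaluate(cfg, budget), reverse=True)
        configs = ranked[: max(1, len(ranked) // eta)]
        budget *= eta
    return configs[0]

# Toy stand-in: candidate learning rates, scored as if 0.3 were optimal.
best = successive_halving(
    configs=[round(0.1 * i, 1) for i in range(1, 10)],
    evaluate=lambda lr, budget: -abs(lr - 0.3),
)
```

Because most candidates are discarded after only a cheap evaluation, the total budget concentrates on the promising ones; Bayesian optimization then guides which candidates enter the pool in the first place.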
@@ -24,11 +22,11 @@ The following search strategies can be used:
 For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
 documentation.
 
-* Random uniform sampling of hyperparameter strategy - [automation.optimization.RandomSearch](../../../references/sdk/hpo_optimization_randomsearch.md)
-* Full grid sampling strategy of every hyperparameter combination - Grid search [automation.optimization.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md).
+* Random uniform sampling of hyperparameter strategy - [automation.RandomSearch](../../../references/sdk/hpo_optimization_randomsearch.md)
+* Full grid sampling strategy of every hyperparameter combination - Grid search [automation.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md).
 * Custom - Use a custom class and inherit from the **ClearML** automation base strategy class, automation.optimization.SearchStrategy.
 
-The search strategy class that is chosen will be passed to the [automation.optimization.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
+The search strategy class that is chosen will be passed to the [automation.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
 object later.
 
 The example code attempts to import `OptimizerOptuna` for the search strategy. If `clearml.automation.optuna` is not
@@ -40,14 +38,14 @@ the `RandomSearch` for the search strategy.
 
 if not aSearchStrategy:
     try:
-        from clearml.automation.optuna import OptimizerOptuna
+        from clearml.optuna import OptimizerOptuna
         aSearchStrategy = OptimizerOptuna
     except ImportError as ex:
         pass
 
 if not aSearchStrategy:
     try:
-        from clearml.automation.hpbandster import OptimizerBOHB
+        from clearml.automation.hpbandster import OptimizerBOHB
         aSearchStrategy = OptimizerBOHB
     except ImportError as ex:
         pass
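The fallback chain in the example can be generalized; a small helper in the same spirit is sketched below (the helper itself is not part of ClearML, and the import paths follow the example's order of preference):

```python
import importlib

def pick_search_strategy(candidates, fallback=None):
    """Return the first class that imports cleanly from a list of
    (module_path, class_name) pairs, or `fallback` if none are available."""
    for module_path, class_name in candidates:
        try:
            module = importlib.import_module(module_path)
            return getattr(module, class_name)
        except ImportError:
            continue  # optional dependency not installed; try the next one
    return fallback

# Optuna first, then BOHB, then None if neither backend is installed.
aSearchStrategy = pick_search_strategy([
    ("clearml.automation.optuna", "OptimizerOptuna"),
    ("clearml.automation.hpbandster", "OptimizerBOHB"),
    ("clearml.automation.optimization", "RandomSearch"),
])
```

Keeping the optional backends behind `try`/`except ImportError` means the script still runs, with `RandomSearch`, on a machine where neither Optuna nor HpBandSter is installed.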
@@ -104,7 +102,7 @@ In this example, an experiment named **Keras HP optimization base** is being opt
 least once so that it is stored in **ClearML Server**, and, therefore, can be cloned.
 
 Since the arguments dictionary is connected to the Task, after the code runs once, the `template_task_id` can be changed
-to optimize a different experiment, see [tuning experiments](../../../webapp/webapp_exp_tuning.md).
+to optimize a different experiment.
 
 ```python
 # experiment template to optimize in the hyper-parameter optimization
@@ -122,7 +120,7 @@ to optimize a different experiment, see [tuning experiments](../../../webapp/web
 
 ## Creating the Optimizer Object
 
-Initialize an [automation.optimization.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
+Initialize an [automation.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
 object, setting the optimization parameters, beginning with the ID of the experiment to optimize.
 
 ```python
@@ -131,8 +129,8 @@ object, setting the optimization parameters, beginning with the ID of the experi
     base_task_id=args['template_task_id'],
 ```
 
-Set the hyperparameter ranges to sample, instantiating them as **ClearML** automation objects using [automation.parameters.UniformIntegerParameterRange](../../../references/sdk/hpo_parameters_uniformintegerparameterrange.md)
-and [automation.parameters.DiscreteParameterRange](../../../references/sdk/hpo_parameters_discreteparameterrange.md).
+Set the hyperparameter ranges to sample, instantiating them as **ClearML** automation objects using [automation.UniformIntegerParameterRange](../../../references/sdk/hpo_parameters_uniformintegerparameterrange.md)
+and [automation.DiscreteParameterRange](../../../references/sdk/hpo_parameters_discreteparameterrange.md).
 
 ```python
 hyper_parameters=[
@@ -186,8 +184,6 @@ Specify the remaining parameters, including the time limit per Task (minutes), p
 
 ```
 
-<a class="tr_top_negative" name="service"></a>
-
 ## Running as a Service
 
 The optimization can run as a service, if the `run_as_service` argument is set to `true`. For more information about
 
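The `run_as_service` branch can be sketched with a stub in place of the real optimizer (the method names `start`/`wait`/`stop` follow the example script; the stub exists only to trace the control flow):

```python
class StubOptimizer:
    """Stand-in for HyperParameterOptimizer that just records calls."""
    def __init__(self):
        self.calls = []
    def start(self):
        self.calls.append("start")
    def wait(self):
        self.calls.append("wait")
    def stop(self):
        self.calls.append("stop")

def run_optimization(optimizer, run_as_service):
    optimizer.start()
    if run_as_service:
        return  # leave the optimizer running remotely as a service
    optimizer.wait()  # block until the local optimization finishes
    optimizer.stop()

local = StubOptimizer()
run_optimization(local, run_as_service=False)
service = StubOptimizer()
run_optimization(service, run_as_service=True)
```

In service mode the script exits after launching, and the optimization Task keeps running and can be monitored from the ClearML web UI.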
@@ -18,7 +18,7 @@ Artifacts can be uploaded and dynamically tracked, or uploaded without tracking.
 <a name="configure_artifact_storage" class="tr_top_negative"></a>
 
 Configure **ClearML** for uploading artifacts to any of the supported types of storage, which include local and shared folders,
-S3 buckets, Google Cloud Storage, and Azure Storage ([debug sample storage](../../references/sdk/logger.md#set_default_upload_destination)
+S3 buckets, Google Cloud Storage, and Azure Storage ([debug sample storage](../../references/sdk/logger.md#set_default_upload_destination)
 is different). Configure **ClearML** in any of the following ways:
 
 * In the configuration file, set [default_output_uri](../../configs/clearml_conf.md#sdkdevelopment).
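As a concrete illustration of that configuration-file option, a `clearml.conf` fragment pointing uploads at an S3 bucket might look like this (the bucket path is a placeholder, and the key's exact location should be checked against the linked configuration reference):

```
sdk {
    development {
        # upload destination for artifacts and models
        default_output_uri: "s3://my-bucket/clearml"
    }
}
```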