mirror of
https://github.com/clearml/clearml-docs
synced 2025-06-26 18:17:44 +00:00
HPO refactor - remove HPO from fundamentals, add general overview with links to solutions, create hpo_sdk which goes over sdk class.
Note: SDK files aren't in ToC yet.
@@ -34,7 +34,7 @@ of the optimization results in table and graph forms.
 |`--objective-metric-sign`| Optimization target, whether to maximize or minimize the value of the objective metric specified. Possible values: "min", "max", "min_global", "max_global". For more information, see [Optimization Objective](#optimization-objective). |<img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
 |`--objective-metric-title`| Objective metric title to maximize/minimize (e.g. 'validation').|<img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
 |`--optimization-time-limit`|The maximum time (minutes) for the optimization to run. The default is `None`, indicating no time limit.|<img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" />|
-|`--optimizer-class`|The optimizer to use. Possible values are: OptimizerOptuna (default), OptimizerBOHB, GridSearch, RandomSearch. For more information, see [Supported Optimizers](../fundamentals/hpo.md#supported-optimizers). |<img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
+|`--optimizer-class`|The optimizer to use. Possible values are: OptimizerOptuna (default), OptimizerBOHB, GridSearch, RandomSearch. For more information, see [Supported Optimizers](../hpo.md#supported-optimizers). |<img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
 |`--params-search`|Parameters space for optimization. See more information in [Specifying the Parameter Space](#specifying-the-parameter-space). |<img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
 |`--params-override`|Additional parameters of the base task to override for this parameter search. Use the following JSON format for each parameter: `{"name": "param_name", "value": <new_value>}`. Windows users, see [JSON format note](#json_note).|<img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" />|
 |`--pool-period-min`|The time between two consecutive polls (minutes).|<img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" />|
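The JSON format that `--params-override` expects can be produced programmatically rather than hand-escaped, which is especially helpful on Windows where shell quote escaping is error-prone. A minimal sketch (the parameter names here are hypothetical, not part of any base task):

```python
import json

# Hypothetical base-task parameters to override before the search starts.
overrides = [
    {"name": "batch_size", "value": 32},
    {"name": "epochs", "value": 10},
]

# Each override is serialized as its own JSON object, matching the
# {"name": "param_name", "value": <new_value>} format expected by --params-override.
args = [json.dumps(o) for o in overrides]
print(args[0])  # {"name": "batch_size", "value": 32}
```

Letting `json.dumps` handle quoting avoids the manual escaping pitfalls noted in the JSON format note.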
@@ -11,7 +11,7 @@ line arguments, Python module dependencies, and a requirements.txt file!
 ## What Is ClearML Task For?
 * Launching off-the-shelf code on a remote machine with dedicated resources (e.g. GPU)
-* Running [hyperparameter optimization](../fundamentals/hpo.md) on a codebase that is still not in ClearML
+* Running [hyperparameter optimization](../hpo.md) on a codebase that is still not in ClearML
 * Creating a pipeline from an assortment of scripts that you need to turn into ClearML tasks
 * Running some code on a remote machine, either using an on-prem cluster or on the cloud
@@ -2,16 +2,8 @@
 title: Hyperparameter Optimization
 ---
 
-## What is Hyperparameter Optimization?
-Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant effect on
-the performance of the resulting machine learning models. Finding the hyperparameter values that yield the best
-performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials can be
-slow and tedious. Luckily, you can automate and boost hyperparameter optimization (HPO) with ClearML's
-[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class.
-
 ## ClearML's Hyperparameter Optimization
-ClearML provides the `HyperParameterOptimizer` class, which takes care of the entire optimization process for users
+You can automate and boost hyperparameter optimization (HPO) with ClearML's
+[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class, which takes care of the entire optimization process
 with a simple interface.
 
 ClearML's approach to hyperparameter optimization is scalable, easy to set up and to manage, and it makes it easy to
@@ -57,11 +49,11 @@ optimization.
 documentation.
 * **BOHB** - [`automation.hpbandster.OptimizerBOHB`](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md). BOHB performs robust and efficient hyperparameter optimization
 at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
 For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
 documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
 * **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
 * **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
 * **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
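To make the difference between the grid and random strategies concrete, here is a dependency-free sketch of how a full grid enumerates every combination while a random search draws a fixed budget of samples (the parameter names and values are illustrative only, not the ClearML parameter-range classes):

```python
import itertools
import random

# Illustrative discrete parameter space (stand-in for ClearML's range classes).
space = {
    "lr": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

# Full grid: every combination of every value -- 3 * 3 = 9 trials.
grid_trials = [dict(zip(space, combo)) for combo in itertools.product(*space.values())]

# Random search: a fixed budget of independently sampled trials.
rng = random.Random(42)  # seeded for reproducibility
random_trials = [{k: rng.choice(v) for k, v in space.items()} for _ in range(5)]

print(len(grid_trials))    # 9
print(len(random_trials))  # 5
```

The trade-off this illustrates: grid search cost grows multiplicatively with each added hyperparameter, while a random (or Bayesian) strategy keeps the trial budget fixed.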
 
 ## Defining a Hyperparameter Optimization Search Example
@@ -137,9 +129,9 @@ optimization.
 
 ## Optimizer Execution Options
-The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](agents_and_queues.md#what-is-a-queue).
+The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](../fundamentals/agents_and_queues.md#what-is-a-queue).
 Start a `HyperParameterOptimizer` instance using either [`HyperParameterOptimizer.start()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start)
 or [`HyperParameterOptimizer.start_locally()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start_locally).
 Both methods run the optimizer controller locally. `start()` launches the base task clones through a queue
 specified when instantiating the controller, while `start_locally()` runs the tasks locally.
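The distinction between the two launch modes can be pictured with a toy dispatcher: both build the same set of trial tasks, but one pushes them onto a queue for remote agents to pull while the other executes them in the current process. This is a conceptual sketch only, not the ClearML implementation; all names below are made up for illustration:

```python
from collections import deque


def make_trials():
    # Clones of a base task, each with a different hyperparameter value.
    return [{"lr": lr} for lr in (0.001, 0.01, 0.1)]


def run(trial):
    # Stand-in for an actual training run; returns a fake objective value.
    return {"lr": trial["lr"], "accuracy": 1.0 - trial["lr"]}


def start(queue: deque):
    """Enqueue trials for remote workers to pull (analogous to start())."""
    for trial in make_trials():
        queue.append(trial)


def start_locally():
    """Run every trial in the current process (analogous to start_locally())."""
    return [run(trial) for trial in make_trials()]


queue = deque()
start(queue)
print(len(queue))            # 3 trials waiting for workers
print(len(start_locally()))  # 3 results computed locally
```

In both modes the controller logic itself runs locally; only where the trial workloads execute differs.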
@@ -156,17 +148,3 @@ Check out the [Hyperparameter Optimization tutorial](../guides/optimization/hype
 ## SDK Reference
 
 For detailed information, see the complete [HyperParameterOptimizer SDK reference page](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).
-
-## CLI
-
-ClearML also provides `clearml-param-search`, a CLI utility for managing the hyperparameter optimization process. See
-[ClearML Param Search](../apps/clearml_param_search.md) for more information.
-
-## UI Application
-
-:::info Pro Plan Offering
-The ClearML HPO App is available under the ClearML Pro plan.
-:::
-
-ClearML provides the [Hyperparameter Optimization GUI application](../webapp/applications/apps_hpo.md) for launching and
-managing the hyperparameter optimization process.
@@ -17,7 +17,7 @@ from installing required packages to setting environment variables,
 all leading to executing the code (supporting both virtual environments and flexible Docker container configurations).
 
 The agent also supports overriding parameter values on-the-fly without code modification, thus enabling no-code experimentation (this is also the foundation on which
-ClearML [Hyperparameter Optimization](hpo.md) is implemented).
+ClearML [Hyperparameter Optimization](../hpo.md) is implemented).
 
 An agent can be associated with specific GPUs, enabling workload distribution. For example, on a machine with 8 GPUs you
 can allocate several GPUs to an agent and use the rest for a different workload, even through another agent (see [Dynamic GPU Allocation](../clearml_agent/clearml_agent_dynamic_gpus.md)).
@@ -6,7 +6,7 @@ Hyperparameters are a script's configuration options. Since hyperparameters can
 model performance, it is crucial to efficiently track and manage them.
 
 ClearML supports tracking and managing hyperparameters in each task and provides a dedicated [hyperparameter
-optimization module](hpo.md). With ClearML's logging and tracking capabilities, tasks can be reproduced, and their
+optimization module](../hpo.md). With ClearML's logging and tracking capabilities, tasks can be reproduced, and their
 hyperparameters and results can be saved and compared, which is key to understanding model behavior.
 
 ClearML lets you easily try out different hyperparameter values without changing your original code. ClearML's [execution
@@ -124,7 +124,7 @@ Available task types are:
 * *inference* - Model inference job (e.g. offline / batch model execution)
 * *controller* - A task that lays out the logic for other tasks' interactions, manual or automatic (e.g. a pipeline
 controller)
-* *optimizer* - A specific type of controller for optimization tasks (e.g. [hyperparameter optimization](hpo.md))
+* *optimizer* - A specific type of controller for optimization tasks (e.g. [hyperparameter optimization](../hpo.md))
 * *service* - Long-lasting or recurring service (e.g. server cleanup, auto ingress, sync services, etc.)
 * *monitor* - A specific type of service for monitoring
 * *application* - A task implementing custom applicative logic, like [autoscaler](../guides/services/aws_autoscaler.md)
@@ -181,7 +181,7 @@ or check these pages out:
 - Scale your work and deploy [ClearML Agents](../../clearml_agent.md)
 - Develop on remote machines with [ClearML Session](../../apps/clearml_session.md)
 - Structure your work and put it into [Pipelines](../../pipelines/pipelines.md)
-- Improve your experiments with [Hyperparameter Optimization](../../fundamentals/hpo.md)
+- Improve your experiments with [Hyperparameter Optimization](../../hpo.md)
 - Check out ClearML's integrations with your favorite ML frameworks like [TensorFlow](../../integrations/tensorflow.md),
 [PyTorch](../../integrations/pytorch.md), [Keras](../../integrations/keras.md),
 and more
@@ -112,7 +112,7 @@ alert you whenever your model improves in accuracy)
 - Automatically scale cloud instances according to your resource needs with ClearML's
 [AWS Autoscaler](../webapp/applications/apps_aws_autoscaler.md) and [GCP Autoscaler](../webapp/applications/apps_gcp_autoscaler.md)
 GUI applications
-- Run [hyperparameter optimization](../fundamentals/hpo.md)
+- Run [hyperparameter optimization](../hpo.md)
 - Build [pipelines](../pipelines/pipelines.md) from code
 - Much more!
34
docs/hpo.md
Normal file
@@ -0,0 +1,34 @@
+---
+title: Hyperparameter Optimization
+---
+
+## What is Hyperparameter Optimization?
+Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant effect on
+the performance of the resulting machine learning models. Hyperparameter optimization (HPO) is crucial for improving
+model performance and generalization.
+
+Finding the hyperparameter values that yield the best performing models can be complicated. Manually adjusting
+hyperparameters over the course of many training trials can be slow and tedious. Luckily, ClearML offers automated
+solutions to boost hyperparameter optimization efficiency.
+
+## Workflow
+
+
+
+The preceding diagram demonstrates the typical flow of hyperparameter optimization, where the parameters of a base task are optimized:
+
+1. Configure an Optimization Task with a base task whose parameters will be optimized, optimization targets, and a set of parameter values to
+test
+1. Clone the base task. Each clone's parameter is overridden with a value from the optimization task
+1. Enqueue each clone for execution by a ClearML Agent
+1. The Optimization Task records and monitors the cloned tasks' configuration and execution details, and returns a
+summary of the optimization results.
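The four workflow steps above can be sketched as a plain-Python loop: clone the base configuration, override one parameter per clone, "execute" each clone, and record the best result. This is a conceptual mock with a made-up objective in place of a real training run:

```python
import copy

# Step 1: base task configuration and the parameter values to test.
base_config = {"lr": 0.01, "batch_size": 32}
search_space = {"lr": [0.001, 0.01, 0.1]}

results = []
for value in search_space["lr"]:
    # Step 2: clone the base task and override the parameter under search.
    clone = copy.deepcopy(base_config)
    clone["lr"] = value
    # Step 3: in ClearML this clone would be enqueued for a ClearML Agent;
    # here a made-up objective stands in for the real training run.
    objective = -abs(value - 0.01)
    # Step 4: record the configuration and result for the summary.
    results.append((clone, objective))

best_config, best_objective = max(results, key=lambda r: r[1])
print(best_config["lr"])  # 0.01
```

The real implementations below (GUI, CLI, SDK) automate exactly this loop, adding queueing, monitoring, and early stopping on top of it.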
+## ClearML Solutions
+
+ClearML offers three solutions for hyperparameter optimization:
+* [GUI application](webapp/applications/apps_hpo.md): The Hyperparameter Optimization app allows you to run and manage the optimization tasks
+directly from the web interface, with no code necessary (available under the ClearML Pro plan).
+* [Command-Line Interface (CLI)](apps/clearml_param_search.md): The `clearml-param-search` CLI tool enables you to configure and launch the optimization process from your terminal.
+* [Python Interface](clearml_sdk/hpo_sdk.md): The `HyperParameterOptimizer` class within the ClearML SDK allows you to
+configure and launch optimization tasks, and seamlessly integrate them in your existing model training tasks.
@@ -117,5 +117,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -129,5 +129,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -118,5 +118,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -114,5 +114,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -2,7 +2,7 @@
 title: Optuna
 ---
 
-[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../fundamentals/hpo.md) framework,
+[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../hpo.md) framework,
 which makes use of different samplers such as grid search, random, Bayesian, and evolutionary algorithms. You can integrate
 Optuna into ClearML's automated hyperparameter optimization.
@@ -144,6 +144,6 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
 
@@ -131,5 +131,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -90,5 +90,5 @@ The ClearML Agent executing the task will use the new values to [override any ha
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -145,5 +145,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
 
 ## Hyperparameter Optimization
 Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
 for more information.
@@ -7,7 +7,7 @@ built in logger:
 * Track every YOLOv5 training run in ClearML
 * Version and easily access your custom training data with [ClearML Data](../clearml_data/clearml_data.md)
 * Remotely train and monitor your YOLOv5 training runs using [ClearML Agent](../clearml_agent.md)
-* Get the very best mAP using ClearML [Hyperparameter Optimization](../fundamentals/hpo.md)
+* Get the very best mAP using ClearML [Hyperparameter Optimization](../hpo.md)
 * Turn your newly trained YOLOv5 model into an API with just a few commands using [ClearML Serving](../clearml_serving/clearml_serving.md)
 
 ## Setup
@@ -134,7 +134,7 @@ module.exports = {
         ]}
       ],
     },
-    'webapp/applications/apps_hpo',
+    'hpo',
     {"Deploying Model Endpoints": [
       'webapp/applications/apps_embed_model_deployment',
       'webapp/applications/apps_model_deployment',
@@ -254,7 +254,6 @@ module.exports = {
     'fundamentals/artifacts',
     'fundamentals/models',
-    'fundamentals/logger',
-    'fundamentals/hpo'
+    'fundamentals/logger'
     ]},
     {
       type: 'category',