Add multi-objective optimization info (#844)

pollfly 2024-05-21 11:30:46 +03:00 committed by GitHub
parent 865e46e538
commit ce58c8e2d4
2 changed files with 90 additions and 59 deletions


@ -23,7 +23,7 @@ compare results.
The preceding diagram demonstrates the typical flow of hyperparameter optimization where the parameters of a base task are optimized:
1. Configure an Optimization Task with a base task whose parameters will be optimized, optimization targets, and a set of parameter values to test
1. Clone the base task. Each clone's parameter is overridden with a value from the optimization task
1. Enqueue each clone for execution by a ClearML Agent
@ -118,6 +118,19 @@ optimization.
in the task header.
:::
:::tip Multi-objective Optimization
If you are using the Optuna framework (see [Supported Optimizers](#supported-optimizers)), you can list multiple optimization objectives.
When doing so, make sure the `objective_metric_title`, `objective_metric_series`, and `objective_metric_sign` lists
are the same length. Each title will be matched to its respective series and sign.
For example, the code below sets two objectives: to minimize the `validation/loss` metric and to maximize the `validation/accuracy` metric.
```python
objective_metric_title=["validation", "validation"]
objective_metric_series=["loss", "accuracy"]
objective_metric_sign=["min", "max"]
```
:::
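
The index-by-index pairing described in the tip can be sketched in plain Python. This is a hypothetical illustration only (no ClearML API is called); the three keyword lists come from the example above, and the `"<title>/<series>"` naming mirrors how ClearML reports a metric's title and series:

```python
# Illustrative only: equal-length title/series/sign lists pair up index by
# index. Each (title, series, sign) triple defines one optimization objective.
objective_metric_title = ["validation", "validation"]
objective_metric_series = ["loss", "accuracy"]
objective_metric_sign = ["min", "max"]

objectives = [
    (f"{title}/{series}", sign)
    for title, series, sign in zip(
        objective_metric_title, objective_metric_series, objective_metric_sign
    )
]
# objectives == [("validation/loss", "min"), ("validation/accuracy", "max")]
```

If the lists were different lengths, `zip` would silently drop the unmatched entries, which is why the tip insists all three lists be the same length.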
## Optimizer Execution Options
The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](agents_and_queues.md#what-is-a-queue).


@ -114,7 +114,9 @@ if not args['template_task_id']:
## Creating the Optimizer Object
Initialize an [`automation.HyperParameterOptimizer`](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object, setting the following optimization parameters:
* ID of a ClearML task to optimize. This task will be cloned, and each clone will sample a different set of hyperparameter values:

```python
an_optimizer = HyperParameterOptimizer(
@ -122,8 +124,8 @@ an_optimizer = HyperParameterOptimizer(
    base_task_id=args['template_task_id'],
```
* Hyperparameter ranges to sample, instantiating them as ClearML automation objects using [`automation.UniformIntegerParameterRange`](../../../references/sdk/hpo_parameters_uniformintegerparameterrange.md)
and [`automation.DiscreteParameterRange`](../../../references/sdk/hpo_parameters_discreteparameterrange.md):

```python
    hyper_parameters=[
@ -134,7 +136,7 @@ and [`automation.DiscreteParameterRange`](../../../references/sdk/hpo_parameters
    ],
```
* Metric to optimize and the optimization objective:

```python
    objective_metric_title='val_acc',
@ -142,19 +144,35 @@ Set the metric to optimize and the optimization objective.
    objective_metric_sign='max',
```
:::tip Multi-objective Optimization
If you are using the Optuna framework (see [Set the Search Strategy for Optimization](#set-the-search-strategy-for-optimization)),
you can list multiple optimization objectives. When doing so, make sure the `objective_metric_title`,
`objective_metric_series`, and `objective_metric_sign` lists are
the same length. Each title will be matched to its respective series and sign.
For example, the code below sets two objectives: to minimize the `validation/loss` metric and to maximize the
`validation/accuracy` metric:
```python
objective_metric_title=["validation", "validation"]
objective_metric_series=["loss", "accuracy"]
objective_metric_sign=["min", "max"]
```
:::
* Number of concurrent Tasks:
```python
    max_number_of_concurrent_tasks=2,
```
* Optimization strategy (see [Set the search strategy for optimization](#set-the-search-strategy-for-optimization)):

```python
    optimizer_class=aSearchStrategy,
```
* Queue to use for remote execution. This is overridden if the optimizer runs as a service.

```python
    execution_queue='1xGPU',
```
* Remaining parameters, including the time limit per Task (minutes), period for checking the optimization (minutes),
maximum number of jobs to launch, and minimum and maximum number of iterations for each Task:

```python
    # Optional: Limit the execution time of a single experiment, in minutes.
    # (this is optional, and if using OptimizerBOHB, it is ignored)