diff --git a/docs/apps/clearml_param_search.md b/docs/apps/clearml_param_search.md
index b15aa9de..7781d1cf 100644
--- a/docs/apps/clearml_param_search.md
+++ b/docs/apps/clearml_param_search.md
@@ -34,7 +34,7 @@ of the optimization results in table and graph forms.
|`--objective-metric-sign`| Optimization target, whether to maximize or minimize the value of the objective metric specified. Possible values: "min", "max", "min_global", "max_global". For more information, see [Optimization Objective](#optimization-objective). |
|`--objective-metric-title`| Objective metric title to maximize/minimize (e.g. 'validation').|
|`--optimization-time-limit`|The maximum time (minutes) for the optimization to run. The default is `None`, indicating no time limit.|
-|`--optimizer-class`|The optimizer to use. Possible values are: OptimizerOptuna (default), OptimizerBOHB, GridSearch, RandomSearch. For more information, see [Supported Optimizers](../fundamentals/hpo.md#supported-optimizers). |
+|`--optimizer-class`|The optimizer to use. Possible values are: OptimizerOptuna (default), OptimizerBOHB, GridSearch, RandomSearch. For more information, see [Supported Optimizers](../hpo.md#supported-optimizers). |
|`--params-search`|Parameters space for optimization. See more information in [Specifying the Parameter Space](#specifying-the-parameter-space). |
|`--params-override`|Additional parameters of the base task to override for this parameter search. Use the following JSON format for each parameter: `{"name": "param_name", "value": }`. Windows users, see [JSON format note](#json_note).|
|`--pool-period-min`|The time between two consecutive polls (minutes).|
diff --git a/docs/apps/clearml_task.md b/docs/apps/clearml_task.md
index dd7fb019..f8349e67 100644
--- a/docs/apps/clearml_task.md
+++ b/docs/apps/clearml_task.md
@@ -11,7 +11,7 @@ line arguments, Python module dependencies, and a requirements.txt file!
## What Is ClearML Task For?
* Launching off-the-shelf code on a remote machine with dedicated resources (e.g. GPU)
-* Running [hyperparameter optimization](../fundamentals/hpo.md) on a codebase that is still not in ClearML
+* Running [hyperparameter optimization](../hpo.md) on a codebase that is still not in ClearML
* Creating a pipeline from an assortment of scripts that you need to turn into ClearML tasks
* Running some code on a remote machine, either using an on-prem cluster or on the cloud
diff --git a/docs/fundamentals/hpo.md b/docs/clearml_sdk/hpo_sdk.md
similarity index 80%
rename from docs/fundamentals/hpo.md
rename to docs/clearml_sdk/hpo_sdk.md
index d384cc15..e80debcd 100644
--- a/docs/fundamentals/hpo.md
+++ b/docs/clearml_sdk/hpo_sdk.md
@@ -2,16 +2,8 @@
title: Hyperparameter Optimization
---
-## What is Hyperparameter Optimization?
-Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant effect on
-the performance of the resulting machine learning models. Finding the hyperparameter values that yield the best
-performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials can be
-slow and tedious. Luckily, you can automate and boost hyperparameter optimization (HPO) with ClearML's
-[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class.
-
-## ClearML's Hyperparameter Optimization
-
-ClearML provides the `HyperParameterOptimizer` class, which takes care of the entire optimization process for users
+You can automate and boost hyperparameter optimization (HPO) with ClearML's
+[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class, which takes care of the entire optimization process
with a simple interface.
ClearML's approach to hyperparameter optimization is scalable, easy to set up and to manage, and it makes it easy to
@@ -57,11 +49,11 @@ optimization.
documentation.
* **BOHB** - [`automation.hpbandster.OptimizerBOHB`](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md). BOHB performs robust and efficient hyperparameter optimization
at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
- For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
+ For more information about HpBandSter BOHB, see the [HpBandSter](https://automl.github.io/HpBandSter/build/html/index.html)
documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
* **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
* **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
-* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
+* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
## Defining a Hyperparameter Optimization Search Example
@@ -137,9 +129,9 @@ optimization.
## Optimizer Execution Options
-The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](agents_and_queues.md#what-is-a-queue).
+The `HyperParameterOptimizer` provides options to launch the optimization tasks locally or through a ClearML [queue](../fundamentals/agents_and_queues.md#what-is-a-queue).
Start a `HyperParameterOptimizer` instance using either [`HyperParameterOptimizer.start()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start)
-or [`HyperParameterOptimizer.start_locally()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start_locally).
+or [`HyperParameterOptimizer.start_locally()`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md#start_locally).
Both methods run the optimizer controller locally. `start()` launches the base task clones through a queue
specified when instantiating the controller, while `start_locally()` runs the tasks locally.
@@ -156,17 +148,3 @@ Check out the [Hyperparameter Optimization tutorial](../guides/optimization/hype
## SDK Reference
For detailed information, see the complete [HyperParameterOptimizer SDK reference page](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).
-
-## CLI
-
-ClearML also provides `clearml-param-search`, a CLI utility for managing the hyperparameter optimization process. See
-[ClearML Param Search](../apps/clearml_param_search.md) for more information.
-
-## UI Application
-
-:::info Pro Plan Offering
-The ClearML HPO App is available under the ClearML Pro plan.
-:::
-
-ClearML provides the [Hyperparameter Optimization GUI application](../webapp/applications/apps_hpo.md) for launching and
-managing the hyperparameter optimization process.
diff --git a/docs/fundamentals/agents_and_queues.md b/docs/fundamentals/agents_and_queues.md
index 2c2e16e5..c939e632 100644
--- a/docs/fundamentals/agents_and_queues.md
+++ b/docs/fundamentals/agents_and_queues.md
@@ -17,7 +17,7 @@ from installing required packages to setting environment variables,
all leading to executing the code (supporting both virtual environment and flexible Docker container configurations).
The agent also supports overriding parameter values on-the-fly without code modification, thus enabling no-code experimentation (this is also the foundation on which
-ClearML [Hyperparameter Optimization](hpo.md) is implemented).
+ClearML [Hyperparameter Optimization](../hpo.md) is implemented).
An agent can be associated with specific GPUs, enabling workload distribution. For example, on a machine with 8 GPUs you
can allocate several GPUs to an agent and use the rest for a different workload, even through another agent (see [Dynamic GPU Allocation](../clearml_agent/clearml_agent_dynamic_gpus.md)).
diff --git a/docs/fundamentals/hyperparameters.md b/docs/fundamentals/hyperparameters.md
index f91da4a4..428011d2 100644
--- a/docs/fundamentals/hyperparameters.md
+++ b/docs/fundamentals/hyperparameters.md
@@ -6,7 +6,7 @@ Hyperparameters are a script's configuration options. Since hyperparameters can
model performance, it is crucial to efficiently track and manage them.
ClearML supports tracking and managing hyperparameters in each task and provides a dedicated [hyperparameter
-optimization module](hpo.md). With ClearML's logging and tracking capabilities, tasks can be reproduced, and their
+optimization module](../hpo.md). With ClearML's logging and tracking capabilities, tasks can be reproduced, and their
hyperparameters and results can be saved and compared, which is key to understanding model behavior.
ClearML lets you easily try out different hyperparameter values without changing your original code. ClearML's [execution
diff --git a/docs/fundamentals/task.md b/docs/fundamentals/task.md
index 5c8cf6ab..e5980aaa 100644
--- a/docs/fundamentals/task.md
+++ b/docs/fundamentals/task.md
@@ -124,7 +124,7 @@ Available task types are:
* *inference* - Model inference job (e.g. offline / batch model execution)
* *controller* - A task that lays out the logic for other tasks' interactions, manual or automatic (e.g. a pipeline
controller)
-* *optimizer* - A specific type of controller for optimization tasks (e.g. [hyperparameter optimization](hpo.md))
+* *optimizer* - A specific type of controller for optimization tasks (e.g. [hyperparameter optimization](../hpo.md))
* *service* - Long lasting or recurring service (e.g. server cleanup, auto ingress, sync services etc.)
* *monitor* - A specific type of service for monitoring
* *application* - A task implementing custom applicative logic, like [autoscaler](../guides/services/aws_autoscaler.md)
diff --git a/docs/getting_started/ds/ds_second_steps.md b/docs/getting_started/ds/ds_second_steps.md
index 21b1640d..d11b2983 100644
--- a/docs/getting_started/ds/ds_second_steps.md
+++ b/docs/getting_started/ds/ds_second_steps.md
@@ -181,7 +181,7 @@ or check these pages out:
- Scale your work and deploy [ClearML Agents](../../clearml_agent.md)
- Develop on remote machines with [ClearML Session](../../apps/clearml_session.md)
- Structure your work and put it into [Pipelines](../../pipelines/pipelines.md)
-- Improve your experiments with [Hyperparameter Optimization](../../fundamentals/hpo.md)
+- Improve your experiments with [Hyperparameter Optimization](../../hpo.md)
- Check out ClearML's integrations with your favorite ML frameworks like [TensorFlow](../../integrations/tensorflow.md),
[PyTorch](../../integrations/pytorch.md), [Keras](../../integrations/keras.md),
and more
diff --git a/docs/getting_started/main.md b/docs/getting_started/main.md
index 9384a879..25110ce1 100644
--- a/docs/getting_started/main.md
+++ b/docs/getting_started/main.md
@@ -112,7 +112,7 @@ alert you whenever your model improves in accuracy)
- Automatically scale cloud instances according to your resource needs with ClearML's
[AWS Autoscaler](../webapp/applications/apps_aws_autoscaler.md) and [GCP Autoscaler](../webapp/applications/apps_gcp_autoscaler.md)
GUI applications
-- Run [hyperparameter optimization](../fundamentals/hpo.md)
+- Run [hyperparameter optimization](../hpo.md)
- Build [pipelines](../pipelines/pipelines.md) from code
- Much more!
diff --git a/docs/hpo.md b/docs/hpo.md
new file mode 100644
index 00000000..5d648698
--- /dev/null
+++ b/docs/hpo.md
@@ -0,0 +1,34 @@
+---
+title: Hyperparameter Optimization
+---
+
+## What is Hyperparameter Optimization?
+Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant effect on
+the performance of the resulting machine learning models. Hyperparameter optimization (HPO) is crucial for improving
+model performance and generalization.
+
+Finding the hyperparameter values that yield the best performing models can be complicated. Manually adjusting
+hyperparameters over the course of many training trials can be slow and tedious. Luckily, ClearML offers automated
+solutions to boost hyperparameter optimization efficiency.
+
+## Workflow
+
+
+
+The preceding diagram demonstrates the typical flow of hyperparameter optimization where the parameters of a base task are optimized:
+
+1. Configure an Optimization Task with the base task whose parameters will be optimized, the optimization targets,
+   and the set of parameter values to test.
+1. Clone the base task. Each clone's parameters are overridden with values from the Optimization Task.
+1. Enqueue each clone for execution by a ClearML Agent.
+1. The Optimization Task records and monitors the cloned tasks' configuration and execution details, and returns a
+   summary of the optimization results.
+
+## ClearML Solutions
+
+ClearML offers three solutions for hyperparameter optimization:
+* [GUI application](webapp/applications/apps_hpo.md): The Hyperparameter Optimization app lets you run and manage optimization tasks
+  directly from the web interface, with no code necessary (available under the ClearML Pro plan).
+* [Command-Line Interface (CLI)](apps/clearml_param_search.md): The `clearml-param-search` CLI tool enables you to configure and launch the optimization process from your terminal.
+* [Python Interface](clearml_sdk/hpo_sdk.md): The `HyperParameterOptimizer` class within the ClearML SDK allows you to
+  configure and launch optimization tasks, and seamlessly integrate them into your existing model training tasks.
diff --git a/docs/integrations/catboost.md b/docs/integrations/catboost.md
index 50c41700..ed5bd5df 100644
--- a/docs/integrations/catboost.md
+++ b/docs/integrations/catboost.md
@@ -117,5 +117,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/keras.md b/docs/integrations/keras.md
index 52f6f487..88a9d182 100644
--- a/docs/integrations/keras.md
+++ b/docs/integrations/keras.md
@@ -129,5 +129,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/lightgbm.md b/docs/integrations/lightgbm.md
index cce9887e..ddba0057 100644
--- a/docs/integrations/lightgbm.md
+++ b/docs/integrations/lightgbm.md
@@ -118,5 +118,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/megengine.md b/docs/integrations/megengine.md
index 77cad702..05e2bed4 100644
--- a/docs/integrations/megengine.md
+++ b/docs/integrations/megengine.md
@@ -114,5 +114,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/optuna.md b/docs/integrations/optuna.md
index f660f78b..5b895ac4 100644
--- a/docs/integrations/optuna.md
+++ b/docs/integrations/optuna.md
@@ -2,7 +2,7 @@
title: Optuna
---
-[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../fundamentals/hpo.md) framework,
+[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../hpo.md) framework,
which makes use of different samplers such as grid search, random, Bayesian, and evolutionary algorithms. You can integrate
Optuna into ClearML's automated hyperparameter optimization.
diff --git a/docs/integrations/pytorch_lightning.md b/docs/integrations/pytorch_lightning.md
index d01f5cb2..476489bd 100644
--- a/docs/integrations/pytorch_lightning.md
+++ b/docs/integrations/pytorch_lightning.md
@@ -144,6 +144,6 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/tensorflow.md b/docs/integrations/tensorflow.md
index 72aa4a9b..867cbb17 100644
--- a/docs/integrations/tensorflow.md
+++ b/docs/integrations/tensorflow.md
@@ -131,5 +131,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/transformers.md b/docs/integrations/transformers.md
index 754fd07f..5bf1d27e 100644
--- a/docs/integrations/transformers.md
+++ b/docs/integrations/transformers.md
@@ -90,5 +90,5 @@ The ClearML Agent executing the task will use the new values to [override any ha
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/xgboost.md b/docs/integrations/xgboost.md
index 7f230f81..5a9ae5ad 100644
--- a/docs/integrations/xgboost.md
+++ b/docs/integrations/xgboost.md
@@ -145,5 +145,5 @@ task.execute_remotely(queue_name='default', exit_process=True)
## Hyperparameter Optimization
Use ClearML's [`HyperParameterOptimizer`](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class to find
-the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../fundamentals/hpo.md)
+the hyperparameter values that yield the best performing models. See [Hyperparameter Optimization](../hpo.md)
for more information.
diff --git a/docs/integrations/yolov5.md b/docs/integrations/yolov5.md
index 6690cf75..2629b791 100644
--- a/docs/integrations/yolov5.md
+++ b/docs/integrations/yolov5.md
@@ -7,7 +7,7 @@ built in logger:
* Track every YOLOv5 training run in ClearML
* Version and easily access your custom training data with [ClearML Data](../clearml_data/clearml_data.md)
* Remotely train and monitor your YOLOv5 training runs using [ClearML Agent](../clearml_agent.md)
-* Get the very best mAP using ClearML [Hyperparameter Optimization](../fundamentals/hpo.md)
+* Get the very best mAP using ClearML [Hyperparameter Optimization](../hpo.md)
* Turn your newly trained YOLOv5 model into an API with just a few commands using [ClearML Serving](../clearml_serving/clearml_serving.md)
## Setup
diff --git a/sidebars.js b/sidebars.js
index 2656a4f8..54a400ae 100644
--- a/sidebars.js
+++ b/sidebars.js
@@ -134,7 +134,7 @@ module.exports = {
]}
],
},
- 'webapp/applications/apps_hpo',
+ 'hpo',
{"Deploying Model Endpoints": [
'webapp/applications/apps_embed_model_deployment',
'webapp/applications/apps_model_deployment',
@@ -254,7 +254,6 @@ module.exports = {
'fundamentals/artifacts',
'fundamentals/models',
'fundamentals/logger',
- 'fundamentals/hpo'
]},
{
type: 'category',