Read the transcript
-Hello and welcome to ClearML. In this video we’ll take a look at one cool way of using the agent other than rerunning a task remotely: hyperparameter optimization.
+Hello and welcome to ClearML. In this video we’ll take a look at one cool way of using the agent other than rerunning a task remotely: hyperparameter optimization (HPO).
By now, we know that ClearML can easily capture our hyperparameters and scalars as part of experiment tracking. We also know we can clone any task and change its hyperparameters, so they'll be injected into the original code at runtime. In the last video, we learned how to make a remote machine execute this task automatically by using the agent.
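The injection step described above can be illustrated with a plain-Python sketch. This is not ClearML's actual internals, just the idea: defaults defined in the script are merged with the edits made on a cloned task, and the merged values are what the training code sees at runtime.

```python
# Toy illustration of hyperparameter injection; not ClearML's real implementation.

def inject_overrides(defaults, overrides):
    """Return the hyperparameters the code will actually run with:
    the script's defaults, updated with any values edited on the cloned task."""
    merged = dict(defaults)   # start from the values hard-coded in the script
    merged.update(overrides)  # cloned-task edits win over the defaults
    return merged

# Defaults captured from the original experiment (illustrative values).
defaults = {"learning_rate": 0.01, "batch_size": 32, "epochs": 10}

# Edits made in the UI on a cloned copy of the task.
overrides = {"learning_rate": 0.001}

params = inject_overrides(defaults, overrides)
```

Untouched keys keep their defaults, which is why only the parameters you actually change on the clone affect the rerun.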
diff --git a/docs/guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md b/docs/guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md
index 03fbb5e2..4d91c111 100644
--- a/docs/guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md
+++ b/docs/guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md
@@ -4,7 +4,7 @@ title: Image Hyperparameter Optimization - Jupyter Notebook
[hyperparameter_search.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/hyperparameter_search.ipynb)
demonstrates using ClearML's [HyperParameterOptimizer](../../../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
-class to perform automated hyperparameter optimization.
+class to perform automated hyperparameter optimization (HPO).
The code creates a HyperParameterOptimizer object, which is a search controller. The search controller uses the
[Optuna](../../../../../references/sdk/hpo_optuna_optuna_optimizeroptuna.md) search strategy optimizer.
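As a rough sketch of what a search controller does, the loop below implements a generic random search: it repeatedly samples a point from the parameter space, evaluates a trial, and keeps the best objective seen. This is a minimal stand-in for illustration only, not the `HyperParameterOptimizer` or Optuna API; the space and objective are hypothetical.

```python
import random

def random_search(space, objective, num_trials, seed=0):
    """Sample `num_trials` configurations from `space` (name -> (low, high))
    and return the best (params, score) pair, minimizing `objective`."""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(num_trials):
        # Sample one candidate configuration from the search space.
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = objective(params)  # in ClearML this would be a launched trial task
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical objective: pretend validation loss is minimized at lr = 0.1.
space = {"learning_rate": (0.001, 1.0)}
best, loss = random_search(space, lambda p: (p["learning_rate"] - 0.1) ** 2, 50)
```

A real search strategy such as Optuna improves on this by using results of earlier trials to choose the next sample, rather than sampling blindly.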
diff --git a/docs/guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md b/docs/guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md
index 7e269201..85856044 100644
--- a/docs/guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md
+++ b/docs/guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md
@@ -3,7 +3,7 @@ title: Hyperparameter Optimization
---
The [hyper_parameter_optimizer.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
-example script demonstrates hyperparameter optimization, which is automated by using ClearML
+example script demonstrates hyperparameter optimization (HPO), which is automated using ClearML.
## Set the Search Strategy for Optimization
diff --git a/docs/integrations/storage.md b/docs/integrations/storage.md
index 14bb6922..e3e6ee39 100644
--- a/docs/integrations/storage.md
+++ b/docs/integrations/storage.md
@@ -22,9 +22,9 @@ The ClearML configuration file uses [HOCON](https://github.com/lightbend/config/
### Configuring AWS S3
-Modify these parts of the clearml.conf file and add the key, secret, and region of the s3 bucket.
+Modify these parts of the clearml.conf file and add the key, secret, and region of the S3 bucket.
-It's possible to also give access to specific s3 buckets in the `aws.s3.credentials` section. The default configuration
+It's possible to also give access to specific S3 buckets in the `aws.s3.credentials` section. The default configuration
provided in the `aws.s3` section is applied to any bucket without a bucket-specific configuration.
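For instance, the relevant section of `clearml.conf` might look like the following sketch (all keys, bucket names, and credentials here are placeholder values):

```
aws {
    s3 {
        # Default credentials, applied to any bucket not listed below
        key: "<default-access-key>"
        secret: "<default-secret>"
        region: "<default-region>"

        credentials: [
            # Bucket-specific credentials override the defaults above
            {
                bucket: "my-example-bucket"
                key: "<bucket-access-key>"
                secret: "<bucket-secret>"
            }
        ]
    }
}
```

Buckets without an entry in `credentials` fall back to the default `key`/`secret`/`region` values.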
You can also enable using a credentials chain to let Boto3