Small edits (#435)

pollfly 2023-01-12 16:57:08 +02:00 committed by GitHub
parent 0934530a3a
commit 21d9c7e29b
10 changed files with 12 additions and 12 deletions


@@ -2,7 +2,7 @@
title: ClearML Param Search
---
-Use the `clearml-param-search` CLI tool to launch ClearML's automated hyperparameter optimization. This process finds
+Use the `clearml-param-search` CLI tool to launch ClearML's automated hyperparameter optimization (HPO). This process finds
the optimal values for your experiments' hyperparameters that yield the best performing models.
## How Does `clearml-param-search` Work?


@@ -68,7 +68,7 @@ clearml-data add [-h] [--id ID] [--dataset-folder DATASET_FOLDER]
|`--id` | Dataset's ID. Default: previously created / accessed dataset| <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
|`--files`| Files / folders to add. Items will be uploaded to the dataset's designated storage. | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
|`--wildcard`| Add specific set of files, denoted by these wildcards. For example: `~/data/*.jpg ~/data/json`. Multiple wildcards can be passed. | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
-|`--links`| Files / folders link to add. Supports s3, gs, azure links. Example: `s3://bucket/data` `azure://bucket/folder`. Items remain in their original location. | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
+|`--links`| Files / folders link to add. Supports S3, GS, Azure links. Example: `s3://bucket/data` `azure://bucket/folder`. Items remain in their original location. | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
|`--dataset-folder` | Dataset base folder to add the files to in the dataset. Default: dataset root| <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
|`--non-recursive` | Disable recursive scan of files | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
|`--verbose` | Verbose reporting | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" />|
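As an illustrative sketch of the flags above, a typical `clearml-data add` invocation might combine local files and remote links (the dataset ID and bucket name below are placeholders, not real resources):

```bash
# Add local files plus remote S3 links to a dataset.
# The --id value and bucket name are hypothetical placeholders.
clearml-data add \
    --id 1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e \
    --files ./images ./annotations.json \
    --links s3://example-bucket/raw_data \
    --dataset-folder data/v1 \
    --verbose
```

Files passed via `--files` are uploaded to the dataset's storage, while `--links` items remain in place and are only referenced.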


@@ -52,7 +52,7 @@ See the [Logger SDK reference page](../references/sdk/logger.md).
### Hyperparameter Optimization
-ClearML's `optimization` module includes classes that support hyperparameter optimization:
+ClearML's `optimization` module includes classes that support hyperparameter optimization (HPO):
* [HyperParameterOptimizer](../references/sdk/automation_controller_pipelinecontroller.md) - Hyperparameter search
controller class
* Optimization search strategy classes including [Optuna](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md), [HpBandSter](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md),
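For orientation, a minimal Python sketch of wiring these classes together might look like the following. This is a non-runnable fragment: the base task ID, metric names, and parameter names are placeholders, and it assumes a reachable ClearML server with an existing base task.

```python
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    UniformParameterRange,
)
from clearml.automation.optuna import OptimizerOptuna  # Optuna search strategy

# Placeholder task ID and metric/parameter names -- replace with your own.
optimizer = HyperParameterOptimizer(
    base_task_id="<base_task_id>",
    hyper_parameters=[
        UniformParameterRange("General/learning_rate", min_value=1e-4, max_value=1e-1),
        DiscreteParameterRange("General/batch_size", values=[32, 64, 128]),
    ],
    objective_metric_title="validation",
    objective_metric_series="accuracy",
    objective_metric_sign="max",
    optimizer_class=OptimizerOptuna,
    total_max_jobs=20,
)
optimizer.start_locally()  # or start() to enqueue trials for ClearML Agents
optimizer.wait()
optimizer.stop()
```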


@@ -6,7 +6,7 @@ title: Hyperparameter Optimization
Hyperparameters are variables that directly control the behaviors of training algorithms, and have a significant effect on
the performance of the resulting machine learning models. Finding the hyperparameter values that yield the best
performing models can be complicated. Manually adjusting hyperparameters over the course of many training trials can be
-slow and tedious. Luckily, you can automate and boost hyperparameter optimization with ClearML's
+slow and tedious. Luckily, you can automate and boost hyperparameter optimization (HPO) with ClearML's
[**`HyperParameterOptimizer`**](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class.
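Conceptually, the search controller automates a loop like the following plain-Python random search. This is a toy sketch, not ClearML code; the `objective` function here is a stand-in for a full train-and-evaluate cycle.

```python
import random

def objective(lr, batch_size):
    # Stand-in for training a model and returning its validation loss.
    return (lr - 0.01) ** 2 + 0.001 * abs(batch_size - 64)

random.seed(0)
best = None
for _ in range(20):
    # Sample one candidate configuration from the search space.
    lr = 10 ** random.uniform(-4, -1)          # log-uniform learning rate
    batch_size = random.choice([16, 32, 64, 128])
    loss = objective(lr, batch_size)
    if best is None or loss < best[0]:
        best = (loss, lr, batch_size)

print(f"best loss={best[0]:.6f}, lr={best[1]:.5f}, batch_size={best[2]}")
```

Search strategies such as Optuna improve on this by choosing the next candidate based on previous trials, rather than sampling blindly.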
## ClearML's Hyperparameter Optimization


@@ -178,7 +178,7 @@ or check these pages out:
- Scale your work and deploy [ClearML Agents](../../clearml_agent.md)
- Develop on remote machines with [ClearML Session](../../apps/clearml_session.md)
- Structure your work and put it into [Pipelines](../../pipelines/pipelines.md)
-- Improve your experiments with [HyperParameter Optimization](../../fundamentals/hpo.md)
+- Improve your experiments with [Hyperparameter Optimization](../../fundamentals/hpo.md)
- Check out ClearML's integrations to [external libraries](../../integrations/libraries.md).
## YouTube Playlist


@@ -16,7 +16,7 @@ The sections below describe the following scenarios:
## Building Tasks
### Dataset Creation
-Let's assume we have some code that extracts data from a production Database into a local folder.
+Let's assume we have some code that extracts data from a production database into a local folder.
Our goal is to create an immutable copy of the data to be used by further steps:
```bash
@@ -24,7 +24,7 @@ clearml-data create --project data --name dataset
clearml-data sync --folder ./from_production
```
-We could also add a Tag `latest` to the Dataset, marking it as the latest version.
+We could also add a tag `latest` to the Dataset, marking it as the latest version.
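For example, assuming the `clearml-data create` command accepts a `--tags` flag (check the CLI reference for your version), the tag could be attached at creation time:

```bash
# Create the dataset and tag it as the latest version in one step
clearml-data create --project data --name dataset --tags latest
```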
### Preprocessing Data
The second step is to preprocess the data. First we need to access it, then we want to modify it,


@@ -20,7 +20,7 @@ keywords: [mlops, components, hyperparameter optimization, hyperparameter]
<details className="cml-expansion-panel info">
<summary className="cml-expansion-panel-summary">Read the transcript</summary>
<div className="cml-expansion-panel-content">
-Hello and welcome to ClearML. In this video we'll take a look at one cool way of using the agent other than rerunning a task remotely: hyperparameter optimization.
+Hello and welcome to ClearML. In this video we'll take a look at one cool way of using the agent other than rerunning a task remotely: hyperparameter optimization (HPO).
By now, we know that ClearML can easily capture our hyperparameters and scalars as part of the experiment tracking. We also know we can clone any task and change its hyperparameters, so they'll be injected into the original code at runtime. In the last video, we learnt how to make a remote machine execute this task automatically by using the agent.


@@ -4,7 +4,7 @@ title: Image Hyperparameter Optimization - Jupyter Notebook
[hyperparameter_search.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/hyperparameter_search.ipynb)
demonstrates using ClearML's [HyperParameterOptimizer](../../../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
-class to perform automated hyperparameter optimization.
+class to perform automated hyperparameter optimization (HPO).
The code creates a HyperParameterOptimizer object, which is a search controller. The search controller uses the
[Optuna](../../../../../references/sdk/hpo_optuna_optuna_optimizeroptuna.md) search strategy optimizer.


@@ -3,7 +3,7 @@ title: Hyperparameter Optimization
---
The [hyper_parameter_optimizer.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
-example script demonstrates hyperparameter optimization, which is automated by using ClearML
+example script demonstrates hyperparameter optimization (HPO), which is automated by using ClearML.
## Set the Search Strategy for Optimization


@@ -22,9 +22,9 @@ The ClearML configuration file uses [HOCON](https://github.com/lightbend/config/
### Configuring AWS S3
-Modify these parts of the clearml.conf file and add the key, secret, and region of the s3 bucket.
+Modify these parts of the clearml.conf file and add the key, secret, and region of the S3 bucket.
-It's possible to also give access to specific s3 buckets in the `aws.s3.credentials` section. The default configuration
+It's possible to also give access to specific S3 buckets in the `aws.s3.credentials` section. The default configuration
provided in the `aws.s3` section is applied to any bucket without a bucket-specific configuration.
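A sketch of the relevant clearml.conf fragment might look like the following (the keys, region, and bucket name are placeholders to replace with your own values):

```
aws {
    s3 {
        # Default credentials, applied to any bucket without its own entry
        key: "<default-access-key>"
        secret: "<default-secret-key>"
        region: "<region>"
        credentials: [
            {
                # Bucket-specific override
                bucket: "<my-bucket>"
                key: "<bucket-access-key>"
                secret: "<bucket-secret-key>"
            }
        ]
    }
}
```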
You can also enable using a credentials chain to let Boto3