Small edits (#674)

pollfly 2023-09-21 13:52:36 +03:00 committed by GitHub
parent c8dcf50796
commit 7f0b4ce7d9
10 changed files with 11 additions and 11 deletions


@@ -147,7 +147,7 @@ The Task must be connected to a git repository, since currently single script de
:::
1. In the **ClearML web UI**, find the experiment (Task) that needs debugging.
-1. Click on the ID button next to the Task name, and copy the unique ID.
+1. Click the `ID` button next to the Task name, and copy the unique ID.
1. Enter the following command: `clearml-session --debugging-session <experiment_id_here>`
1. Click on the JupyterLab / VS Code link, or connect directly to the SSH session.
1. In JupyterLab / VS Code, access the experiment's repository in the `environment/task_repository` folder.


@@ -720,7 +720,7 @@ You must use a secure protocol with ``api.web_server``, ``api.files_server``, an
**`api.http.default_method`** (*string*)
-* Set the request method for all API requests and auth login. This could be useful when `GET` requests with payloads are
+* Set the request method for all API requests and auth login. This can be useful when `GET` requests with payloads are
blocked by a server, and `POST` requests can be used instead. The request options are: "GET", "POST", "PUT".
:::caution


@@ -47,7 +47,7 @@ Overrides Repository Auto-logging
|**CLEARML_API_ACCESS_KEY** | Sets the Server's Public Access Key|
|**CLEARML_API_SECRET_KEY** | Sets the Server's Private Access Key|
|**CLEARML_API_HOST_VERIFY_CERT** | Enables / Disables server certificate verification (if behind a firewall)|
-|**CLEARML_API_DEFAULT_REQ_METHOD**| *Experimental - this option has not been vigorously tested.* Set the request method for all API requests and auth login. This could be useful when GET requests with payloads are blocked by a server, so POST/PUT requests can be used instead. |
+|**CLEARML_API_DEFAULT_REQ_METHOD**| *Experimental - this option has not been vigorously tested.* Set the request method for all API requests and auth login. This can be useful when GET requests with payloads are blocked by a server, so POST/PUT requests can be used instead. |
|**CLEARML_OFFLINE_MODE** | Sets Offline mode|
|**CLEARML_NO_DEFAULT_SERVER** | Disables sending information to demo server when no HOST server is set|
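
These variables are read from the environment, so they can be exported in the launching shell or, as a minimal sketch, set from Python before the first ClearML call (the project and task names below are placeholders):

```python
import os

# Placeholder value - set before the first ClearML call so the configuration picks it up
os.environ["CLEARML_API_DEFAULT_REQ_METHOD"] = "PUT"  # experimental: use PUT instead of GET for API requests

from clearml import Task

task = Task.init(project_name="examples", task_name="env-var demo")
```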


@@ -111,6 +111,6 @@ pipe.add_step(
)
```
-We could also pass the parameters from one step to the other (for example `Task.id`).
+We can also pass the parameters from one step to the other (for example `Task.id`).
In addition to pipelines made up of Task steps, ClearML also supports pipelines consisting of function steps. See more in the
full pipeline documentation [here](../../pipelines/pipelines.md).
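
As an illustration, a minimal controller with two Task steps might look like the sketch below (the project and task names are placeholders, and the base tasks are assumed to already exist); the `"${stage_data.id}"` reference resolves to the first step's Task ID at runtime:

```python
from clearml.automation import PipelineController

# Placeholder project/task names - the referenced base tasks are assumed to already exist
pipe = PipelineController(name="pipeline demo", project="examples", version="1.0.0")

pipe.add_step(
    name="stage_data",
    base_task_project="examples",
    base_task_name="dataset creation",
)
pipe.add_step(
    name="stage_train",
    parents=["stage_data"],
    base_task_project="examples",
    base_task_name="model training",
    # pass the data step's Task ID into the training step's parameters
    parameter_override={"General/dataset_task_id": "${stage_data.id}"},
)

pipe.start()
```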


@@ -82,7 +82,7 @@ of a dataset card to open its context menu and access dataset actions:
## Create New Hyper-Datasets
-To create a new Hyper-Dataset, click the **+ NEW DATASET** button in the top right of the page, which will open a
+To create a Hyper-Dataset, click the **+ NEW DATASET** button in the top right of the page, which will open a
**New Dataset** modal.
![Hyper-Dataset creation modal](../../img/webapp_hyperdataset_creation.png)


@@ -10,7 +10,7 @@ The [HyperParameterOptimizer](../references/sdk/hpo_optimization_hyperparametero
hyperparameter optimization modules. Its modular design enables using different optimizers, including existing software
frameworks, like Optuna, enabling simple,
accurate, and fast hyperparameter optimization. The Optuna ([`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md))
-optimizer allows you to simultaneously optimize many hyperparameters efficiently by relying on early stopping (pruning)
+optimizer lets you simultaneously optimize many hyperparameters efficiently by relying on early stopping (pruning)
and smart resource allocation.
To use Optuna in ClearML's hyperparameter optimization, you must first install it. When you instantiate `HyperParameterOptimizer`,
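
A rough sketch of such an optimizer, assuming a placeholder base task ID and placeholder hyperparameter and metric names:

```python
from clearml.automation import (
    DiscreteParameterRange,
    HyperParameterOptimizer,
    UniformIntegerParameterRange,
)
from clearml.automation.optuna import OptimizerOptuna

# Placeholder base task ID, hyperparameter names, and metric names
optimizer = HyperParameterOptimizer(
    base_task_id="<base_task_id>",
    hyper_parameters=[
        UniformIntegerParameterRange("General/epochs", min_value=5, max_value=30, step_size=5),
        DiscreteParameterRange("General/batch_size", values=[32, 64, 128]),
    ],
    objective_metric_title="validation",
    objective_metric_series="accuracy",
    objective_metric_sign="max",
    optimizer_class=OptimizerOptuna,  # Optuna handles sampling and early stopping (pruning)
    max_number_of_concurrent_tasks=2,
    total_max_jobs=20,
)

optimizer.start()
optimizer.wait()
optimizer.stop()
```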


@@ -121,7 +121,7 @@ clearml-data sync --project YOLOv5 --name coco128 --folder .
This command syncs the folder's content with ClearML, packaging all of the folder's contents into a ClearML dataset.
-Alternatively, you could run these commands one after the other to create a dataset:
+Alternatively, you can run these commands one after the other to create a dataset:
```commandline
# Optionally add --parent <parent_dataset_id> if you want to base
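
The same flow is also available from the Python SDK; a rough equivalent of the step-by-step CLI commands, reusing the project and dataset names from the `sync` example above, might be:

```python
from clearml import Dataset

# Same project/dataset names as the clearml-data sync example above;
# pass parent_datasets=["<parent_dataset_id>"] to base this version on an existing dataset
dataset = Dataset.create(dataset_name="coco128", dataset_project="YOLOv5")
dataset.add_files(path=".")  # stage the folder's contents
dataset.upload()             # upload the files to the configured storage
dataset.finalize()           # close this dataset version
```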


@@ -101,8 +101,8 @@ When you rerun the pipeline through the ClearML WebApp, the pipeline is construc
code.
To change this behavior, pass `always_create_from_code=False` when instantiating a `PipelineController`. In this case,
-when rerun, the pipeline DAG will be generated from the pipeline configuration stored in the pipeline task. This allows
-you to modify the pipeline configuration via the UI, without changing the original codebase.
+when rerun, the pipeline DAG will be generated from the pipeline configuration stored in the pipeline task. This
+lets you modify the pipeline configuration via the UI, without changing the original codebase.
### Pipeline Versions
Each pipeline must be assigned a version number to help track the evolution of your pipeline structure and parameters.
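
For example, a controller that carries an explicit version and opts out of rebuilding the DAG from code on rerun might be instantiated like this (the names are placeholders):

```python
from clearml.automation import PipelineController

# Placeholder names; with always_create_from_code=False, a WebApp rerun uses the
# pipeline configuration stored in the pipeline task instead of re-executing the DAG code
pipe = PipelineController(
    name="pipeline demo",
    project="examples",
    version="1.0.0",
    always_create_from_code=False,
)
```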


@@ -105,7 +105,7 @@ See [add_step](../references/sdk/automation_controller_pipelinecontroller.md#add
#### parameter_override
Use the `parameter_override` argument to modify the step's parameter values. The `parameter_override` dictionary key is
the task parameter's full path, which includes the parameter section's name and the parameter name separated by a slash
-(e.g. `'General/dataset_url'`). Passing `"${}"` in the argument value allows you to reference input/output configurations
+(e.g. `'General/dataset_url'`). Passing `"${}"` in the argument value lets you reference input/output configurations
from other pipeline steps. For example: `"${<step_name>.id}"` will be converted to the Task ID of the referenced pipeline
step.
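
As a short, self-contained sketch (the step names, project, and URL are placeholders), a single `parameter_override` can mix a literal value with a step reference:

```python
from clearml.automation import PipelineController

pipe = PipelineController(name="pipeline demo", project="examples", version="1.0.0")
pipe.add_step(name="stage_process", base_task_project="examples", base_task_name="process data")
pipe.add_step(
    name="stage_train",
    parents=["stage_process"],
    base_task_project="examples",
    base_task_name="model training",
    parameter_override={
        # "<section>/<parameter>" key with a literal value
        "General/dataset_url": "https://example.com/data.csv",
        # "${<step_name>.id}" reference, resolved to the Task ID of the "stage_process" step
        "General/dataset_task_id": "${stage_process.id}",
    },
)
pipe.start()
```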


@@ -354,7 +354,7 @@ These controls allow you to better analyze the results. Hover over a plot, and t
Experiment outputs such as images, audio, and videos appear in **DEBUG SAMPLES**. These include data generated by
libraries and visualization tools, as well as data explicitly reported using the [ClearML Logger](../fundamentals/logger.md).
-You can view debug samples by metric at any iteration. Filter the samples by metric by selecting a metric from the
+You can view debug samples by metric in the reported iterations. Filter the samples by metric by selecting a metric from the
dropdown menu above the samples. The most recent iteration appears first.
![Debug Samples tab](../img/webapp_tracking_43.png)
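
Such samples can also be reported explicitly from code. A minimal sketch using the ClearML Logger (the project, metric, series names, and file paths are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="debug samples demo")
logger = task.get_logger()

# Placeholder metric ("predictions"), series, iteration, and local image path
logger.report_image(
    title="predictions",
    series="sample",
    iteration=10,
    local_path="/tmp/prediction_10.png",
)

# Audio, video, and other files can be reported as media debug samples
logger.report_media(
    title="predictions",
    series="audio sample",
    iteration=10,
    local_path="/tmp/prediction_10.wav",
)
```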