diff --git a/docs/clearml_sdk/task_sdk.md b/docs/clearml_sdk/task_sdk.md
index 58a931b8..a26763f1 100644
--- a/docs/clearml_sdk/task_sdk.md
+++ b/docs/clearml_sdk/task_sdk.md
@@ -831,7 +831,7 @@ task = Task.init(project_name='examples', task_name='parameters')
 task.set_parameters({'Args/epochs':7, 'lr': 0.5})
 
 # setting a single parameter
-task.set_parameter(name='decay',value=0.001)
+task.set_parameter(name='decay', value=0.001)
 ```
 
 :::warning Overwriting Parameters
@@ -889,7 +889,7 @@ me = Person('Erik', 5)
 
 params_dictionary = {'epochs': 3, 'lr': 0.4}
 
-task = Task.init(project_name='examples',task_name='python objects')
+task = Task.init(project_name='examples', task_name='python objects')
 
 task.connect(me)
 task.connect(params_dictionary)
diff --git a/docs/clearml_serving/clearml_serving_tutorial.md b/docs/clearml_serving/clearml_serving_tutorial.md
index f7cf1d93..d6e383d6 100644
--- a/docs/clearml_serving/clearml_serving_tutorial.md
+++ b/docs/clearml_serving/clearml_serving_tutorial.md
@@ -38,13 +38,13 @@ clearml-serving --id <service_id> model add --engine sklearn --endpoint "test_mo
 :::info Service ID
 Make sure that you have executed `clearml-serving`'s [initial setup](clearml_serving_setup.md#initial-setup), in which you create a Serving Service.
-The Serving Service's ID is required to register a model, and to execute `clearml-serving`'s `metrics` and `config` commands
+The Serving Service's ID is required to register a model, and to execute `clearml-serving`'s `metrics` and `config` commands.
 :::
 
 :::note
 The preprocessing Python code is packaged and uploaded to the Serving Service, to be used by any inference container,
-and downloaded in real time when updated
+and downloaded in real time when updated.
 :::
 
 ### Step 3: Spin Inference Container
@@ -110,7 +110,7 @@ or with the `clearml-serving` CLI.
 You can also provide a different storage destination for the model, such as S3/GS/Azure, by passing `--destination="s3://bucket/folder"`, `s3://host_addr:port/bucket` (for non-AWS S3-like services like MinIO), `gs://bucket/folder`, `azure://<account name>.blob.core.windows.net/path/to/file`. There is no need to provide a unique path to the destination argument, the location of the model will be a unique path based on the serving service ID and the
-model name
+model name.
 :::
 
 ## Additional Options
@@ -160,7 +160,7 @@ This means that any request coming to `/test_model_sklearn_canary/` will be rout
 :::note
 As with any other Serving Service configuration, you can configure the Canary endpoint while the Inference containers are
-already running and deployed, they will get updated in their next update cycle (default: once every 5 minutes)
+already running and deployed, they will get updated in their next update cycle (default: once every 5 minutes).
 :::
 
 You can also prepare a "fixed" canary endpoint, always splitting the load between the last two deployed models:
@@ -244,7 +244,7 @@ With the new metrics logged, you can create a visualization dashboard over the l
 :::note
 If not specified all serving requests will be logged, which can be changed with the `CLEARML_DEFAULT_METRIC_LOG_FREQ` environment variable. For example `CLEARML_DEFAULT_METRIC_LOG_FREQ=0.2` means only 20% of all requests will be logged.
-You can also specify per-endpoint log frequency with the `clearml-serving` CLI. See [clearml-serving metrics](clearml_serving_cli.md#metrics)
+You can also specify per-endpoint log frequency with the `clearml-serving` CLI. See [clearml-serving metrics](clearml_serving_cli.md#metrics).
 :::
 
 ## Further Examples
diff --git a/docs/configs/clearml_conf.md b/docs/configs/clearml_conf.md
index d7f86216..298f22a5 100644
--- a/docs/configs/clearml_conf.md
+++ b/docs/configs/clearml_conf.md
@@ -1107,8 +1107,8 @@ URL to a CA bundle, or set this option to `false` to skip SSL certificate verifi
 * Log specific environment variables. OS environments are listed in the UI, under an experiment's **CONFIGURATION > HYPERPARAMETERS > Environment** section.
- Multiple selected variables are supported including the suffix "\*". For example: "AWS\_\*" will log any OS environment
- variable starting with `"AWS\_"`. Example: `log_os_environments: ["AWS_*", "CUDA_VERSION"]`
+ Multiple selected variables are supported including the suffix `*`. For example: `"AWS_*"` will log any OS environment
+ variable starting with `"AWS_"`. Example: `log_os_environments: ["AWS_*", "CUDA_VERSION"]`
 
 * This value can be overwritten with OS environment variable `CLEARML_LOG_ENVIRONMENT=AWS_*,CUDA_VERSION`.
diff --git a/docs/webapp/webapp_exp_track_visual.md b/docs/webapp/webapp_exp_track_visual.md
index 2d6e1012..2d95a2a5 100644
--- a/docs/webapp/webapp_exp_track_visual.md
+++ b/docs/webapp/webapp_exp_track_visual.md
@@ -199,6 +199,7 @@ The task's input and output models appear in the **ARTIFACTS** tab. Each model e
 * Model name
 * ID
 * Configuration.
+ Input models also display their creating experiment, which on-click navigates you to the experiment's page.
 
 ![Models in Artifacts tab](../img/webapp_exp_artifacts_01.png)
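
For context, the ClearML SDK calls touched by the `task_sdk.md` hunks above can be exercised end to end. The snippet below is a minimal, illustrative sketch; the project, task, and parameter names are placeholders taken from the documentation examples, not a prescribed setup:

```python
from clearml import Task

# Create a task that will track this run (names are placeholders)
task = Task.init(project_name='examples', task_name='parameters')

# Override several hyperparameters at once from a dictionary
task.set_parameters({'Args/epochs': 7, 'lr': 0.5})

# Set a single additional parameter
task.set_parameter(name='decay', value=0.001)

# Connect a plain dictionary so its values are logged as hyperparameters
# and can be overridden when the task is cloned and re-run
params_dictionary = {'epochs': 3, 'lr': 0.4}
task.connect(params_dictionary)

task.close()
```

When run against a ClearML server, the resulting values should appear in the experiment's **CONFIGURATION > HYPERPARAMETERS** section of the UI, the same area referenced by the `clearml_conf.md` hunk.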