diff --git a/docs/clearml_serving/clearml_serving_cli.md b/docs/clearml_serving/clearml_serving_cli.md
index fb9795d9..96c4a8c4 100644
--- a/docs/clearml_serving/clearml_serving_cli.md
+++ b/docs/clearml_serving/clearml_serving_cli.md
@@ -185,7 +185,7 @@ Upload and register model files/folder.
```bash
clearml-serving model upload [-h] --name NAME [--tags TAGS [TAGS ...]] --project PROJECT
- [--framework {scikit-learn,xgboost,lightgbm,tensorflow,pytorch}]
+ [--framework {tensorflow,tensorflowjs,tensorflowlite,pytorch,torchscript,caffe,caffe2,onnx,keras,mknet,cntk,torch,darknet,paddlepaddle,scikitlearn,xgboost,lightgbm,parquet,megengine,catboost,tensorrt,openvino,custom}]
[--publish] [--path PATH] [--url URL]
[--destination DESTINATION]
```
@@ -198,7 +198,7 @@ clearml-serving model upload [-h] --name NAME [--tags TAGS [TAGS ...]] --project
|`--name`|Specify the model name to be registered|
|`--tags`| Add tags to the newly created model|
|`--project`| Specify the project for the model to be registered in|
-|`--framework`| Specify the model framework. Options are: "scikit-learn", "xgboost", "lightgbm", "tensorflow", "pytorch" |
+|`--framework`| Specify the model framework. Options are: 'tensorflow', 'tensorflowjs', 'tensorflowlite', 'pytorch', 'torchscript', 'caffe', 'caffe2', 'onnx', 'keras', 'mknet', 'cntk', 'torch', 'darknet', 'paddlepaddle', 'scikitlearn', 'xgboost', 'lightgbm', 'parquet', 'megengine', 'catboost', 'tensorrt', 'openvino', 'custom' |
|`--publish`| Publish the newly created model (change model state to "published", i.e. locked and ready to deploy)|
|`--path`|Specify a model file/folder to be uploaded and registered|
|`--url`| Specify an already uploaded model URL (e.g. `s3://bucket/model.bin`, `gs://bucket/model.bin`)|
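+
+For example, a registration call combining these flags might look like the following sketch (the
+service ID, model name, project, and file path are placeholders, not values from this page):
+
+```bash
+clearml-serving --id <service_id> model upload --name "my model" --project "my project" \
+    --framework "xgboost" --path ./my-model.bin
+```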
@@ -294,7 +294,7 @@ clearml-serving model add [-h] --engine ENGINE --endpoint ENDPOINT [--version VE
|`--engine`| Model endpoint serving engine (triton, sklearn, xgboost, lightgbm)|
|`--endpoint`| Base model endpoint (must be unique)|
|`--version`|Model endpoint version (default: None)|
-|`model-id`|Specify a model ID to be served|
+|`--model-id`|Specify a model ID to be served|
|`--preprocess`|Specify pre/post-processing code to be used with the model (point to a local file/folder); this should hold for all the models|
|`--input-size`| Specify the model matrix input size [Rows x Columns x Channels, etc.]|
|`--input-type`| Specify the model matrix input type. Examples: uint8, float32, int16, float16, etc.|
@@ -303,10 +303,10 @@ clearml-serving model add [-h] --engine ENGINE --endpoint ENDPOINT [--version VE
|`--output-type`| Specify the model matrix output type. Examples: uint8, float32, int16, float16, etc.|
|`--output-name`|Specify the model layer to pull results from. Example: layer_99|
|`--aux-config`| Specify additional engine-specific auxiliary configuration in the form of key=value. Example: `platform=onnxruntime_onnx response_cache.enable=true max_batch_size=8`. Note: you can also pass a full configuration file (e.g. Triton "config.pbtxt")|
-|`--name`| Instead of specifying `model-id` select based on model name |
+|`--name`| Instead of specifying `--model-id`, select based on model name|
|`--tags`|Specify tags to be selected and auto-updated|
-|`--project`|Instead of specifying `model-id` select based on model project |
-|`--published`| Instead of specifying `model-id` select based on model published |
+|`--project`|Instead of specifying `--model-id`, select based on model project|
+|`--published`| Instead of specifying `--model-id`, select based on the model's published state|
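+
+As an illustration, registering a Triton endpoint with explicit input/output matrix settings might
+look like this sketch (the endpoint, model name, project, and layer names are assumed values, not
+taken from this page):
+
+```bash
+clearml-serving --id <service_id> model add --engine triton --endpoint "my_model_pytorch" \
+    --preprocess "preprocess.py" --name "my pytorch model" --project "my project" \
+    --input-size 1 28 28 --input-name "INPUT__0" --input-type float32 \
+    --output-size -1 10 --output-name "OUTPUT__0" --output-type float32
+```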
diff --git a/docs/clearml_serving/clearml_serving_tutorial.md b/docs/clearml_serving/clearml_serving_tutorial.md
index 604c8c49..206c798f 100644
--- a/docs/clearml_serving/clearml_serving_tutorial.md
+++ b/docs/clearml_serving/clearml_serving_tutorial.md
@@ -38,7 +38,7 @@ clearml-serving --id model add --engine sklearn --endpoint "test_mo
:::info Service ID
Make sure that you have executed `clearml-serving`'s
-[initial setup](clearml_serving.md#initial-setup), in which you create a Serving Service.
+[initial setup](clearml_serving_setup.md#initial-setup), in which you create a Serving Service.
The Serving Service's ID is required to register a model, and to execute `clearml-serving`'s `metrics` and `config` commands.
:::
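+
+For instance, once the initial setup has printed a service ID, every subsequent command receives it
+via `--id` (the ID below is a placeholder):
+
+```bash
+# List the models registered on this Serving Service
+clearml-serving --id <service_id> model list
+```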
@@ -85,10 +85,10 @@ Uploading an existing model file into the model repository can be done via the `
or with the `clearml-serving` CLI.
1. Upload the model file to the `clearml-server` file storage and register it. The `--path` parameter is used to input
- the path to a local model file.
+   the path to a local model file (the model created in [step 1](#step-1--train-model) is located at `./sklearn-model.pkl`).
```bash
- clearml-serving --id model upload --name "manual sklearn model" --project "serving examples" --framework "scikit-learn" --path examples/sklearn/sklearn-model.pkl
+ clearml-serving --id model upload --name "manual sklearn model" --project "serving examples" --framework "scikitlearn" --path ./sklearn-model.pkl
```
You now have a new Model named `manual sklearn model` in the `serving examples` project. The CLI output prints