From f9ec7981551503cabe93edb17fa97c2606699c53 Mon Sep 17 00:00:00 2001 From: pollfly <75068813+pollfly@users.noreply.github.com> Date: Sun, 11 Sep 2022 14:06:45 +0300 Subject: [PATCH] Reorganize ClearML Serving pages (#326) --- docs/clearml_serving/clearml_serving.md | 99 +--------------- docs/clearml_serving/clearml_serving_setup.md | 109 ++++++++++++++++++ .../clearml_serving_tutorial.md | 2 +- docs/release_notes/ver_1_1.md | 26 +---- sidebars.js | 2 +- 5 files changed, 115 insertions(+), 123 deletions(-) create mode 100644 docs/clearml_serving/clearml_serving_setup.md diff --git a/docs/clearml_serving/clearml_serving.md b/docs/clearml_serving/clearml_serving.md index 1b8488e9..72cbda35 100644 --- a/docs/clearml_serving/clearml_serving.md +++ b/docs/clearml_serving/clearml_serving.md @@ -59,100 +59,7 @@ solution. * **Dashboards** - Customizable dashboard solution on top of the collected statistics, e.g. Grafana -## Installation +## Next Steps -:::important Upgrading ClearML Serving -To upgrade to ClearML Serving version 1.1, see instructions [here](../release_notes/ver_1_1.md#clearml-serving-110). -::: - -### Prerequisites - -* ClearML-Server : Model repository, Service Health, Control plane -* Kubernetes / Single-instance Machine : Deploying containers -* CLI : Configuration & model deployment interface - -### Initial Setup -1. Set up your [ClearML Server](../deploying_clearml/clearml_server.md) or use the - [free hosted service](https://app.clear.ml) -1. Connect `clearml` SDK to the server, see instructions [here](../getting_started/ds/ds_first_steps.md#install-clearml) - -1. Install clearml-serving CLI: - - ```bash - pip3 install clearml-serving - ``` - -1. Create the Serving Service Controller: - - ```bash - clearml-serving create --name "serving example" - ``` - - The new serving service UID should be printed - - ```console - New Serving Service created: id=aa11bb22aa11bb22 - ``` - - Write down the Serving Service UID - -1. 
Clone the `clearml-serving` repository: - ```bash - git clone https://github.com/allegroai/clearml-serving.git - ``` - -1. Edit the environment variables file (docker/example.env) with your clearml-server credentials and Serving Service UID. - For example, you should have something like - ```bash - cat docker/example.env - ``` - - ```console - CLEARML_WEB_HOST="https://app.clear.ml" - CLEARML_API_HOST="https://api.clear.ml" - CLEARML_FILES_HOST="https://files.clear.ml" - CLEARML_API_ACCESS_KEY="" - CLEARML_API_SECRET_KEY="" - CLEARML_SERVING_TASK_ID="" - ``` - -1. Spin up the `clearml-serving` containers with `docker-compose` (or if running on Kubernetes, use the helm chart) - - ```bash - cd docker && docker-compose --env-file example.env -f docker-compose.yml up - ``` - - If you need Triton support (keras/pytorch/onnx etc.), use the triton docker-compose file - ```bash - cd docker && docker-compose --env-file example.env -f docker-compose-triton.yml up - ``` - - If running on a GPU instance w/ Triton support (keras/pytorch/onnx etc.), use the triton gpu docker-compose file: - ```bash - cd docker && docker-compose --env-file example.env -f docker-compose-triton-gpu.yml up - ``` - -:::note -Any model that registers with Triton engine will run the pre/post processing code on the Inference service container, -and the model inference itself will be executed on the Triton Engine container. -::: - -### Advanced Setup - S3/GS/Azure Access (Optional) -To add access credentials and allow the inference containers to download models from your S3/GS/Azure object-storage, -add the respective environment variables to your env files (example.env). See further details on configuring the storage -access [here](../integrations/storage.md#configuring-storage). 
-
-```
-AWS_ACCESS_KEY_ID
-AWS_SECRET_ACCESS_KEY
-AWS_DEFAULT_REGION
-
-GOOGLE_APPLICATION_CREDENTIALS
-
-AZURE_STORAGE_ACCOUNT
-AZURE_STORAGE_KEY
-```
-
-## Tutorial
-
-For further details, see the ClearML Serving [Tutorial](clearml_serving_tutorial.md).
\ No newline at end of file
+See ClearML Serving setup instructions [here](clearml_serving_setup.md). For further details, see the ClearML Serving
+[Tutorial](clearml_serving_tutorial.md).
\ No newline at end of file
diff --git a/docs/clearml_serving/clearml_serving_setup.md b/docs/clearml_serving/clearml_serving_setup.md
new file mode 100644
index 00000000..f1225dd3
--- /dev/null
+++ b/docs/clearml_serving/clearml_serving_setup.md
@@ -0,0 +1,109 @@
+---
+title: Setup
+---
+
+This page describes how to set up and upgrade `clearml-serving`.
+
+## Prerequisites
+
+* ClearML Server: Model repository, Service Health, Control plane
+* Kubernetes / Single-instance Machine: Deploying containers
+* CLI: Configuration & model deployment interface
+
+## Initial Setup
+1. Set up your [ClearML Server](../deploying_clearml/clearml_server.md) or use the
+   [free hosted service](https://app.clear.ml)
+1. Connect the `clearml` SDK to the server (see instructions [here](../getting_started/ds/ds_first_steps.md#install-clearml))
+
+1. Install the `clearml-serving` CLI:
+
+   ```bash
+   pip3 install clearml-serving
+   ```
+
+1. Create the Serving Service Controller:
+
+   ```bash
+   clearml-serving create --name "serving example"
+   ```
+
+   The new serving service UID should be printed:
+
+   ```console
+   New Serving Service created: id=aa11bb22aa11bb22
+   ```
+
+   Write down the Serving Service UID.
+
+1. Clone the `clearml-serving` repository:
+   ```bash
+   git clone https://github.com/allegroai/clearml-serving.git
+   ```
+
+1. Edit the environment variables file (`docker/example.env`) with your clearml-server credentials and Serving Service UID.
+   For example, you should have something like this:
+   ```bash
+   cat docker/example.env
+   ```
+
+   ```console
+   CLEARML_WEB_HOST="https://app.clear.ml"
+   CLEARML_API_HOST="https://api.clear.ml"
+   CLEARML_FILES_HOST="https://files.clear.ml"
+   CLEARML_API_ACCESS_KEY=""
+   CLEARML_API_SECRET_KEY=""
+   CLEARML_SERVING_TASK_ID=""
+   ```
+
+1. Spin up the `clearml-serving` containers with `docker-compose` (or, if running on Kubernetes, use the Helm chart):
+
+   ```bash
+   cd docker && docker-compose --env-file example.env -f docker-compose.yml up
+   ```
+
+   If you need Triton support (Keras/PyTorch/ONNX, etc.), use the Triton docker-compose file:
+   ```bash
+   cd docker && docker-compose --env-file example.env -f docker-compose-triton.yml up
+   ```
+
+   If running on a GPU instance with Triton support (Keras/PyTorch/ONNX, etc.), use the Triton GPU docker-compose file:
+   ```bash
+   cd docker && docker-compose --env-file example.env -f docker-compose-triton-gpu.yml up
+   ```
+
+:::note
+Any model that registers with the Triton engine will run its pre/post-processing code on the Inference Service container,
+and the model inference itself will be executed on the Triton Engine container.
+:::
+
+## Advanced Setup - S3/GS/Azure Access (Optional)
+To add access credentials and allow the inference containers to download models from your S3/GS/Azure object storage,
+add the respective environment variables to your env file (`example.env`). See further details on configuring storage
+access [here](../integrations/storage.md#configuring-storage).
+
+```
+AWS_ACCESS_KEY_ID
+AWS_SECRET_ACCESS_KEY
+AWS_DEFAULT_REGION
+
+GOOGLE_APPLICATION_CREDENTIALS
+
+AZURE_STORAGE_ACCOUNT
+AZURE_STORAGE_KEY
+```
+
+## Upgrading ClearML Serving
+
+**Upgrading to v1.1**
+
+1. Take down the serving containers (`docker-compose` or k8s)
+1. Update the `clearml-serving` CLI: `pip3 install -U clearml-serving`
+1. Re-add a single existing endpoint with `clearml-serving model add ...` (press yes when asked).
This will upgrade the
+   `clearml-serving` session definitions.
+1. Pull the latest serving containers (`docker-compose pull ...` or k8s)
+1. Re-spin the serving containers (`docker-compose` or k8s)
+
+## Tutorial
+
+For further details, see the ClearML Serving [Tutorial](clearml_serving_tutorial.md).
\ No newline at end of file
diff --git a/docs/clearml_serving/clearml_serving_tutorial.md b/docs/clearml_serving/clearml_serving_tutorial.md
index 0c382f1e..72200a4d 100644
--- a/docs/clearml_serving/clearml_serving_tutorial.md
+++ b/docs/clearml_serving/clearml_serving_tutorial.md
@@ -14,7 +14,7 @@ The tutorial will also go over these additional options that you can use with `c
 
 ## Prerequisite
 
-Before executing the steps below, make sure you have completed `clearml-serving`'s [initial setup](clearml_serving.md#initial-setup).
+Before executing the steps below, make sure you have completed `clearml-serving`'s [initial setup](clearml_serving_setup.md#initial-setup).
 
 ## Steps
 ### Step 1: Train Model
diff --git a/docs/release_notes/ver_1_1.md b/docs/release_notes/ver_1_1.md
index 06ab5767..25d26945 100644
--- a/docs/release_notes/ver_1_1.md
+++ b/docs/release_notes/ver_1_1.md
@@ -17,32 +17,8 @@ This release is not backwards compatible - see notes below on upgrading
 * Triton engine support for variable request (matrix) sizes
 * Triton support, fix `--aux-config` to support more configurations elements
 * Huggingface Transformer support
-* `Preprocess` class as module (see note below)
+* `Preprocess` class as module
 
-:::important Preprocess Class
-To add a `Preprocess` class from a module (the entire module folder will be packaged)
-
-```
-preprocess_folder
-├── __init__.py # from .sub.some_file import Preprocess
-└── sub
-    └── some_file.py
-```
-Pass the top folder as a path for --preprocess, for example:
-
-```bash
-clearml-serving module --id add --preprocess /path/to/preprocess_folder ...
-```
-:::
-
-**Upgrading from v1.0**
-
-1. 
Take down the serving containers (`docker-compose` or k8s) -1. Update the `clearml-serving` CLI `pip3 install -U clearml-serving` -1. Re-add a single existing endpoint with `clearml-serving module add ... ` (press yes when asked). It will upgrade the - `clearml-serving` session definitions -1. Pull the latest serving containers (`docker-compose pull ...` or k8s) -1. Re-spin serving containers (`docker-compose` or k8s) ### ClearML Agent 1.1.2 diff --git a/sidebars.js b/sidebars.js index d4ad1671..a85fe6b6 100644 --- a/sidebars.js +++ b/sidebars.js @@ -37,7 +37,7 @@ module.exports = { }, {'ClearML Data': ['clearml_data/clearml_data', 'clearml_data/clearml_data_cli', 'clearml_data/clearml_data_sdk', 'clearml_data/best_practices', {'Workflows': ['clearml_data/data_management_examples/workflows', 'clearml_data/data_management_examples/data_man_simple', 'clearml_data/data_management_examples/data_man_folder_sync', 'clearml_data/data_management_examples/data_man_cifar_classification', 'clearml_data/data_management_examples/data_man_python']},]}, - {'ClearML Serving':['clearml_serving/clearml_serving', 'clearml_serving/clearml_serving_cli', 'clearml_serving/clearml_serving_tutorial']}, + {'ClearML Serving':['clearml_serving/clearml_serving', 'clearml_serving/clearml_serving_setup', 'clearml_serving/clearml_serving_cli', 'clearml_serving/clearml_serving_tutorial']}, {'CLI Tools': ['apps/clearml_session', 'apps/clearml_task', 'apps/clearml_param_search']}, 'integrations/libraries', 'integrations/storage',
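The setup page added by this patch has the user hand-edit `docker/example.env` and then spin up the containers, and an empty `CLEARML_API_ACCESS_KEY`/`CLEARML_API_SECRET_KEY`/`CLEARML_SERVING_TASK_ID` only surfaces later as a container failure. A small pre-flight check can catch that earlier. The sketch below is a hypothetical helper (not part of `clearml-serving` or this patch) that assumes only the `KEY="value"` env-file format shown in the docs above:

```python
import re
from pathlib import Path

# Keys the setup docs require to be filled in before `docker-compose up`.
REQUIRED_KEYS = (
    "CLEARML_API_ACCESS_KEY",
    "CLEARML_API_SECRET_KEY",
    "CLEARML_SERVING_TASK_ID",
)


def parse_env(text: str) -> dict:
    """Parse KEY="value" lines, ignoring blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        match = re.match(r'^([A-Z_][A-Z0-9_]*)="?(.*?)"?$', line)
        if match:
            env[match.group(1)] = match.group(2)
    return env


def missing_keys(env: dict) -> list:
    """Return required keys that are absent or empty."""
    return [key for key in REQUIRED_KEYS if not env.get(key)]


if __name__ == "__main__":
    # Run from the clearml-serving checkout root before spinning up containers.
    env = parse_env(Path("docker/example.env").read_text())
    for key in missing_keys(env):
        print(f"warning: {key} is not set")
```

This is only a convenience sketch of the check implied by the setup steps; the containers themselves read the env file via `docker-compose --env-file`, not through this script.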