Reorganize ClearML Serving pages (#326)

pollfly 2022-09-11 14:06:45 +03:00 committed by GitHub
parent 8af117876e
commit f9ec798155
5 changed files with 115 additions and 123 deletions


@@ -59,100 +59,7 @@ solution.
* **Dashboards** - Customizable dashboard solution on top of the collected statistics, e.g. Grafana
## Installation
## Next Steps
:::important Upgrading ClearML Serving
To upgrade to ClearML Serving version 1.1, see instructions [here](../release_notes/ver_1_1.md#clearml-serving-110).
:::
### Prerequisites
* ClearML Server: Model repository, Service Health, Control plane
* Kubernetes / Single-instance Machine: Deploying containers
* CLI: Configuration & model deployment interface
### Initial Setup
1. Set up your [ClearML Server](../deploying_clearml/clearml_server.md) or use the
[free hosted service](https://app.clear.ml)
1. Connect the `clearml` SDK to the server (see instructions [here](../getting_started/ds/ds_first_steps.md#install-clearml))
1. Install the `clearml-serving` CLI:
```bash
pip3 install clearml-serving
```
1. Create the Serving Service Controller:
```bash
clearml-serving create --name "serving example"
```
The new Serving Service UID should be printed:
```console
New Serving Service created: id=aa11bb22aa11bb22
```
Write down the Serving Service UID.
1. Clone the `clearml-serving` repository:
```bash
git clone https://github.com/allegroai/clearml-serving.git
```
1. Edit the environment variables file (`docker/example.env`) with your ClearML Server credentials and Serving Service UID.
For example, you should have something like this:
```bash
cat docker/example.env
```
```console
CLEARML_WEB_HOST="https://app.clear.ml"
CLEARML_API_HOST="https://api.clear.ml"
CLEARML_FILES_HOST="https://files.clear.ml"
CLEARML_API_ACCESS_KEY="<access_key_here>"
CLEARML_API_SECRET_KEY="<secret_key_here>"
CLEARML_SERVING_TASK_ID="<serving_service_id_here>"
```
1. Spin up the `clearml-serving` containers with `docker-compose` (or, if running on Kubernetes, use the Helm chart):
```bash
cd docker && docker-compose --env-file example.env -f docker-compose.yml up
```
If you need Triton support (Keras/PyTorch/ONNX etc.), use the Triton docker-compose file:
```bash
cd docker && docker-compose --env-file example.env -f docker-compose-triton.yml up
```
If running on a GPU instance with Triton support (Keras/PyTorch/ONNX etc.), use the Triton GPU docker-compose file:
```bash
cd docker && docker-compose --env-file example.env -f docker-compose-triton-gpu.yml up
```
:::note
Any model that registers with the Triton engine will run its pre/post-processing code on the inference service container,
while the model inference itself is executed on the Triton engine container.
:::
### Advanced Setup - S3/GS/Azure Access (Optional)
To add access credentials and allow the inference containers to download models from your S3/GS/Azure object storage,
add the respective environment variables to your env file (`example.env`). See further details on configuring storage
access [here](../integrations/storage.md#configuring-storage).
```
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_DEFAULT_REGION
GOOGLE_APPLICATION_CREDENTIALS
AZURE_STORAGE_ACCOUNT
AZURE_STORAGE_KEY
```
## Tutorial
For further details, see the ClearML Serving [Tutorial](clearml_serving_tutorial.md).
See ClearML Serving setup instructions [here](clearml_serving_setup.md). For further details, see the ClearML Serving
[Tutorial](clearml_serving_tutorial.md).


@@ -0,0 +1,109 @@
---
title: Setup
---
This page describes how to set up and upgrade `clearml-serving`.
## Prerequisites
* ClearML Server: Model repository, Service Health, Control plane
* Kubernetes / Single-instance Machine: Deploying containers
* CLI: Configuration & model deployment interface
## Initial Setup
1. Set up your [ClearML Server](../deploying_clearml/clearml_server.md) or use the
[free hosted service](https://app.clear.ml)
1. Connect the `clearml` SDK to the server (see instructions [here](../getting_started/ds/ds_first_steps.md#install-clearml))
1. Install the `clearml-serving` CLI:
```bash
pip3 install clearml-serving
```
1. Create the Serving Service Controller:
```bash
clearml-serving create --name "serving example"
```
The new Serving Service UID should be printed:
```console
New Serving Service created: id=aa11bb22aa11bb22
```
Write down the Serving Service UID.
1. Clone the `clearml-serving` repository:
```bash
git clone https://github.com/allegroai/clearml-serving.git
```
1. Edit the environment variables file (`docker/example.env`) with your ClearML Server credentials and Serving Service UID.
For example, you should have something like this:
```bash
cat docker/example.env
```
```console
CLEARML_WEB_HOST="https://app.clear.ml"
CLEARML_API_HOST="https://api.clear.ml"
CLEARML_FILES_HOST="https://files.clear.ml"
CLEARML_API_ACCESS_KEY="<access_key_here>"
CLEARML_API_SECRET_KEY="<secret_key_here>"
CLEARML_SERVING_TASK_ID="<serving_service_id_here>"
```
1. Spin up the `clearml-serving` containers with `docker-compose` (or, if running on Kubernetes, use the Helm chart; a sanity check for the running service is sketched after this list):
```bash
cd docker && docker-compose --env-file example.env -f docker-compose.yml up
```
If you need Triton support (Keras/PyTorch/ONNX etc.), use the Triton docker-compose file:
```bash
cd docker && docker-compose --env-file example.env -f docker-compose-triton.yml up
```
If running on a GPU instance with Triton support (Keras/PyTorch/ONNX etc.), use the Triton GPU docker-compose file:
```bash
cd docker && docker-compose --env-file example.env -f docker-compose-triton-gpu.yml up
```
:::note
Any model that registers with the Triton engine will run its pre/post-processing code on the inference service container,
while the model inference itself is executed on the Triton engine container.
:::
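Once the containers are up, a quick sanity check is to list the endpoints registered with the Serving Service. A minimal sketch, assuming the `clearml-serving` CLI installed earlier and using the example UID in place of your own:
```bash
# List the endpoints currently registered with the Serving Service
# (replace the example UID with your own Serving Service UID)
clearml-serving --id aa11bb22aa11bb22 model list
```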
## Advanced Setup - S3/GS/Azure Access (Optional)
To add access credentials and allow the inference containers to download models from your S3/GS/Azure object storage,
add the respective environment variables to your env file (`example.env`). See further details on configuring storage
access [here](../integrations/storage.md#configuring-storage).
```
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_DEFAULT_REGION
GOOGLE_APPLICATION_CREDENTIALS
AZURE_STORAGE_ACCOUNT
AZURE_STORAGE_KEY
```
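For example, granting the containers read access to an S3 bucket might add the following to `example.env` (placeholder values shown; `us-east-1` is just an example region):
```
AWS_ACCESS_KEY_ID=<access_key_here>
AWS_SECRET_ACCESS_KEY=<secret_key_here>
AWS_DEFAULT_REGION=us-east-1
```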
## Upgrading ClearML Serving
**Upgrading to v1.1**
1. Take down the serving containers (`docker-compose` or k8s)
1. Update the `clearml-serving` CLI: `pip3 install -U clearml-serving`
1. Re-add a single existing endpoint with `clearml-serving model add ...` (answer yes when prompted). This will upgrade the
`clearml-serving` session definitions
1. Pull the latest serving containers (`docker-compose pull ...` or k8s)
1. Re-spin the serving containers (`docker-compose` or k8s), as sketched below
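As a rough sketch, the upgrade flow with `docker-compose` might look as follows (assuming the default compose file; the exact `model add` arguments depend on your existing endpoint, so they are left elided):
```bash
cd docker && docker-compose --env-file example.env -f docker-compose.yml down
pip3 install -U clearml-serving
# clearml-serving --id <serving_service_id_here> model add ...   (answer yes when prompted)
docker-compose --env-file example.env -f docker-compose.yml pull
docker-compose --env-file example.env -f docker-compose.yml up
```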
## Tutorial
For further details, see the ClearML Serving [Tutorial](clearml_serving_tutorial.md).


@@ -14,7 +14,7 @@ The tutorial will also go over these additional options that you can use with `c
## Prerequisite
Before executing the steps below, make sure you have completed `clearml-serving`'s [initial setup](clearml_serving.md#initial-setup).
Before executing the steps below, make sure you have completed `clearml-serving`'s [initial setup](clearml_serving_setup.md#initial-setup).
## Steps
### Step 1: Train Model


@@ -17,32 +17,8 @@ This release is not backward compatible - see the notes below on upgrading
* Triton engine support for variable request (matrix) sizes
* Triton support, fix `--aux-config` to support more configuration elements
* Hugging Face Transformers support
* `Preprocess` class as module (see note below)
* `Preprocess` class as module
:::important Preprocess Class
To add a `Preprocess` class from a module (the entire module folder will be packaged):
```
preprocess_folder
├── __init__.py # from .sub.some_file import Preprocess
└── sub
└── some_file.py
```
Pass the top folder as the path for `--preprocess`, for example:
```bash
clearml-serving --id <serving_session_id> model add --preprocess /path/to/preprocess_folder ...
```
:::
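As an illustrative sketch (based on the `clearml-serving` examples; exact method signatures may differ between versions), `sub/some_file.py` could define a minimal pass-through `Preprocess` class:
```python
from typing import Any


class Preprocess(object):
    """Minimal pass-through pre/post-processing for a served model."""

    def preprocess(self, body: dict, state: dict, collect_custom_statistics_fn=None) -> Any:
        # Convert the raw request body into the model's input format
        return body

    def postprocess(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> dict:
        # Convert the model's output into a JSON-serializable response
        return data
```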
**Upgrading from v1.0**
1. Take down the serving containers (`docker-compose` or k8s)
1. Update the `clearml-serving` CLI: `pip3 install -U clearml-serving`
1. Re-add a single existing endpoint with `clearml-serving model add ...` (answer yes when prompted). This will upgrade the
`clearml-serving` session definitions
1. Pull the latest serving containers (`docker-compose pull ...` or k8s)
1. Re-spin the serving containers (`docker-compose` or k8s)
### ClearML Agent 1.1.2


@@ -37,7 +37,7 @@ module.exports = {
},
{'ClearML Data': ['clearml_data/clearml_data', 'clearml_data/clearml_data_cli', 'clearml_data/clearml_data_sdk', 'clearml_data/best_practices',
{'Workflows': ['clearml_data/data_management_examples/workflows', 'clearml_data/data_management_examples/data_man_simple', 'clearml_data/data_management_examples/data_man_folder_sync', 'clearml_data/data_management_examples/data_man_cifar_classification', 'clearml_data/data_management_examples/data_man_python']},]},
{'ClearML Serving':['clearml_serving/clearml_serving', 'clearml_serving/clearml_serving_cli', 'clearml_serving/clearml_serving_tutorial']},
{'ClearML Serving':['clearml_serving/clearml_serving', 'clearml_serving/clearml_serving_setup', 'clearml_serving/clearml_serving_cli', 'clearml_serving/clearml_serving_tutorial']},
{'CLI Tools': ['apps/clearml_session', 'apps/clearml_task', 'apps/clearml_param_search']},
'integrations/libraries',
'integrations/storage',