diff --git a/docs/apps/clearml_session.md b/docs/apps/clearml_session.md index 49a363ac..fd5de9c3 100644 --- a/docs/apps/clearml_session.md +++ b/docs/apps/clearml_session.md @@ -56,7 +56,7 @@ error, you are good to go. 1. The session Task is enqueued in the selected queue, and a ClearML Agent pulls and executes it. The agent downloads the appropriate IDE(s) and launches it. -1. Once the agent finishes the initial setup of the interactive Task, the local `cleaml-session` connects to the host +1. Once the agent finishes the initial setup of the interactive Task, the local `clearml-session` connects to the host machine via SSH, and tunnels both SSH and IDE over the SSH connection. If a container is specified, the IDE environment runs inside of it. diff --git a/docs/best_practices/data_scientist_best_practices.md b/docs/best_practices/data_scientist_best_practices.md index f092791c..2996d5ef 100644 --- a/docs/best_practices/data_scientist_best_practices.md +++ b/docs/best_practices/data_scientist_best_practices.md @@ -47,7 +47,7 @@ that you need. accessed, [compared](../webapp/webapp_exp_comparing.md) and [tracked](../webapp/webapp_exp_track_visual.md). - [ClearML Agent](../clearml_agent.md) does the heavy lifting. It reproduces the execution environment, clones your code, applies code patches, manages parameters (including overriding them on the fly), executes the code, and queues multiple tasks. - It can even [build](../../clearml_agent/clearml_agent_docker_exec#exporting-a-task-into-a-standalone-docker-container) the docker container for you! + It can even [build](../getting_started/clearml_agent_docker_exec.md#exporting-a-task-into-a-standalone-docker-container) the container for you! - [ClearML Pipelines](../pipelines/pipelines.md) ensure that steps run in the same order, programmatically chaining tasks together, while giving an overview of the execution pipeline's status. 
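As an aside to the agent capabilities described above ("manages parameters (including overriding them on the fly)"), the override mechanism can be pictured with a small sketch. This is illustrative only: `apply_overrides`, the `Args/...` keys, and the values are hypothetical and not ClearML's API; the `section/name` key style merely mirrors how hyperparameters appear in the UI.

```python
# Illustrative sketch (NOT ClearML's implementation): apply "section/name" style
# overrides on top of a task's recorded hyperparameters before execution.

def apply_overrides(params: dict, overrides: dict) -> dict:
    """Return a copy of `params` with `section/name` overrides applied."""
    merged = {section: dict(values) for section, values in params.items()}
    for key, value in overrides.items():
        section, _, name = key.partition("/")
        merged.setdefault(section, {})[name] = value
    return merged

recorded = {"Args": {"lr": "0.001", "epochs": "10"}}
updated = apply_overrides(recorded, {"Args/lr": "0.01", "Args/batch_size": "64"})
# updated["Args"] -> {"lr": "0.01", "epochs": "10", "batch_size": "64"}
```

The original dict is left untouched, mirroring the idea that the agent overrides a clone's parameters rather than the source task's.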
diff --git a/docs/best_practices/mlops_best_practices.md b/docs/best_practices/mlops_best_practices.md index 9a31864c..71c9d7d7 100644 --- a/docs/best_practices/mlops_best_practices.md +++ b/docs/best_practices/mlops_best_practices.md @@ -18,7 +18,7 @@ If you are afraid of clutter, use the archive option, and set up your own [clean ## Clone Tasks Define a ClearML Task with one of the following options: -- Run the actual code with the `Task.init()` call. This will create and auto-populate the Task in CleaML (including Git Repo / Python Packages / Command line etc.). +- Run the actual code with the `Task.init()` call. This will create and auto-populate the Task in ClearML (including Git Repo / Python Packages / Command line etc.). - Register local / remote code repository with `clearml-task`. See [details](../apps/clearml_task.md). Once you have a Task in ClearML, you can clone and edit its definitions in the UI, then launch it on one of your nodes with [ClearML Agent](../clearml_agent.md). diff --git a/docs/clearml_agent/clearml_agent_dynamic_gpus.md b/docs/clearml_agent/clearml_agent_dynamic_gpus.md index 8481cf92..94c3862b 100644 --- a/docs/clearml_agent/clearml_agent_dynamic_gpus.md +++ b/docs/clearml_agent/clearml_agent_dynamic_gpus.md @@ -1,8 +1,9 @@ --- title: Dynamic GPU Allocation --- + :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Dynamic GPU allocation is available under the ClearML Enterprise plan. ::: The ClearML Enterprise server supports dynamic allocation of GPUs based on queue properties. 
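To make queue-based GPU allocation concrete, here is a minimal scheduler sketch. The policy and all names (`Queue`, `allocate`, the GPU ranges) are illustrative assumptions, not ClearML's actual Enterprise scheduler: the point is only that each queue declares how many GPUs its jobs need, and an agent grants free GPU indices when that minimum can be met.

```python
# Illustrative sketch of queue-based dynamic GPU allocation (not ClearML's implementation).
# Each queue declares a GPU range; the agent hands free GPUs to the next queued job
# and keeps the job queued when the minimum cannot be satisfied.
from dataclasses import dataclass

@dataclass
class Queue:
    name: str
    min_gpus: int
    max_gpus: int

def allocate(free_gpus: list, queue: Queue) -> list:
    """Grant up to `max_gpus` free GPU indices, or none if `min_gpus` can't be met."""
    if len(free_gpus) < queue.min_gpus:
        return []                      # not enough GPUs free; job stays queued
    granted = free_gpus[:queue.max_gpus]
    del free_gpus[:len(granted)]       # mark the granted GPUs as busy
    return granted

free = [0, 1, 2, 3]
print(allocate(free, Queue("dual_gpu", min_gpus=2, max_gpus=2)))  # [0, 1]
print(allocate(free, Queue("quad_gpu", min_gpus=4, max_gpus=4)))  # [] -- only 2 GPUs left
```

In the real feature, released GPUs return to the shared pool so another queue's jobs can claim them; the sketch omits that bookkeeping.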
diff --git a/docs/configs/clearml_conf.md b/docs/configs/clearml_conf.md index 525445b2..9aeb0959 100644 --- a/docs/configs/clearml_conf.md +++ b/docs/configs/clearml_conf.md @@ -414,7 +414,7 @@ These settings define which Docker image and arguments should be used unless [ex * **`agent.default_docker.match_rules`** (*[dict]*) :::important Enterprise Feature - This feature is available under the ClearML Enterprise plan. + The `match_rules` configuration option is available under the ClearML Enterprise plan. ::: * Lookup table of rules that determine the default container and arguments when running a worker in Docker mode. The @@ -1599,7 +1599,7 @@ sdk { ## Configuration Vault :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Configuration vaults are available under the ClearML Enterprise plan. ::: The ClearML Enterprise Server includes the configuration vault. Users can add configuration sections to the vault and, once diff --git a/docs/deploying_clearml/clearml_server_config.md b/docs/deploying_clearml/clearml_server_config.md index aa4026e1..014161cc 100644 --- a/docs/deploying_clearml/clearml_server_config.md +++ b/docs/deploying_clearml/clearml_server_config.md @@ -422,7 +422,7 @@ options. ### Custom UI Context Menu Actions :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Custom UI context menu actions are available under the ClearML Enterprise plan. ::: Create custom UI context menu actions to be performed on ClearML objects (projects, tasks, models, dataviews, or queues) diff --git a/docs/deploying_clearml/clearml_server_es7_migration.md b/docs/deploying_clearml/clearml_server_es7_migration.md index abc6b671..126b222b 100644 --- a/docs/deploying_clearml/clearml_server_es7_migration.md +++ b/docs/deploying_clearml/clearml_server_es7_migration.md @@ -129,7 +129,7 @@ and ClearML Server needs to be installed. 1. Add the `clearml-server` repository to Helm client. 
``` - helm repo add allegroai https://allegroai.github.io/clearml-server-helm/ + helm repo add clearml https://clearml.github.io/clearml-server-helm/ ``` Confirm the `clearml-server` repository is now in the Helm client. diff --git a/docs/deploying_clearml/clearml_server_linux_mac.md b/docs/deploying_clearml/clearml_server_linux_mac.md index 9509c748..52b25658 100644 --- a/docs/deploying_clearml/clearml_server_linux_mac.md +++ b/docs/deploying_clearml/clearml_server_linux_mac.md @@ -136,7 +136,7 @@ Deploying the server requires a minimum of 8 GB of memory, 16 GB is recommended. 2. Download the ClearML Server docker-compose YAML file. ``` - sudo curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + sudo curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. For Linux only, configure the **ClearML Agent Services**: diff --git a/docs/deploying_clearml/clearml_server_win.md b/docs/deploying_clearml/clearml_server_win.md index f3a54e20..5cf0e768 100644 --- a/docs/deploying_clearml/clearml_server_win.md +++ b/docs/deploying_clearml/clearml_server_win.md @@ -57,7 +57,7 @@ Deploying the server requires a minimum of 8 GB of memory, 16 GB is recommended. 1. Save the ClearML Server docker-compose YAML file. ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml ``` 1. Run `docker-compose`. 
In PowerShell, execute the following commands: diff --git a/docs/deploying_clearml/enterprise_deploy/app_install_ex_server.md b/docs/deploying_clearml/enterprise_deploy/app_install_ex_server.md index 9b1990a9..4286d2bb 100644 --- a/docs/deploying_clearml/enterprise_deploy/app_install_ex_server.md +++ b/docs/deploying_clearml/enterprise_deploy/app_install_ex_server.md @@ -2,6 +2,10 @@ title: Installing External Applications Server --- +:::important Enterprise Feature +UI application deployment is available under the ClearML Enterprise plan. +::: + ClearML supports applications, which are extensions that allow additional capabilities, such as cloud auto-scaling, Hyperparameter Optimizations, etc. For more information, see [ClearML Applications](../../webapp/applications/apps_overview.md). diff --git a/docs/deploying_clearml/enterprise_deploy/app_install_ubuntu_on_prem.md b/docs/deploying_clearml/enterprise_deploy/app_install_ubuntu_on_prem.md index 3fbcb504..4bbc1e7e 100644 --- a/docs/deploying_clearml/enterprise_deploy/app_install_ubuntu_on_prem.md +++ b/docs/deploying_clearml/enterprise_deploy/app_install_ubuntu_on_prem.md @@ -2,6 +2,10 @@ title: Application Installation on On-Prem and VPC Servers --- +:::important Enterprise Feature +UI application deployment is available under the ClearML Enterprise plan. +::: + ClearML Applications are like plugins that allow you to manage ML workloads and automatically run recurring workflows without any coding. Applications are installed on top of the ClearML Server. diff --git a/docs/deploying_clearml/enterprise_deploy/appgw.md b/docs/deploying_clearml/enterprise_deploy/appgw.md index fe302472..2679df85 100644 --- a/docs/deploying_clearml/enterprise_deploy/appgw.md +++ b/docs/deploying_clearml/enterprise_deploy/appgw.md @@ -3,7 +3,7 @@ title: AI Application Gateway --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. 
+The AI Application Gateway is available under the ClearML Enterprise plan. ::: Services running through a cluster orchestrator such as Kubernetes or cloud hyperscaler require meticulous configuration diff --git a/docs/deploying_clearml/enterprise_deploy/appgw_install_compose.md b/docs/deploying_clearml/enterprise_deploy/appgw_install_compose.md index 70335045..91cc338a 100644 --- a/docs/deploying_clearml/enterprise_deploy/appgw_install_compose.md +++ b/docs/deploying_clearml/enterprise_deploy/appgw_install_compose.md @@ -1,4 +1,10 @@ -# Docker-Compose Deployment +--- +title: Docker-Compose Deployment +--- + +:::important Enterprise Feature +The Application Gateway is available under the ClearML Enterprise plan. +::: ## Requirements diff --git a/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md b/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md index 9eef033b..4274f844 100644 --- a/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md +++ b/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md @@ -1,4 +1,10 @@ -# Kubernetes Deployment +--- +title: Kubernetes Deployment +--- + +:::important Enterprise Feature +The Application Gateway is available under the ClearML Enterprise plan. +::: This guide details the installation of the ClearML AI Application Gateway, specifically the ClearML Task Router Component. 
@@ -6,8 +12,8 @@ This guide details the installation of the ClearML AI Application Gateway, speci * Kubernetes cluster: `>= 1.21.0-0 < 1.32.0-0` * Helm installed and configured -* Helm token to access allegroai helm-chart repo -* Credentials for allegroai docker repo +* Helm token to access the `allegroai` helm-chart repo +* Credentials for the `allegroai` docker repo * A valid ClearML Server installation ## Optional for HTTPS @@ -21,7 +27,7 @@ This guide details the installation of the ClearML AI Application Gateway, speci ``` helm repo add allegroai-enterprise \ -https://raw.githubusercontent.com/allegroai/clearml-enterprise-helm-charts/gh-pages \ +https://raw.githubusercontent.com/clearml/clearml-enterprise-helm-charts/gh-pages \ --username \ --password ``` diff --git a/docs/deploying_clearml/enterprise_deploy/change_artifact_links.md b/docs/deploying_clearml/enterprise_deploy/change_artifact_links.md index cbb100fd..34e67d5c 100644 --- a/docs/deploying_clearml/enterprise_deploy/change_artifact_links.md +++ b/docs/deploying_clearml/enterprise_deploy/change_artifact_links.md @@ -1,5 +1,5 @@ --- -title: Changing CleaML Artifacts Links +title: Changing ClearML Artifacts Links --- This guide describes how to update artifact references in the ClearML Enterprise server. diff --git a/docs/deploying_clearml/enterprise_deploy/custom_billing.md b/docs/deploying_clearml/enterprise_deploy/custom_billing.md new file mode 100644 index 00000000..484bff77 --- /dev/null +++ b/docs/deploying_clearml/enterprise_deploy/custom_billing.md @@ -0,0 +1,122 @@ +--- +title: Custom Billing Events +--- + +:::important Enterprise Feature +Sending custom billing events is available under the ClearML Enterprise plan. +::: + +ClearML supports sending custom events to selected Kafka topics. Event sending is triggered by API calls and +is available only for companies that have the `custom_events` settings set.
+ +## Enabling Custom Events in ClearML Server + +:::important Prerequisite +**Precondition**: Customer Kafka for custom events is installed and reachable from the `apiserver`. +::: + +Set the following environment variables in the ClearML Enterprise helm chart under `apiserver.extraEnv`: + +* Enable custom events: + + ``` + - name: CLEARML__services__custom_events__enabled + value: "true" + ``` +* Mount custom message template files into the `/mnt/custom_events/templates` folder in the `apiserver` container and point + the `apiserver` to it: + + ``` + - name: CLEARML__services__custom_events__template_folder + value: "/mnt/custom_events/templates" + ``` +* Configure the Kafka host for sending events: + + ``` + - name: CLEARML__hosts__kafka__custom_events__host + value: "[]" + ``` + Configure Kafka security parameters. Below is an example of SASL plaintext security: + + ``` + - name: CLEARML__SECURE__KAFKA__CUSTOM_EVENTS__security_protocol + value: "SASL_PLAINTEXT" + - name: CLEARML__SECURE__KAFKA__CUSTOM_EVENTS__sasl_mechanism + value: "SCRAM-SHA-512" + - name: CLEARML__SECURE__KAFKA__CUSTOM_EVENTS__sasl_plain_username + value: "" + - name: CLEARML__SECURE__KAFKA__CUSTOM_EVENTS__sasl_plain_password + value: "" + ``` +* Define Kafka topics for lifecycle and inventory messages: + + ``` + - name: CLEARML__services__custom_events__channels__main__topics__service_instance_lifecycle + value: "lifecycle" + - name: CLEARML__services__custom_events__channels__main__topics__service_instance_inventory + value: "inventory" + ``` +* For each desired company, set up the custom events properties required by the event message templates: + + ``` + curl $APISERVER_URL/system.update_company_custom_events_config -H "Content-Type: application/json" -u $APISERVER_KEY:$APISERVER_SECRET -d'{ + "company": "", + "fields": { + "service_instance_id": "", + "service_instance_name": "", + "service_instance_customer_tenant_name": "", + "service_instance_customer_space_name": "",
"service_instance_customer_space_id": "", + "parameters_connection_points": ["", ""] + }}' + ``` + +## Sending Custom Events to the API Server + +:::important Prerequisite +**Precondition:** A dedicated custom-events Redis instance is installed and reachable from all the custom events deployments. +::: + +Environment lifecycle events are sent directly by the `apiserver`. Other event types are emitted by the following helm charts: + +* `clearml-pods-monitor-exporter` - Monitors running pods and sends container lifecycle events (one instance should run per cluster, with a unique identifier; a UUID is required for the installation): + + ``` + # -- Universal Unique string to identify Pods Monitor instances across worker clusters. It cannot be empty. + # Uniqueness is required across different cluster installations to preserve the reported data status. + podsMonitorUUID: "" + # Interval + checkIntervalSeconds: 60 + ``` +* `clearml-pods-inventory` - Periodically sends inventory events about running pods. + + ``` + # Cron schedule - https://crontab.guru/ + cronJob: + schedule: "@daily" + ``` +* `clearml-company-inventory` - Monitors ClearML companies and sends environment inventory events.
+ + ``` + # Cron schedule - https://crontab.guru/ + cronJob: + schedule: "@daily" + ``` + +For every script chart, add the following configuration to enable Redis access and connection to the `apiserver`: + +``` +clearml: + apiServerUrlReference: "" + apiServerKey: "" + apiServerSecret: "" +redisConnection: + host: "" + port: + password: "" +``` + +See all other available options to customize the `custom-events` charts by running: +``` +helm show readme allegroai-enterprise/ +``` \ No newline at end of file diff --git a/docs/deploying_clearml/enterprise_deploy/import_projects.md b/docs/deploying_clearml/enterprise_deploy/import_projects.md index 324350bf..3a162644 100644 --- a/docs/deploying_clearml/enterprise_deploy/import_projects.md +++ b/docs/deploying_clearml/enterprise_deploy/import_projects.md @@ -1,5 +1,5 @@ --- -title: Exporting and Importing ClearML Projects +title: Project Migration --- When migrating from a ClearML Open Server to a ClearML Enterprise Server, you may need to transfer projects. This is done @@ -235,6 +235,6 @@ Note that this is not required if the new file server is replacing the old file exact address. Once the projects' data has been copied to the target server, and the projects themselves were imported, see -[Changing CleaML Artifacts Links](change_artifact_links.md) for information on how to fix the URLs. +[Changing ClearML Artifacts Links](change_artifact_links.md) for information on how to fix the URLs. diff --git a/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md b/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md index 9edd3457..ed129ab2 100644 --- a/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md +++ b/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md @@ -2,14 +2,21 @@ title: AWS EC2 AMIs --- -:::note -For upgrade purposes, the terms **Trains Server** and **ClearML Server** are interchangeable. -::: + + +MongoDB major version was upgraded from `v5.x` to `v6.x`.
Please note that if your current ClearML Server version is older than +`v1.17` (where MongoDB `v5.x` was first used), you'll need to first upgrade to ClearML Server v1.17, +following the procedure below and using [this `docker-compose` file](https://github.com/clearml/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose.yml). Once successfully upgraded, +you can proceed to upgrade to v2.x. + + + The sections below contain the steps to upgrade ClearML Server on the [same AWS instance](#upgrading-on-the-same-aws-instance), and to upgrade and migrate to a [new AWS instance](#upgrading-and-migrating-to-a-new-aws-instance). -### Upgrading on the Same AWS Instance +## Upgrading on the Same AWS Instance This section contains the steps to upgrade ClearML Server on the same AWS instance. @@ -42,7 +49,7 @@ If upgrading from Trains Server version 0.15 or older, a data migration is requi 1. Download the latest `docker-compose.yml` file. Execute the following command: ``` - sudo curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + sudo curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. @@ -52,7 +59,7 @@ If upgrading from Trains Server version 0.15 or older, a data migration is requi docker-compose -f docker-compose.yml up -d ``` -### Upgrading and Migrating to a New AWS Instance +## Upgrading and Migrating to a New AWS Instance This section contains the steps to upgrade ClearML Server on the new AWS instance. @@ -67,8 +74,9 @@ This section contains the steps to upgrade ClearML Server on the new AWS instanc 1.
On the old AWS instance, [backup your data](clearml_server_aws_ec2_ami.md#backing-up-and-restoring-data-and-configuration) and, if your configuration folder is not empty, backup your configuration. -1. If upgrading from ClearML Server version older than 1.2, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). - If upgrading from Trains Server version 0.15 or older, a data migration is required before continuing this upgrade. See instructions [here](clearml_server_es7_migration.md). +1. If upgrading from Trains Server version 0.15 or older, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_es7_migration.md). + +1. If upgrading from ClearML Server version 1.1 or older, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). 1. On the new AWS instance, [restore your data](clearml_server_aws_ec2_ami.md#backing-up-and-restoring-data-and-configuration) and, if the configuration folder is not empty, restore the configuration. diff --git a/docs/deploying_clearml/upgrade_server_gcp.md b/docs/deploying_clearml/upgrade_server_gcp.md index b352ef68..7739d82d 100644 --- a/docs/deploying_clearml/upgrade_server_gcp.md +++ b/docs/deploying_clearml/upgrade_server_gcp.md @@ -19,11 +19,13 @@ you can proceed to upgrade to v2.x. ``` docker-compose -f docker-compose.yml down ``` + +1. [Backing up data](clearml_server_gcp.md#backing-up-and-restoring-data-and-configuration) is recommended, and if the configuration folder is + not empty, backing up the configuration. 1. If upgrading from **Trains Server** version 0.15 or older to **ClearML Server**, do the following: - 1. Follow these [data migration instructions](clearml_server_es7_migration.md), - and then continue this upgrade. + 1. Follow these [data migration instructions](clearml_server_es7_migration.md). 1. 
Rename `/opt/trains` and its subdirectories to `/opt/clearml`: @@ -31,14 +33,12 @@ you can proceed to upgrade to v2.x. sudo mv /opt/trains /opt/clearml ``` -1. If upgrading from ClearML Server version older than 1.2, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). -1. [Backing up data](clearml_server_gcp.md#backing-up-and-restoring-data-and-configuration) is recommended, and if the configuration folder is - not empty, backing up the configuration. +1. If upgrading from ClearML Server version 1.1 or older, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). 1. Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. 
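The migration ordering that the upgrade steps above spell out (Elasticsearch 7 migration for Trains Server 0.15 or older, MongoDB 4.4 migration for ClearML Server 1.1 or older, and a stop at v1.17 before moving to v2.x) can be summarized as version-gated checks. A sketch, where the function name and step labels are ours for illustration and not part of ClearML:

```python
# Illustrative summary of the documented upgrade gates (labels are ours, not ClearML's):
# - Trains Server 0.15 or older: Elasticsearch 7 data migration first
# - ClearML Server 1.1 or older: MongoDB 4.4 data migration
# - ClearML Server older than 1.17: upgrade to v1.17 before moving to v2.x (MongoDB 5 -> 6)

def required_migrations(version: tuple) -> list:
    steps = []
    if version <= (0, 15):
        steps.append("es7_migration")
    if version <= (1, 1):
        steps.append("mongo44_migration")
    if version < (1, 17):
        steps.append("upgrade_to_v1_17_first")
    return steps

print(required_migrations((0, 15)))  # ['es7_migration', 'mongo44_migration', 'upgrade_to_v1_17_first']
print(required_migrations((1, 16)))  # ['upgrade_to_v1_17_first']
print(required_migrations((1, 17)))  # []
```

The list order matters: each earlier migration must complete before the next gate applies.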
diff --git a/docs/deploying_clearml/upgrade_server_kubernetes_helm.md b/docs/deploying_clearml/upgrade_server_kubernetes_helm.md index a2d2fb0a..7aa1b6e0 100644 --- a/docs/deploying_clearml/upgrade_server_kubernetes_helm.md +++ b/docs/deploying_clearml/upgrade_server_kubernetes_helm.md @@ -7,13 +7,13 @@ title: Kubernetes ```bash helm repo update -helm upgrade clearml allegroai/clearml +helm upgrade clearml clearml/clearml ``` **To change the values in an existing installation,** execute the following: ```bash -helm upgrade clearml allegroai/clearml --version <CHART_VERSION> -f custom_values.yaml +helm upgrade clearml clearml/clearml --version <CHART_VERSION> -f custom_values.yaml ``` See the [clearml-helm-charts repository](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml#local-environment) diff --git a/docs/deploying_clearml/upgrade_server_linux_mac.md b/docs/deploying_clearml/upgrade_server_linux_mac.md index 7be52d46..3a77d8a6 100644 --- a/docs/deploying_clearml/upgrade_server_linux_mac.md +++ b/docs/deploying_clearml/upgrade_server_linux_mac.md @@ -40,24 +40,26 @@ For backwards compatibility, the environment variables ``TRAINS_HOST_IP``, ``TRA ``` docker-compose -f docker-compose.yml down ``` - -1. If upgrading from **Trains Server** version 0.15 or older, a data migration is required before continuing this upgrade. See instructions [here](clearml_server_es7_migration.md). - -1. If upgrading from ClearML Server version older than 1.2, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). 1. [Backing up data](clearml_server_linux_mac.md#backing-up-and-restoring-data-and-configuration) is recommended and, if the configuration folder is not empty, backing up the configuration. + +1. If upgrading from **Trains Server** version 0.15 or older to **ClearML Server**, do the following: -1.
If upgrading from **Trains Server** to **ClearML Server**, rename `/opt/trains` and its subdirectories to `/opt/clearml`: + 1. Follow these [data migration instructions](clearml_server_es7_migration.md). + + 1. Rename `/opt/trains` and its subdirectories to `/opt/clearml`: + + ``` + sudo mv /opt/trains /opt/clearml + ``` - ``` - sudo mv /opt/trains /opt/clearml - ``` +1. If upgrading from ClearML Server version 1.1 or older, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). 1. Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build: diff --git a/docs/deploying_clearml/upgrade_server_win.md b/docs/deploying_clearml/upgrade_server_win.md index ab22f542..11f8690a 100644 --- a/docs/deploying_clearml/upgrade_server_win.md +++ b/docs/deploying_clearml/upgrade_server_win.md @@ -29,10 +29,7 @@ you can proceed to upgrade to v2.x. ``` docker-compose -f c:\opt\trains\docker-compose-win10.yml down ``` - -1. If upgrading from **Trains Server** version 0.15 or older, a data migration is required before continuing this upgrade. See instructions [here](clearml_server_es7_migration.md). -1. If upgrading from ClearML Server version older than 1.2, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). 1. Backing up data is recommended, and if the configuration folder is not empty, backing up the configuration. @@ -40,13 +37,19 @@ you can proceed to upgrade to v2.x. For example, if the configuration is in ``c:\opt\clearml``, then backup ``c:\opt\clearml\config`` and ``c:\opt\clearml\data``. 
Before restoring, remove the old artifacts in ``c:\opt\clearml\config`` and ``c:\opt\clearml\data``, and then restore. ::: - -1. If upgrading from **Trains Server** to **ClearML Server**, rename `/opt/trains` and its subdirectories to `/opt/clearml`. +1. If upgrading from **Trains Server** version 0.15 or older to **ClearML Server**, do the following: + + 1. Follow these [data migration instructions](clearml_server_es7_migration.md). + + 1. Rename `/opt/trains` and its subdirectories to `/opt/clearml`. + +1. If upgrading from ClearML Server version 1.1 or older, you need to migrate your data before upgrading your server. See instructions [here](clearml_server_mongo44_migration.md). + 1. Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. diff --git a/docs/getting_started/clearml_agent_docker_exec.md b/docs/getting_started/clearml_agent_docker_exec.md index c1c3eb30..6e15df1f 100644 --- a/docs/getting_started/clearml_agent_docker_exec.md +++ b/docs/getting_started/clearml_agent_docker_exec.md @@ -1,5 +1,5 @@ --- -title: Building Executable Task Containers +title: Building Executable Task Containers --- ## Exporting a Task into a Standalone Docker Container diff --git a/docs/getting_started/clearml_agent_scheduling.md b/docs/getting_started/clearml_agent_scheduling.md index 80d22df7..ed3c5948 100644 --- a/docs/getting_started/clearml_agent_scheduling.md +++ b/docs/getting_started/clearml_agent_scheduling.md @@ -3,7 +3,7 @@ title: Managing Agent Work Schedules --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. 
+Agent work schedule management is available under the ClearML Enterprise plan. ::: The Agent scheduler enables scheduling working hours for each Agent. During working hours, a worker will actively poll diff --git a/docs/getting_started/main.md b/docs/getting_started/main.md index bc1b3b17..12ab7727 100644 --- a/docs/getting_started/main.md +++ b/docs/getting_started/main.md @@ -32,19 +32,19 @@ training, and deploying models at every scale on any AI infrastructure. Step 1 - Experiment Management - + Open In Colab Step 2 - Remote Execution Agent Setup - + Open In Colab Step 3 - Remotely Execute Tasks - + Open In Colab diff --git a/docs/guides/clearml-task/clearml_task_tutorial.md b/docs/guides/clearml-task/clearml_task_tutorial.md index 99b86e0f..9f47b5df 100644 --- a/docs/guides/clearml-task/clearml_task_tutorial.md +++ b/docs/guides/clearml-task/clearml_task_tutorial.md @@ -49,7 +49,7 @@ Execution log at: https://app.clear.ml/projects/552d5399112d47029c146d5248570295 ### Executing a Local Script For this example, use a local version of [this script](https://github.com/clearml/events/blob/master/webinar-0620/keras_mnist.py). -1. Clone the [allegroai/events](https://github.com/clearml/events) repository +1. Clone the [clearml/events](https://github.com/clearml/events) repository 1. Go to the root folder of the cloned repository 1. Run the following command: diff --git a/docs/guides/ide/google_colab.md b/docs/guides/ide/google_colab.md index 49163696..dbee1a2d 100644 --- a/docs/guides/ide/google_colab.md +++ b/docs/guides/ide/google_colab.md @@ -16,7 +16,7 @@ and running, users can send Tasks to be executed on Google Colab's hardware. ## Steps -1. Open up [this Google Colab notebook](https://colab.research.google.com/github/allegroai/clearml/blob/master/examples/clearml_agent/clearml_colab_agent.ipynb). +1. Open up [this Google Colab notebook](https://colab.research.google.com/github/clearml/clearml/blob/master/examples/clearml_agent/clearml_colab_agent.ipynb). 1. 
Run the first cell, which installs all the necessary packages: ``` diff --git a/docs/guides/pipeline/pipeline_decorator.md b/docs/guides/pipeline/pipeline_decorator.md index 4c5a3213..0ae0a0a5 100644 --- a/docs/guides/pipeline/pipeline_decorator.md +++ b/docs/guides/pipeline/pipeline_decorator.md @@ -3,7 +3,7 @@ title: Pipeline from Decorators --- The [pipeline_from_decorator.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py) -example demonstrates the creation of a pipeline in ClearML using the [`PipelineDecorator`](../../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator) +example demonstrates the creation of a pipeline in ClearML using the [`PipelineDecorator`](../../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator) class. This example creates a pipeline incorporating four tasks, each of which is created from a Python function using a custom decorator: @@ -14,11 +14,11 @@ This example creates a pipeline incorporating four tasks, each of which is creat * `step_four` - Uses data from `step_two` and the model from `step_three` to make a prediction. The pipeline steps, defined in the `step_one`, `step_two`, `step_three`, and `step_four` functions, are each wrapped with the -[`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) +[`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) decorator, which creates a ClearML pipeline step for each one when the pipeline is executed. The logic that executes these steps and controls the interaction between them is implemented in the `executing_pipeline` -function. This function is wrapped with the [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +function. 
This function is wrapped with the [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) decorator which creates the ClearML pipeline task when it is executed. The sections below describe in more detail what happens in the pipeline controller and steps. @@ -28,7 +28,7 @@ The sections below describe in more detail what happens in the pipeline controll In this example, the pipeline controller is implemented by the `executing_pipeline` function. Using the `@PipelineDecorator.pipeline` decorator creates a ClearML Controller Task from the function when it is executed. -For detailed information, see [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline). +For detailed information, see [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline). In the example script, the controller defines the interactions between the pipeline steps in the following way: 1. The controller function passes its argument, `pickle_url`, to the pipeline's first step (`step_one`) @@ -39,13 +39,13 @@ In the example script, the controller defines the interactions between the pipel :::info Local Execution In this example, the pipeline is set to run in local mode by using -[`PipelineDecorator.run_locally()`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorrun_locally) +[`PipelineDecorator.run_locally()`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorrun_locally) before calling the pipeline function. See pipeline execution options [here](../../pipelines/pipelines_sdk_function_decorators.md#running-the-pipeline). ::: ## Pipeline Steps Using the `@PipelineDecorator.component` decorator will make the function a pipeline component that can be called from the -pipeline controller, which implements the pipeline's execution logic. 
For detailed information, see [`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent). +pipeline controller, which implements the pipeline's execution logic. For detailed information, see [`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent). When the pipeline controller calls a pipeline step, a corresponding ClearML task will be created. Notice that all package imports inside the function will be automatically logged as required packages for the pipeline execution step. @@ -63,7 +63,7 @@ executing_pipeline( ``` By default, the pipeline controller and the pipeline steps are launched through ClearML [queues](../../fundamentals/agents_and_queues.md#what-is-a-queue). -Use the [`PipelineDecorator.set_default_execution_queue`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue) +Use the [`PipelineDecorator.set_default_execution_queue`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue) method to specify the execution queue of all pipeline steps. The `execution_queue` parameter of the `@PipelineDecorator.component` decorator overrides the default queue value for the specific step for which it was specified. diff --git a/docs/guides/services/slack_alerts.md b/docs/guides/services/slack_alerts.md index e7bcb251..8a3bbd6a 100644 --- a/docs/guides/services/slack_alerts.md +++ b/docs/guides/services/slack_alerts.md @@ -22,7 +22,7 @@ The Slack API token and channel you create are required to configure the Slack a 1. In **Development Slack Workspace**, select a workspace. 1. Click **Create App**. 1. In **Basic Information**, under **Display Information**, complete the following: - - In **Short description**, enter "Allegro Train Bot". + - In **Short description**, enter "ClearML Train Bot". 
- In **Background color**, enter "#202432". 1. Click **Save Changes**. 1. In **OAuth & Permissions**, under **Scopes**, click **Add an OAuth Scope**, and then select the following permissions diff --git a/docs/img/gif/dataset_dark.gif b/docs/img/gif/dataset_dark.gif index 85486974..092b403b 100644 Binary files a/docs/img/gif/dataset_dark.gif and b/docs/img/gif/dataset_dark.gif differ diff --git a/docs/img/gif/integrations_yolov5.gif b/docs/img/gif/integrations_yolov5.gif index 0a0795bd..4c99ca63 100644 Binary files a/docs/img/gif/integrations_yolov5.gif and b/docs/img/gif/integrations_yolov5.gif differ diff --git a/docs/img/webapp_compare_exp_select_2.png b/docs/img/webapp_compare_exp_select_2.png index e1e561fd..46fd6aae 100644 Binary files a/docs/img/webapp_compare_exp_select_2.png and b/docs/img/webapp_compare_exp_select_2.png differ diff --git a/docs/img/webapp_compare_exp_select_2_dark.png b/docs/img/webapp_compare_exp_select_2_dark.png index d1a3b2e0..e03940fe 100644 Binary files a/docs/img/webapp_compare_exp_select_2_dark.png and b/docs/img/webapp_compare_exp_select_2_dark.png differ diff --git a/docs/pipelines/pipelines_sdk_function_decorators.md b/docs/pipelines/pipelines_sdk_function_decorators.md index c6ef6d8f..97f43e75 100644 --- a/docs/pipelines/pipelines_sdk_function_decorators.md +++ b/docs/pipelines/pipelines_sdk_function_decorators.md @@ -4,14 +4,14 @@ title: PipelineDecorator ## Creating Pipelines Using Function Decorators -Use the [`PipelineDecorator`](../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator) -class to create pipelines from your existing functions. 
Use [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) -to denote functions that comprise the steps of your pipeline, and [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +Use the [`PipelineDecorator`](../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator) +class to create pipelines from your existing functions. Use [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) +to denote functions that comprise the steps of your pipeline, and [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) for your main pipeline execution logic function. ## @PipelineDecorator.pipeline -Using the [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +Using the [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) decorator transforms the function which implements your pipeline's execution logic to a ClearML pipeline controller, an independently executed task. @@ -70,13 +70,13 @@ parameters. When launching a new pipeline run from the [UI](../webapp/pipelines/ ![Pipeline new run](../img/pipelines_new_run.png) ## @PipelineDecorator.component -Using the [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) +Using the [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) decorator transforms a function into a ClearML pipeline step when called from a pipeline controller. When the pipeline controller calls a pipeline step, a corresponding ClearML task is created. 
:::tip Package Imports
-In the case that the `skip_global_imports` parameter of [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline)
+If the `skip_global_imports` parameter of [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline)
is set to `False`, all global imports will be automatically imported at the beginning of each step's execution.
Otherwise, if set to `True`, make sure that each function which makes up a pipeline step contains package imports,
which are automatically logged as required packages for the pipeline execution step.
@@ -110,7 +110,7 @@ def step_one(pickle_data_url: str, extra: int = 43):
* `packages` - A list of required packages or a local requirements.txt file. Example: `["tqdm>=2.1", "scikit-learn"]` or
  `"./requirements.txt"`. If not provided, packages are automatically added based on the imports used inside the function.
* `execution_queue` (optional) - Queue in which to enqueue the specific step. This overrides the queue set with the
-  [`PipelineDecorator.set_default_execution_queue method`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue)
+  [`PipelineDecorator.set_default_execution_queue`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue)
See `Task.init`'s [`auto_connect_arg_parser` parameter](../references/sdk/task.md#taskinit) You can also directly upload a model or an artifact from the step to the pipeline controller, using the -[`PipelineDecorator.upload_model`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorupload_model) -and [`PipelineDecorator.upload_artifact`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorupload_artifact) +[`PipelineDecorator.upload_model`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorupload_model) +and [`PipelineDecorator.upload_artifact`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorupload_artifact) methods respectively. ## Controlling Pipeline Execution ### Default Execution Queue -The [`PipelineDecorator.set_default_execution_queue`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue) +The [`PipelineDecorator.set_default_execution_queue`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue) method lets you set a default queue through which all pipeline steps will be executed. Once set, step-specific overrides can be specified through the `@PipelineDecorator.component` decorator. @@ -226,7 +226,7 @@ You can run the pipeline logic locally, while keeping the pipeline components ex #### Debugging Mode In debugging mode, the pipeline controller and all components are treated as regular Python functions, with components called synchronously. This mode is great to debug the components and design the pipeline as the entire pipeline is -executed on the developer machine with full ability to debug each function call. Call [`PipelineDecorator.debug_pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratordebug_pipeline) +executed on the developer machine with full ability to debug each function call. 
Call [`PipelineDecorator.debug_pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratordebug_pipeline) before the main pipeline logic function call. Example: @@ -242,7 +242,7 @@ In local mode, the pipeline controller creates Tasks for each component, and com into sub-processes running on the same machine. Notice that the data is passed between the components and the logic with the exact same mechanism as in the remote mode (i.e. hyperparameters / artifacts), with the exception that the execution itself is local. Notice that each subprocess is using the exact same Python environment as the main pipeline logic. Call -[`PipelineDecorator.run_locally`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorrun_locally) +[`PipelineDecorator.run_locally`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorrun_locally) before the main pipeline logic function. Example: diff --git a/docs/references/sdk/automation_controller_pipelinedecorator.md b/docs/references/sdk/automation_controller_pipelinedecorator.md new file mode 100644 index 00000000..0545db42 --- /dev/null +++ b/docs/references/sdk/automation_controller_pipelinedecorator.md @@ -0,0 +1,5 @@ +--- +title: PipelineDecorator +--- + +**AutoGenerated PlaceHolder** \ No newline at end of file diff --git a/docs/user_management/identity_providers.md b/docs/user_management/identity_providers.md index fd7ec5dc..ba01c982 100644 --- a/docs/user_management/identity_providers.md +++ b/docs/user_management/identity_providers.md @@ -3,7 +3,7 @@ title: Identity Providers --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Identity provider integration is available under the ClearML Enterprise plan. 
::: Administrators can seamlessly connect ClearML with their identity service providers to easily implement single sign-on diff --git a/docs/webapp/applications/apps_aws_autoscaler.md b/docs/webapp/applications/apps_aws_autoscaler.md index cfab329f..3068db42 100644 --- a/docs/webapp/applications/apps_aws_autoscaler.md +++ b/docs/webapp/applications/apps_aws_autoscaler.md @@ -319,17 +319,10 @@ to an IAM user, and create credentials keys for that user to configure in the au "ssm:GetParameters", "ssm:GetParameter" ], - "Resource": "arn:aws:ssm:*::parameter/aws/service/marketplace/*" - }, - { - "Sid": "AllowUsingDeeplearningAMIAliases", - "Effect": "Allow", - "Action": [ - "ssm:GetParametersByPath", - "ssm:GetParameters", - "ssm:GetParameter" - ], - "Resource": "arn:aws:ssm:*::parameter/aws/service/deeplearning/*" + "Resource": [ + "arn:aws:ssm:*::parameter/aws/service/marketplace/*", + "arn:aws:ssm:*::parameter/aws/service/deeplearning/*" + ] } ] } diff --git a/docs/webapp/pipelines/webapp_pipeline_table.md b/docs/webapp/pipelines/webapp_pipeline_table.md index cc252d93..114154f7 100644 --- a/docs/webapp/pipelines/webapp_pipeline_table.md +++ b/docs/webapp/pipelines/webapp_pipeline_table.md @@ -36,7 +36,7 @@ The pipeline run table contains the following columns: | Column | Description | Type | |---|---|---| | **RUN** | Pipeline run identifier | String | -| **VERSION** | The pipeline version number. Corresponds to the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md#class-pipelinecontroller)'s and [PipelineDecorator](../../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator)'s `version` parameter | Version string | +| **VERSION** | The pipeline version number. 
Corresponds to the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md#class-pipelinecontroller)'s and [PipelineDecorator](../../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator)'s `version` parameter | Version string | | **TAGS** | Descriptive, user-defined, color-coded tags assigned to run. | Tag | | **STATUS** | Pipeline run's status. See a list of the [task states and state transitions](../../fundamentals/task.md#task-states). For Running, Failed, and Aborted runs, you will also see a progress indicator next to the status. See [here](../../pipelines/pipelines.md#tracking-pipeline-progress). | String | | **USER** | User who created the run. | String | diff --git a/docs/webapp/pipelines/webapp_pipeline_viewing.md b/docs/webapp/pipelines/webapp_pipeline_viewing.md index b16b2370..6868c063 100644 --- a/docs/webapp/pipelines/webapp_pipeline_viewing.md +++ b/docs/webapp/pipelines/webapp_pipeline_viewing.md @@ -108,7 +108,7 @@ The details panel includes three tabs: ![console](../../img/webapp_pipeline_step_console_dark.png#dark-mode-only) * **Code** - For pipeline steps generated from functions using either [`PipelineController.add_function_step`](../../references/sdk/automation_controller_pipelinecontroller.md#add_function_step) -or [`PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent), +or [`PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent), you can view the selected step's code. 
![code](../../img/webapp_pipeline_step_code.png#light-mode-only) diff --git a/docs/webapp/resource_policies.md b/docs/webapp/resource_policies.md index c90a3a74..1a22a2bb 100644 --- a/docs/webapp/resource_policies.md +++ b/docs/webapp/resource_policies.md @@ -3,7 +3,7 @@ title: Resource Policies --- :::important ENTERPRISE FEATURE -This feature is available under the ClearML Enterprise plan. +Resource Policies are available under the ClearML Enterprise plan. ::: diff --git a/docs/webapp/settings/webapp_settings_access_rules.md b/docs/webapp/settings/webapp_settings_access_rules.md index d9576e04..c2978c10 100644 --- a/docs/webapp/settings/webapp_settings_access_rules.md +++ b/docs/webapp/settings/webapp_settings_access_rules.md @@ -3,7 +3,7 @@ title: Access Rules --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Access rules are available under the ClearML Enterprise plan. ::: Workspace administrators can use the **Access Rules** page to manage workspace permissions, by specifying which users, diff --git a/docs/webapp/settings/webapp_settings_admin_vaults.md b/docs/webapp/settings/webapp_settings_admin_vaults.md index e67b6954..9ead42a7 100644 --- a/docs/webapp/settings/webapp_settings_admin_vaults.md +++ b/docs/webapp/settings/webapp_settings_admin_vaults.md @@ -3,7 +3,7 @@ title: Administrator Vaults --- :::info Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Administrator vaults are available under the ClearML Enterprise plan. 
::: Administrators can define multiple [configuration vaults](webapp_settings_profile.md#configuration-vault) which will each be applied to designated diff --git a/docs/webapp/settings/webapp_settings_id_providers.md b/docs/webapp/settings/webapp_settings_id_providers.md index 47c6cf34..6942c06c 100644 --- a/docs/webapp/settings/webapp_settings_id_providers.md +++ b/docs/webapp/settings/webapp_settings_id_providers.md @@ -3,7 +3,7 @@ title: Identity Providers --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Identity provider integration is available under the ClearML Enterprise plan. ::: Administrators can connect identity service providers to the server: configure an identity connection, which allows diff --git a/docs/webapp/settings/webapp_settings_profile.md b/docs/webapp/settings/webapp_settings_profile.md index 38c0f00b..e7e92d8e 100644 --- a/docs/webapp/settings/webapp_settings_profile.md +++ b/docs/webapp/settings/webapp_settings_profile.md @@ -100,7 +100,7 @@ these credentials cannot be recovered. ### AI Application Gateway Tokens :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +The AI Application Gateway is available under the ClearML Enterprise plan. ::: The AI Application Gateway enables external access to ClearML tasks and applications. The gateway is configured with an @@ -146,7 +146,7 @@ in that workspace. You can rejoin the workspace only if you are re-invited. ### Configuration Vault :::info Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Configuration vaults are available under the ClearML Enterprise plan. 
::: Use the configuration vault to store global ClearML configuration entries that can extend the ClearML [configuration file](../../configs/clearml_conf.md) diff --git a/docs/webapp/settings/webapp_settings_users.md b/docs/webapp/settings/webapp_settings_users.md index dc0cd40f..22d07379 100644 --- a/docs/webapp/settings/webapp_settings_users.md +++ b/docs/webapp/settings/webapp_settings_users.md @@ -42,7 +42,7 @@ user can only rejoin your workspace when you re-invite them. ## Service Accounts :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +Service accounts are available under the ClearML Enterprise plan. ::: Service accounts are ClearML users that provide services with access to the ClearML API, but not the @@ -155,7 +155,7 @@ To delete a service account: ## User Groups :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan, as part of the [Access Rules](webapp_settings_access_rules.md) +User groups are available under the ClearML Enterprise plan, as part of the [Access Rules](webapp_settings_access_rules.md) feature. ::: diff --git a/docs/webapp/webapp_exp_track_visual.md b/docs/webapp/webapp_exp_track_visual.md index 8d8ef485..14f3fbe2 100644 --- a/docs/webapp/webapp_exp_track_visual.md +++ b/docs/webapp/webapp_exp_track_visual.md @@ -93,7 +93,7 @@ using to set up an environment (`pip` or `conda`) are available. 
Select which re ### Container The Container section list the following information: -* Image - a pre-configured container that ClearML Agent will use to remotely execute this task (see [Building Docker containers](../getting_started/clearml_agent_docker_exec.md)) +* Image - a pre-configured container that ClearML Agent will use to remotely execute this task (see [Building Task Execution Environments in a Container](../getting_started/clearml_agent_base_docker.md)) * Arguments - add container arguments * Setup shell script - a bash script to be executed inside the container before setting up the task's environment @@ -230,13 +230,13 @@ The **INFO** tab shows extended task information: * [Task description](#description) * [Task details](#task-details) -### Latest Events Log +### Latest Events Log -:::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +:::info Hosted Service and Enterprise Feature +The latest events log is available only on the ClearML Hosted Service and under the ClearML Enterprise plan. ::: -The Enterprise Server also displays a detailed history of task activity: +The **INFO** tab includes a detailed history of task activity: * Task action (e.g. status changes, project move, etc.) * Action time * Acting user @@ -252,7 +252,7 @@ To download the task history as a CSV file, hover over the log and click Graph view) shows scalar series plotted as a time series line chart. By default, a single plot is shown for each scalar metric, with all variants overlaid within. diff --git a/docs/webapp/webapp_exp_tuning.md b/docs/webapp/webapp_exp_tuning.md index b63dc423..d78ded27 100644 --- a/docs/webapp/webapp_exp_tuning.md +++ b/docs/webapp/webapp_exp_tuning.md @@ -72,7 +72,7 @@ and/or Reset functions. 
#### Default Container -Select a pre-configured container that the [ClearML Agent](../clearml_agent.md) will use to remotely execute this task (see [Building Docker containers](../getting_started/clearml_agent_docker_exec.md)). +Select a pre-configured container that the [ClearML Agent](../clearml_agent.md) will use to remotely execute this task (see [Building Task Execution Environments in a Container](../getting_started/clearml_agent_base_docker.md)). **To add, change, or delete a default container:** diff --git a/docs/webapp/webapp_orchestration_dash.md b/docs/webapp/webapp_orchestration_dash.md index 2c8059b3..4848cf18 100644 --- a/docs/webapp/webapp_orchestration_dash.md +++ b/docs/webapp/webapp_orchestration_dash.md @@ -3,7 +3,7 @@ title: Orchestration Dashboard --- :::important Enterprise Feature -This feature is available under the ClearML Enterprise plan. +The Orchestration Dashboard is available under the ClearML Enterprise plan. ::: Use the orchestration dashboard to monitor all of your available and in-use compute resources: diff --git a/docs/webapp/webapp_reports.md b/docs/webapp/webapp_reports.md index a2766916..e84057c6 100644 --- a/docs/webapp/webapp_reports.md +++ b/docs/webapp/webapp_reports.md @@ -424,22 +424,22 @@ To add an image, add an exclamation point, followed by the alt text enclosed by image enclosed in parentheses: ``` -![Logo](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg) +![Logo](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg) ``` The rendered output should look like this: -![Logo](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg) +![Logo](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg) To add a title to the image, which you can see in a tooltip when hovering over the image, add the title after the image's link: ``` -![With 
title](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg "ClearML logo") +![With title](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg "ClearML logo") ``` The rendered output should look like this: -Logo with Title +Logo with Title Hover over the image to see its title. diff --git a/docusaurus.config.js b/docusaurus.config.js index 9daa5d95..da24b73b 100644 --- a/docusaurus.config.js +++ b/docusaurus.config.js @@ -114,7 +114,7 @@ module.exports = { { label: 'References', to: '/docs/references/sdk/task', - activeBaseRegex: '^/docs/latest/docs/(references/|webapp/.*|hyperdatasets/webapp/.*|clearml_agent/(clearml_agent_ref|clearml_agent_env_var)|configs/(clearml_conf|env_vars)|apps/(clearml_task|clearml_param_search))(/.*)?$', + activeBaseRegex: '^/docs/latest/docs/(references/.*|webapp/.*|hyperdatasets/webapp/.*|clearml_agent/(clearml_agent_ref|clearml_agent_env_var)|configs/(clearml_conf|env_vars)|apps/(clearml_task|clearml_param_search))(/.*)?$', }, { label: 'Best Practices', @@ -127,7 +127,7 @@ module.exports = { activeBaseRegex: '^/docs/latest/docs/guides', }, { - label: 'Integrations', + label: 'Code Integrations', to: '/docs/integrations', activeBaseRegex: '^/docs/latest/docs/integrations(?!/storage)', }, diff --git a/sidebars.js b/sidebars.js index cbbd6289..0674d6a0 100644 --- a/sidebars.js +++ b/sidebars.js @@ -399,8 +399,10 @@ module.exports = { 'references/sdk/dataset', {'Pipeline': [ 'references/sdk/automation_controller_pipelinecontroller', + 'references/sdk/automation_controller_pipelinedecorator', 'references/sdk/automation_job_clearmljob' - ]}, + ] + }, 'references/sdk/scheduler', 'references/sdk/trigger', {'HyperParameter Optimization': [ @@ -635,11 +637,19 @@ module.exports = { 'getting_started/architecture', ]},*/ { - 'Enterprise Server Deployment': [ - 'deploying_clearml/enterprise_deploy/multi_tenant_k8s', - 'deploying_clearml/enterprise_deploy/vpc_aws', - 
'deploying_clearml/enterprise_deploy/on_prem_ubuntu', - ] + 'Enterprise Server': { + 'Deployment Options': [ + 'deploying_clearml/enterprise_deploy/multi_tenant_k8s', + 'deploying_clearml/enterprise_deploy/vpc_aws', + 'deploying_clearml/enterprise_deploy/on_prem_ubuntu', + ], + 'Maintenance': [ + 'deploying_clearml/enterprise_deploy/import_projects', + 'deploying_clearml/enterprise_deploy/change_artifact_links', + 'deploying_clearml/enterprise_deploy/delete_tenant', + ] + + } }, { type: 'category', @@ -651,11 +661,9 @@ module.exports = { 'deploying_clearml/enterprise_deploy/appgw_install_k8s', ] }, - 'deploying_clearml/enterprise_deploy/delete_tenant', - 'deploying_clearml/enterprise_deploy/import_projects', - 'deploying_clearml/enterprise_deploy/change_artifact_links', + 'deploying_clearml/enterprise_deploy/custom_billing', { - 'Enterprise Applications': [ + 'UI Applications': [ 'deploying_clearml/enterprise_deploy/app_install_ubuntu_on_prem', 'deploying_clearml/enterprise_deploy/app_install_ex_server', 'deploying_clearml/enterprise_deploy/app_custom',