From ec8b229fadb694d77437ebfc2cb1e5d09c898cf6 Mon Sep 17 00:00:00 2001 From: pollfly <75068813+pollfly@users.noreply.github.com> Date: Sun, 2 Mar 2025 11:17:33 +0200 Subject: [PATCH 1/3] Remove Enterprise admonition from "Latest events log" (#1065) --- docs/webapp/webapp_exp_track_visual.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/webapp/webapp_exp_track_visual.md b/docs/webapp/webapp_exp_track_visual.md index a216e54a..c2396e3b 100644 --- a/docs/webapp/webapp_exp_track_visual.md +++ b/docs/webapp/webapp_exp_track_visual.md @@ -230,13 +230,13 @@ The **INFO** tab shows extended task information: * [Task description](#description) * [Task details](#task-details) -### Latest Events Log +### Latest Events Log -:::important Enterprise Feature -The latest events log is available under the ClearML Enterprise plan. +:::info Hosted Service and Enterprise Feature +The latest events log is available only on the ClearML Hosted Service and under the ClearML Enterprise plan. ::: -The Enterprise Server also displays a detailed history of task activity: +The **INFO** tab includes a detailed history of task activity: * Task action (e.g. status changes, project move, etc.) * Action time * Acting user From 3829d1f4b771c812ec5b7514e1b31e3b25a5aca4 Mon Sep 17 00:00:00 2001 From: pollfly <75068813+pollfly@users.noreply.github.com> Date: Sun, 2 Mar 2025 11:54:05 +0200 Subject: [PATCH 2/3] Small edits (#1067) --- docs/deploying_clearml/clearml_server_es7_migration.md | 2 +- docs/deploying_clearml/clearml_server_linux_mac.md | 2 +- docs/deploying_clearml/clearml_server_win.md | 2 +- .../enterprise_deploy/appgw_install_k8s.md | 6 +++--- docs/deploying_clearml/upgrade_server_aws_ec2_ami.md | 2 +- docs/deploying_clearml/upgrade_server_gcp.md | 2 +- docs/deploying_clearml/upgrade_server_kubernetes_helm.md | 4 ++-- docs/deploying_clearml/upgrade_server_linux_mac.md | 2 +- docs/deploying_clearml/upgrade_server_win.md | 2 +- docs/getting_started/main.md | 6 +++--- docs/guides/clearml-task/clearml_task_tutorial.md | 2 +- docs/guides/ide/google_colab.md | 2 +- docs/guides/services/slack_alerts.md | 2 +- docs/webapp/webapp_exp_track_visual.md | 4 ++-- docs/webapp/webapp_reports.md | 8 ++++---- 15 files changed, 24 insertions(+), 24 deletions(-) diff --git a/docs/deploying_clearml/clearml_server_es7_migration.md b/docs/deploying_clearml/clearml_server_es7_migration.md index abc6b671..126b222b 100644 --- a/docs/deploying_clearml/clearml_server_es7_migration.md +++ b/docs/deploying_clearml/clearml_server_es7_migration.md @@ -129,7 +129,7 @@ and ClearML Server needs to be installed. 1. Add the `clearml-server` repository to Helm client. ``` - helm repo add allegroai https://allegroai.github.io/clearml-server-helm/ + helm repo add clearml https://clearml.github.io/clearml-server-helm/ ``` Confirm the `clearml-server` repository is now in the Helm client. diff --git a/docs/deploying_clearml/clearml_server_linux_mac.md b/docs/deploying_clearml/clearml_server_linux_mac.md index 9509c748..52b25658 100644 --- a/docs/deploying_clearml/clearml_server_linux_mac.md +++ b/docs/deploying_clearml/clearml_server_linux_mac.md @@ -136,7 +136,7 @@ Deploying the server requires a minimum of 8 GB of memory, 16 GB is recommended. 2. Download the ClearML Server docker-compose YAML file. 
``` - sudo curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + sudo curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. For Linux only, configure the **ClearML Agent Services**: diff --git a/docs/deploying_clearml/clearml_server_win.md b/docs/deploying_clearml/clearml_server_win.md index f3a54e20..5cf0e768 100644 --- a/docs/deploying_clearml/clearml_server_win.md +++ b/docs/deploying_clearml/clearml_server_win.md @@ -57,7 +57,7 @@ Deploying the server requires a minimum of 8 GB of memory, 16 GB is recommended. 1. Save the ClearML Server docker-compose YAML file. ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml ``` 1. Run `docker-compose`. In PowerShell, execute the following commands: diff --git a/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md b/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md index 945a31cb..4274f844 100644 --- a/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md +++ b/docs/deploying_clearml/enterprise_deploy/appgw_install_k8s.md @@ -12,8 +12,8 @@ This guide details the installation of the ClearML AI Application Gateway, speci * Kubernetes cluster: `>= 1.21.0-0 < 1.32.0-0` * Helm installed and configured -* Helm token to access allegroai helm-chart repo -* Credentials for allegroai docker repo +* Helm token to access `allegroai` helm-chart repo +* Credentials for `allegroai` docker repo * A valid ClearML Server installation ## Optional for HTTPS @@ -27,7 +27,7 @@ This guide details the installation of the ClearML AI Application Gateway, speci ``` helm repo add allegroai-enterprise \ -https://raw.githubusercontent.com/allegroai/clearml-enterprise-helm-charts/gh-pages \ +https://raw.githubusercontent.com/clearml/clearml-enterprise-helm-charts/gh-pages \ --username \ --password ``` diff --git a/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md b/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md index d962e3cb..ed129ab2 100644 --- a/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md +++ b/docs/deploying_clearml/upgrade_server_aws_ec2_ami.md @@ -49,7 +49,7 @@ If upgrading from Trains Server version 0.15 or older, a data migration is requi 1. Download the latest `docker-compose.yml` file. Execute the following command: ``` - sudo curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + sudo curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. diff --git a/docs/deploying_clearml/upgrade_server_gcp.md b/docs/deploying_clearml/upgrade_server_gcp.md index a3d5c020..7739d82d 100644 --- a/docs/deploying_clearml/upgrade_server_gcp.md +++ b/docs/deploying_clearml/upgrade_server_gcp.md @@ -38,7 +38,7 @@ you can proceed to upgrade to v2.x. 1. 
Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. diff --git a/docs/deploying_clearml/upgrade_server_kubernetes_helm.md b/docs/deploying_clearml/upgrade_server_kubernetes_helm.md index a2d2fb0a..7aa1b6e0 100644 --- a/docs/deploying_clearml/upgrade_server_kubernetes_helm.md +++ b/docs/deploying_clearml/upgrade_server_kubernetes_helm.md @@ -7,13 +7,13 @@ title: Kubernetes ```bash helm repo update -helm upgrade clearml allegroai/clearml +helm upgrade clearml clearml/clearml ``` **To change the values in an existing installation,** execute the following: ```bash -helm upgrade clearml allegroai/clearml --version -f custom_values.yaml +helm upgrade clearml clearml/clearml --version -f custom_values.yaml ``` See the [clearml-helm-charts repository](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml#local-environment) diff --git a/docs/deploying_clearml/upgrade_server_linux_mac.md b/docs/deploying_clearml/upgrade_server_linux_mac.md index 78f87ac5..3a77d8a6 100644 --- a/docs/deploying_clearml/upgrade_server_linux_mac.md +++ b/docs/deploying_clearml/upgrade_server_linux_mac.md @@ -59,7 +59,7 @@ For backwards compatibility, the environment variables ``TRAINS_HOST_IP``, ``TRA 1. Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose.yml -o /opt/clearml/docker-compose.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build: diff --git a/docs/deploying_clearml/upgrade_server_win.md b/docs/deploying_clearml/upgrade_server_win.md index 20350cd5..11f8690a 100644 --- a/docs/deploying_clearml/upgrade_server_win.md +++ b/docs/deploying_clearml/upgrade_server_win.md @@ -49,7 +49,7 @@ you can proceed to upgrade to v2.x. 1. Download the latest `docker-compose.yml` file: ``` - curl https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml + curl https://raw.githubusercontent.com/clearml/clearml-server/master/docker/docker-compose-win10.yml -o c:\opt\clearml\docker-compose-win10.yml ``` 1. Startup ClearML Server. This automatically pulls the latest ClearML Server build. diff --git a/docs/getting_started/main.md b/docs/getting_started/main.md index bc1b3b17..12ab7727 100644 --- a/docs/getting_started/main.md +++ b/docs/getting_started/main.md @@ -32,19 +32,19 @@ training, and deploying models at every scale on any AI infrastructure. 
Step 1 - Experiment Management - + Open In Colab Step 2 - Remote Execution Agent Setup - + Open In Colab Step 3 - Remotely Execute Tasks - + Open In Colab diff --git a/docs/guides/clearml-task/clearml_task_tutorial.md b/docs/guides/clearml-task/clearml_task_tutorial.md index 99b86e0f..9f47b5df 100644 --- a/docs/guides/clearml-task/clearml_task_tutorial.md +++ b/docs/guides/clearml-task/clearml_task_tutorial.md @@ -49,7 +49,7 @@ Execution log at: https://app.clear.ml/projects/552d5399112d47029c146d5248570295 ### Executing a Local Script For this example, use a local version of [this script](https://github.com/clearml/events/blob/master/webinar-0620/keras_mnist.py). -1. Clone the [allegroai/events](https://github.com/clearml/events) repository +1. Clone the [clearml/events](https://github.com/clearml/events) repository 1. Go to the root folder of the cloned repository 1. Run the following command: diff --git a/docs/guides/ide/google_colab.md b/docs/guides/ide/google_colab.md index 49163696..dbee1a2d 100644 --- a/docs/guides/ide/google_colab.md +++ b/docs/guides/ide/google_colab.md @@ -16,7 +16,7 @@ and running, users can send Tasks to be executed on Google Colab's hardware. ## Steps -1. Open up [this Google Colab notebook](https://colab.research.google.com/github/allegroai/clearml/blob/master/examples/clearml_agent/clearml_colab_agent.ipynb). +1. Open up [this Google Colab notebook](https://colab.research.google.com/github/clearml/clearml/blob/master/examples/clearml_agent/clearml_colab_agent.ipynb). 1. Run the first cell, which installs all the necessary packages: ``` diff --git a/docs/guides/services/slack_alerts.md b/docs/guides/services/slack_alerts.md index e7bcb251..8a3bbd6a 100644 --- a/docs/guides/services/slack_alerts.md +++ b/docs/guides/services/slack_alerts.md @@ -22,7 +22,7 @@ The Slack API token and channel you create are required to configure the Slack a 1. In **Development Slack Workspace**, select a workspace. 1. Click **Create App**. 1. In **Basic Information**, under **Display Information**, complete the following: - - In **Short description**, enter "Allegro Train Bot". + - In **Short description**, enter "ClearML Train Bot". - In **Background color**, enter "#202432". 1. Click **Save Changes**. 1. In **OAuth & Permissions**, under **Scopes**, click **Add an OAuth Scope**, and then select the following permissions diff --git a/docs/webapp/webapp_exp_track_visual.md b/docs/webapp/webapp_exp_track_visual.md index c2396e3b..14f3fbe2 100644 --- a/docs/webapp/webapp_exp_track_visual.md +++ b/docs/webapp/webapp_exp_track_visual.md @@ -252,7 +252,7 @@ To download the task history as a CSV file, hover over the log and click Graph view) shows scalar series plotted as a time series line chart. By default, a single plot is shown for each scalar metric, with all variants overlaid within. 
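As a quick reference for how scalar series like these reach the SCALARS tab, here is a minimal sketch using the ClearML logger (the project, task, metric names, and values below are illustrative, not taken from the docs above):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="scalar reporting sketch")
logger = task.get_logger()

# Each (title, series) pair becomes a variant overlaid on the "loss" plot
for i in range(10):
    logger.report_scalar(title="loss", series="train", value=1.0 / (i + 1), iteration=i)
    logger.report_scalar(title="loss", series="validation", value=1.5 / (i + 1), iteration=i)

task.close()
```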
diff --git a/docs/webapp/webapp_reports.md b/docs/webapp/webapp_reports.md index a2766916..e84057c6 100644 --- a/docs/webapp/webapp_reports.md +++ b/docs/webapp/webapp_reports.md @@ -424,22 +424,22 @@ To add an image, add an exclamation point, followed by the alt text enclosed by image enclosed in parentheses: ``` -![Logo](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg) +![Logo](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg) ``` The rendered output should look like this: -![Logo](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg) +![Logo](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg) To add a title to the image, which you can see in a tooltip when hovering over the image, add the title after the image's link: ``` -![With title](https://raw.githubusercontent.com/allegroai/clearml/master/docs/clearml-logo.svg "ClearML logo") +![With title](https://raw.githubusercontent.com/clearml/clearml/master/docs/clearml-logo.svg "ClearML logo") ``` The rendered output should look like this: -Logo with Title +Logo with Title Hover over the image to see its title. From 6f8d8601f079ae0f2a243e07468e85a87c3d4315 Mon Sep 17 00:00:00 2001 From: pollfly <75068813+pollfly@users.noreply.github.com> Date: Sun, 2 Mar 2025 11:54:30 +0200 Subject: [PATCH 3/3] Split PipelineController and PipelineDecorator reference pages (#1063) --- docs/guides/pipeline/pipeline_decorator.md | 14 +++++------ .../pipelines_sdk_function_decorators.md | 24 +++++++++---------- ...automation_controller_pipelinedecorator.md | 5 ++++ .../webapp/pipelines/webapp_pipeline_table.md | 2 +- .../pipelines/webapp_pipeline_viewing.md | 2 +- sidebars.js | 4 +++- 6 files changed, 29 insertions(+), 22 deletions(-) create mode 100644 docs/references/sdk/automation_controller_pipelinedecorator.md diff --git a/docs/guides/pipeline/pipeline_decorator.md b/docs/guides/pipeline/pipeline_decorator.md index 4c5a3213..0ae0a0a5 100644 --- a/docs/guides/pipeline/pipeline_decorator.md +++ b/docs/guides/pipeline/pipeline_decorator.md @@ -3,7 +3,7 @@ title: Pipeline from Decorators --- The [pipeline_from_decorator.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py) -example demonstrates the creation of a pipeline in ClearML using the [`PipelineDecorator`](../../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator) +example demonstrates the creation of a pipeline in ClearML using the [`PipelineDecorator`](../../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator) class. This example creates a pipeline incorporating four tasks, each of which is created from a Python function using a custom decorator: @@ -14,11 +14,11 @@ This example creates a pipeline incorporating four tasks, each of which is creat * `step_four` - Uses data from `step_two` and the model from `step_three` to make a prediction. The pipeline steps, defined in the `step_one`, `step_two`, `step_three`, and `step_four` functions, are each wrapped with the -[`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) +[`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) decorator, which creates a ClearML pipeline step for each one when the pipeline is executed. 
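As an illustration of that wrapping (a sketch, not the script's actual step bodies), a decorated step can look like this:

```python
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["total"], cache=True)
def add_numbers(a: int, b: int):
    # Packages imported inside the function are logged as this step's requirements
    total = a + b
    return total
```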
The logic that executes these steps and controls the interaction between them is implemented in the `executing_pipeline` -function. This function is wrapped with the [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +function. This function is wrapped with the [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) decorator which creates the ClearML pipeline task when it is executed. The sections below describe in more detail what happens in the pipeline controller and steps. @@ -28,7 +28,7 @@ The sections below describe in more detail what happens in the pipeline controll In this example, the pipeline controller is implemented by the `executing_pipeline` function. Using the `@PipelineDecorator.pipeline` decorator creates a ClearML Controller Task from the function when it is executed. -For detailed information, see [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline). +For detailed information, see [`@PipelineDecorator.pipeline`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline). In the example script, the controller defines the interactions between the pipeline steps in the following way: 1. The controller function passes its argument, `pickle_url`, to the pipeline's first step (`step_one`) @@ -39,13 +39,13 @@ In the example script, the controller defines the interactions between the pipel :::info Local Execution In this example, the pipeline is set to run in local mode by using -[`PipelineDecorator.run_locally()`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorrun_locally) +[`PipelineDecorator.run_locally()`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorrun_locally) before calling the pipeline function. See pipeline execution options [here](../../pipelines/pipelines_sdk_function_decorators.md#running-the-pipeline). ::: ## Pipeline Steps Using the `@PipelineDecorator.component` decorator will make the function a pipeline component that can be called from the -pipeline controller, which implements the pipeline's execution logic. For detailed information, see [`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent). +pipeline controller, which implements the pipeline's execution logic. For detailed information, see [`@PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent). When the pipeline controller calls a pipeline step, a corresponding ClearML task will be created. Notice that all package imports inside the function will be automatically logged as required packages for the pipeline execution step. @@ -63,7 +63,7 @@ executing_pipeline( ``` By default, the pipeline controller and the pipeline steps are launched through ClearML [queues](../../fundamentals/agents_and_queues.md#what-is-a-queue). -Use the [`PipelineDecorator.set_default_execution_queue`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue) +Use the [`PipelineDecorator.set_default_execution_queue`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue) method to specify the execution queue of all pipeline steps. 
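For example, a minimal sketch of setting that default before invoking the pipeline function (the queue name here is illustrative):

```python
from clearml import PipelineDecorator

# Every pipeline step is enqueued to "default" unless a step specifies its own queue
PipelineDecorator.set_default_execution_queue("default")
```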
The `execution_queue` parameter of the `@PipelineDecorator.component` decorator overrides the default queue value for the specific step for which it was specified. diff --git a/docs/pipelines/pipelines_sdk_function_decorators.md b/docs/pipelines/pipelines_sdk_function_decorators.md index c6ef6d8f..97f43e75 100644 --- a/docs/pipelines/pipelines_sdk_function_decorators.md +++ b/docs/pipelines/pipelines_sdk_function_decorators.md @@ -4,14 +4,14 @@ title: PipelineDecorator ## Creating Pipelines Using Function Decorators -Use the [`PipelineDecorator`](../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator) -class to create pipelines from your existing functions. Use [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) -to denote functions that comprise the steps of your pipeline, and [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +Use the [`PipelineDecorator`](../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator) +class to create pipelines from your existing functions. Use [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) +to denote functions that comprise the steps of your pipeline, and [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) for your main pipeline execution logic function. ## @PipelineDecorator.pipeline -Using the [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +Using the [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) decorator transforms the function which implements your pipeline's execution logic to a ClearML pipeline controller, an independently executed task. @@ -70,13 +70,13 @@ parameters. When launching a new pipeline run from the [UI](../webapp/pipelines/ ![Pipeline new run](../img/pipelines_new_run.png) ## @PipelineDecorator.component -Using the [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent) +Using the [`@PipelineDecorator.component`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent) decorator transforms a function into a ClearML pipeline step when called from a pipeline controller. When the pipeline controller calls a pipeline step, a corresponding ClearML task is created. :::tip Package Imports -In the case that the `skip_global_imports` parameter of [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorpipeline) +In the case that the `skip_global_imports` parameter of [`@PipelineDecorator.pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorpipeline) is set to `False`, all global imports will be automatically imported at the beginning of each step's execution. Otherwise, if set to `True`, make sure that each function which makes up a pipeline step contains package imports, which are automatically logged as required packages for the pipeline execution step. @@ -110,7 +110,7 @@ def step_one(pickle_data_url: str, extra: int = 43): * `packages` - A list of required packages or a local requirements.txt file. 
Example: `["tqdm>=2.1", "scikit-learn"]` or `"./requirements.txt"`. If not provided, packages are automatically added based on the imports used inside the function. * `execution_queue` (optional) - Queue in which to enqueue the specific step. This overrides the queue set with the - [`PipelineDecorator.set_default_execution_queue method`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue) + [`PipelineDecorator.set_default_execution_queue method`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue) method. * `continue_on_fail` - If `True`, a failed step does not cause the pipeline to stop (or marked as failed). Notice, that steps that are connected (or indirectly connected) to the failed step are skipped (default `False`) @@ -186,14 +186,14 @@ specify which frameworks to log. See `Task.init`'s [`auto_connect_framework` par * `auto_connect_arg_parser` - Control automatic logging of argparse objects. See `Task.init`'s [`auto_connect_arg_parser` parameter](../references/sdk/task.md#taskinit) You can also directly upload a model or an artifact from the step to the pipeline controller, using the -[`PipelineDecorator.upload_model`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorupload_model) -and [`PipelineDecorator.upload_artifact`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorupload_artifact) +[`PipelineDecorator.upload_model`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorupload_model) +and [`PipelineDecorator.upload_artifact`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorupload_artifact) methods respectively. ## Controlling Pipeline Execution ### Default Execution Queue -The [`PipelineDecorator.set_default_execution_queue`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorset_default_execution_queue) +The [`PipelineDecorator.set_default_execution_queue`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorset_default_execution_queue) method lets you set a default queue through which all pipeline steps will be executed. Once set, step-specific overrides can be specified through the `@PipelineDecorator.component` decorator. @@ -226,7 +226,7 @@ You can run the pipeline logic locally, while keeping the pipeline components ex #### Debugging Mode In debugging mode, the pipeline controller and all components are treated as regular Python functions, with components called synchronously. This mode is great to debug the components and design the pipeline as the entire pipeline is -executed on the developer machine with full ability to debug each function call. Call [`PipelineDecorator.debug_pipeline`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratordebug_pipeline) +executed on the developer machine with full ability to debug each function call. Call [`PipelineDecorator.debug_pipeline`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratordebug_pipeline) before the main pipeline logic function call. Example: @@ -242,7 +242,7 @@ In local mode, the pipeline controller creates Tasks for each component, and com into sub-processes running on the same machine. Notice that the data is passed between the components and the logic with the exact same mechanism as in the remote mode (i.e. 
hyperparameters / artifacts), with the exception that the execution itself is local. Notice that each subprocess is using the exact same Python environment as the main pipeline logic. Call -[`PipelineDecorator.run_locally`](../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorrun_locally) +[`PipelineDecorator.run_locally`](../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorrun_locally) before the main pipeline logic function. Example: diff --git a/docs/references/sdk/automation_controller_pipelinedecorator.md b/docs/references/sdk/automation_controller_pipelinedecorator.md new file mode 100644 index 00000000..0545db42 --- /dev/null +++ b/docs/references/sdk/automation_controller_pipelinedecorator.md @@ -0,0 +1,5 @@ +--- +title: PipelineDecorator +--- + +**AutoGenerated PlaceHolder** \ No newline at end of file diff --git a/docs/webapp/pipelines/webapp_pipeline_table.md b/docs/webapp/pipelines/webapp_pipeline_table.md index cc252d93..114154f7 100644 --- a/docs/webapp/pipelines/webapp_pipeline_table.md +++ b/docs/webapp/pipelines/webapp_pipeline_table.md @@ -36,7 +36,7 @@ The pipeline run table contains the following columns: | Column | Description | Type | |---|---|---| | **RUN** | Pipeline run identifier | String | -| **VERSION** | The pipeline version number. Corresponds to the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md#class-pipelinecontroller)'s and [PipelineDecorator](../../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator)'s `version` parameter | Version string | +| **VERSION** | The pipeline version number. Corresponds to the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md#class-pipelinecontroller)'s and [PipelineDecorator](../../references/sdk/automation_controller_pipelinedecorator.md#class-automationcontrollerpipelinedecorator)'s `version` parameter | Version string | | **TAGS** | Descriptive, user-defined, color-coded tags assigned to run. | Tag | | **STATUS** | Pipeline run's status. See a list of the [task states and state transitions](../../fundamentals/task.md#task-states). For Running, Failed, and Aborted runs, you will also see a progress indicator next to the status. See [here](../../pipelines/pipelines.md#tracking-pipeline-progress). | String | | **USER** | User who created the run. | String | diff --git a/docs/webapp/pipelines/webapp_pipeline_viewing.md b/docs/webapp/pipelines/webapp_pipeline_viewing.md index b16b2370..6868c063 100644 --- a/docs/webapp/pipelines/webapp_pipeline_viewing.md +++ b/docs/webapp/pipelines/webapp_pipeline_viewing.md @@ -108,7 +108,7 @@ The details panel includes three tabs: ![console](../../img/webapp_pipeline_step_console_dark.png#dark-mode-only) * **Code** - For pipeline steps generated from functions using either [`PipelineController.add_function_step`](../../references/sdk/automation_controller_pipelinecontroller.md#add_function_step) -or [`PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinecontroller.md#pipelinedecoratorcomponent), +or [`PipelineDecorator.component`](../../references/sdk/automation_controller_pipelinedecorator.md#pipelinedecoratorcomponent), you can view the selected step's code. 
![code](../../img/webapp_pipeline_step_code.png#light-mode-only) diff --git a/sidebars.js b/sidebars.js index a6edf4bb..0674d6a0 100644 --- a/sidebars.js +++ b/sidebars.js @@ -399,8 +399,10 @@ module.exports = { 'references/sdk/dataset', {'Pipeline': [ 'references/sdk/automation_controller_pipelinecontroller', + 'references/sdk/automation_controller_pipelinedecorator', 'references/sdk/automation_job_clearmljob' - ]}, + ] + }, 'references/sdk/scheduler', 'references/sdk/trigger', {'HyperParameter Optimization': [
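For context on the `PipelineDecorator` reference page registered above, a minimal sketch of the decorator-based API it documents (function names, project, queue behavior, and values are illustrative):

```python
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["numbers"], cache=True)
def make_numbers(count: int):
    # Runs as its own ClearML task when launched through the pipeline controller
    numbers = list(range(count))
    return numbers

@PipelineDecorator.pipeline(name="decorator sketch", project="examples", version="1.0.0")
def run_pipeline(count: int = 10):
    numbers = make_numbers(count=count)
    print(sum(numbers))

if __name__ == "__main__":
    # Run the controller logic and steps locally (steps as sub-processes);
    # remove this call to launch through ClearML queues instead
    PipelineDecorator.run_locally()
    run_pipeline(count=10)
```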