Add pipeline step transformation info (#295)

This commit is contained in:
pollfly 2022-07-27 10:40:54 +03:00 committed by GitHub
parent 4e3f2dc7b8
commit 8afe79d3ee
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
2 changed files with 8 additions and 2 deletions

View File

@@ -34,7 +34,8 @@ example of a pipeline with concurrent steps.
## Running Your Pipelines
ClearML supports multiple modes for pipeline execution:
* **Remote Mode** (default) - In this mode, the pipeline controller logic is executed through a designated queue, and all
the pipeline steps are launched remotely through their respective queues.
the pipeline steps are launched remotely through their respective queues. Since each task is executed independently,
it can have control over its git repository (if needed), its required Python packages, and the specific container to be used.
* **Local Mode** - In this mode, the pipeline is executed locally, and the steps are executed as sub-processes. Each
subprocess uses the exact same Python environment as the main pipeline logic.
* **Debugging Mode** (for PipelineDecorator) - In this mode, the entire pipeline is executed locally, with the pipeline

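As a rough illustration of these modes (a minimal sketch only; the project, pipeline, and queue names below are placeholder assumptions, not part of this commit), the same controller can be started in any of the three ways:

```python
from clearml import PipelineController
from clearml.automation.controller import PipelineDecorator

# Placeholder controller; project, name, and queue values are illustrative only
pipe = PipelineController(name="pipeline demo", project="examples", version="1.0.0")
# ... add_step() / add_function_step() calls would go here ...

# Remote Mode (default): the controller logic is enqueued (e.g. to "services"),
# and every step is launched remotely through its own execution queue
pipe.start(queue="services")

# Local Mode: run the controller logic in this process and launch each step as
# a local sub-process (use instead of pipe.start())
# pipe.start_locally(run_pipeline_steps_locally=True)

# Debugging Mode (PipelineDecorator pipelines only): calling this before the
# decorated pipeline function runs the whole pipeline, steps included, in the
# current process
# PipelineDecorator.debug_pipeline()
```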
View File

@@ -50,7 +50,7 @@ or [functions in your code](#steps-from-functions). When the pipeline runs, the
to the specified structure.
### Steps from Tasks
Creating a pipeline step from an existing ClearML task means that when the step is run, the task will be cloned, and a
new task will be launched through the configured execution queue (the original task is unmodified). The new task's
parameters can be [specified](#parameter_override).
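For example, a minimal sketch of cloning an existing task as a step and overriding one of its parameters (the project, task, and parameter names are assumptions for illustration):

```python
from clearml import PipelineController

pipe = PipelineController(name="pipeline demo", project="examples", version="1.0.0")

# Clone the existing "training task" from the "examples" project as a step; the
# cloned copy runs with the overridden parameter, the original task is untouched
pipe.add_step(
    name="stage_train",
    base_task_project="examples",
    base_task_name="training task",
    parameter_override={"General/learning_rate": 0.001},
)
```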
@@ -109,6 +109,11 @@ Examples:
Creating a pipeline step from a function means that when the function is called, it will be transformed into a ClearML task,
with its arguments translated into parameters and its return values into artifacts.
:::info Function to ClearML Task conversion
As each function is transformed into an independently executed step, it needs to be self-contained. To facilitate this,
all package imports inside the function are automatically logged as required packages for the pipeline step.
:::
Function steps are added using the [`PipelineController.add_function_step`](../references/sdk/automation_controller_pipelinecontroller.md#add_function_step)
method:
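As a hedged sketch of what such a call might look like (the step function, argument values, and artifact name below are assumptions for illustration, not taken from this commit):

```python
from clearml import PipelineController

# Hypothetical step function; because it becomes a standalone task it must be
# self-contained, and the pandas import inside it is logged as a required package
def preprocess(data_url: str):
    import pandas as pd
    df = pd.read_csv(data_url)
    return df.dropna()

pipe = PipelineController(name="pipeline demo", project="examples", version="1.0.0")

# The function arguments become task parameters, and the return value is stored
# as an artifact named "cleaned_data"
pipe.add_function_step(
    name="preprocess",
    function=preprocess,
    function_kwargs={"data_url": "https://example.com/data.csv"},
    function_return=["cleaned_data"],
    cache_executed_step=True,
)
```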