Update links

This commit is contained in:
revital 2025-02-13 10:48:26 +02:00
parent a746df84f5
commit c747706c2a
190 changed files with 796 additions and 796 deletions

View File

@@ -1,6 +1,6 @@
<div align="center">
-<a href="https://app.clear.ml"><img src="https://github.com/allegroai/clearml/blob/master/docs/clearml-logo.svg?raw=true" width="250px"></a>
+<a href="https://app.clear.ml"><img src="https://github.com/clearml/clearml/blob/master/docs/clearml-logo.svg?raw=true" width="250px"></a>
**ClearML - The End-to-End Platform for AI Builders**

View File

@@ -112,7 +112,7 @@ session.
### Kubernetes Support
With ClearML k8s-glue you can enable launching ClearML sessions directly within Kubernetes pods. Set up the network and
-ingress settings for `clearml-session` in the `sessions` section of the [`values.yaml`](https://github.com/allegroai/clearml-helm-charts/blob/main/charts/clearml-agent/values.yaml)
+ingress settings for `clearml-session` in the `sessions` section of the [`values.yaml`](https://github.com/clearml/clearml-helm-charts/blob/main/charts/clearml-agent/values.yaml)
file.
Make sure to set the following values:

View File

@@ -71,7 +71,7 @@ errors in identifying the correct default branch.
| `--packages` | Manually specify a list of required packages. Example: `--packages "tqdm>=2.1" "scikit-learn"` | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
| `--project`| Set the project name for the task (required, unless using `--base-task-id`). If the named project does not exist, it is created on-the-fly | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
| `--queue` | Select a task's execution queue. If not provided, a task is created but not launched | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
-| `--repo` | URL of remote repository. Example: `--repo https://github.com/allegroai/clearml.git` | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
+| `--repo` | URL of remote repository. Example: `--repo https://github.com/clearml/clearml.git` | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
| `--requirements` | Specify a `requirements.txt` file to install when setting up the session. By default, the `requirements.txt` from the repository will be used | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
| `--script` | Entry point script for the remote execution. When used with `--repo`, input the script's relative path inside the repository. For example: `--script source/train.py`. When used with `--folder`, it supports a direct path to a file inside the local repository itself, for example: `--script ~/project/source/train.py` | <img src="/docs/latest/icons/ico-optional-yes.svg" alt="Yes" className="icon size-md center-md" /> |
| `--skip-task-init` | If set, `Task.init()` call is not added to the entry point, and is assumed to be called within the script | <img src="/docs/latest/icons/ico-optional-no.svg" alt="No" className="icon size-md center-md" /> |
@@ -87,10 +87,10 @@ These commands demonstrate a few useful use cases for `clearml-task`.
### Executing Code from a Remote Repository
```bash
-clearml-task --project examples --name remote_test --repo https://github.com/allegroai/events.git --branch master --script /webinar-0620/keras_mnist.py --args batch_size=64 epochs=1 --queue default
+clearml-task --project examples --name remote_test --repo https://github.com/clearml/events.git --branch master --script /webinar-0620/keras_mnist.py --args batch_size=64 epochs=1 --queue default
```
-The `keras_mnist.py` script from the [events](https://github.com/allegroai/events) GitHub repository is imported as a
+The `keras_mnist.py` script from the [events](https://github.com/clearml/events) GitHub repository is imported as a
ClearML task named `remote_test` in the `examples` project. The values of its command line arguments `batch_size` and `epochs`
are set, and the task is enqueued for execution on the `default` queue.
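The same flow can also be scripted with the ClearML SDK. The following is a minimal sketch, not the docs' canonical snippet: the project, repo, and argument values simply mirror the CLI example above, and it assumes `clearml` is installed and server credentials are configured.

```python
def create_and_enqueue_remote_task():
    """Create a task from a remote repo and enqueue it (sketch)."""
    # Requires `pip install clearml` and configured server credentials
    from clearml import Task

    task = Task.create(
        project_name="examples",
        task_name="remote_test",
        repo="https://github.com/clearml/events.git",
        branch="master",
        script="/webinar-0620/keras_mnist.py",
        argparse_args=[("batch_size", 64), ("epochs", 1)],
    )
    # Queue the task for execution by an available agent
    Task.enqueue(task, queue_name="default")
    return task

if __name__ == "__main__":
    create_and_enqueue_remote_task()
```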

View File

@@ -107,10 +107,10 @@ ClearML Agent is deployed onto a Kubernetes cluster using **Kubernetes-Glue**, w
You can deploy ClearML Agent onto Kubernetes using one of the following methods:
1. **ClearML Agent Helm Chart (Recommended)**:
-Use the [ClearML Agent Helm Chart](https://github.com/allegroai/clearml-helm-charts/tree/main/charts/clearml-agent) to spin up an agent pod acting as a controller. This is the recommended and scalable approach.
+Use the [ClearML Agent Helm Chart](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml-agent) to spin up an agent pod acting as a controller. This is the recommended and scalable approach.
2. **K8s Glue Script**:
-Run a [K8s Glue script](https://github.com/allegroai/clearml-agent/blob/master/examples/k8s_glue_example.py) on a Kubernetes CPU node. This approach is less scalable and typically suited for simpler use cases.
+Run a [K8s Glue script](https://github.com/clearml/clearml-agent/blob/master/examples/k8s_glue_example.py) on a Kubernetes CPU node. This approach is less scalable and typically suited for simpler use cases.
### How It Works
The ClearML Kubernetes-Glue performs the following:

View File

@@ -214,7 +214,7 @@ Where `<gpu_fraction_value>` must be set to one of the following values:
* "0.875"
## Container-based Memory Limits
-Use [`clearml-fractional-gpu`](https://github.com/allegroai/clearml-fractional-gpu)'s pre-packaged containers with
+Use [`clearml-fractional-gpu`](https://github.com/clearml/clearml-fractional-gpu)'s pre-packaged containers with
built-in hard memory limitations. Workloads running in these containers will only be able to use up to the container's
memory limit. Multiple isolated workloads can run on the same GPU without impacting each other.
@@ -223,7 +223,7 @@ memory limit. Multiple isolated workloads can run on the same GPU without impact
#### Manual Execution
1. Choose the container with the appropriate memory limit. ClearML supports CUDA 11.x and CUDA 12.x with memory limits
-ranging from 2 GB to 12 GB (see [clearml-fractional-gpu repository](https://github.com/allegroai/clearml-fractional-gpu/blob/main/README.md#-containers) for full list).
+ranging from 2 GB to 12 GB (see [clearml-fractional-gpu repository](https://github.com/clearml/clearml-fractional-gpu/blob/main/README.md#-containers) for full list).
1. Launch the container:
```bash
@@ -312,7 +312,7 @@ when limiting memory usage.
Build your own custom fractional GPU container by inheriting from one of ClearML's containers: In your Dockerfile, make
sure to include `From <clearml_container_image>` so the container will inherit from the relevant container.
-See example custom Dockerfiles in the [clearml-fractional-gpu repository](https://github.com/allegroai/clearml-fractional-gpu/tree/main/examples).
+See example custom Dockerfiles in the [clearml-fractional-gpu repository](https://github.com/clearml/clearml-fractional-gpu/tree/main/examples).
## Kubernetes Static MIG Fractions
Set up NVIDIA MIG (Multi-Instance GPU) support for Kubernetes to define GPU fraction profiles for specific workloads

View File

@@ -146,7 +146,7 @@ In case a `clearml.conf` file already exists, add a few ClearML Agent specific c
worker_id: ""
}
```
-View a complete ClearML Agent configuration file sample including an `agent` section [here](https://github.com/allegroai/clearml-agent/blob/master/docs/clearml.conf).
+View a complete ClearML Agent configuration file sample including an `agent` section [here](https://github.com/clearml/clearml-agent/blob/master/docs/clearml.conf).
1. Save the configuration.

View File

@@ -74,7 +74,7 @@ In the panel's **CONTENT** tab, you can see a table summarizing version contents
Now that you have a new dataset registered, you can consume it.
-The [data_ingestion.py](https://github.com/allegroai/clearml/blob/master/examples/datasets/data_ingestion.py) example
+The [data_ingestion.py](https://github.com/clearml/clearml/blob/master/examples/datasets/data_ingestion.py) example
script demonstrates using the dataset within Python code.
```python

View File

@@ -9,7 +9,7 @@ from time to time. When the point of truth is updated, users can call `clearml-d
changes (file addition, modification, or removal) will be reflected in ClearML.
## Prerequisites
-1. First, make sure that you have cloned the [clearml](https://github.com/allegroai/clearml) repository. It contains all
+1. First, make sure that you have cloned the [clearml](https://github.com/clearml/clearml) repository. It contains all
the needed files.
1. Open terminal and change directory to the cloned repository's examples folder

View File

@@ -2,14 +2,14 @@
title: Data Management with Python
---
-The [dataset_creation.py](https://github.com/allegroai/clearml/blob/master/examples/datasets/dataset_creation.py) and
-[data_ingestion.py](https://github.com/allegroai/clearml/blob/master/examples/datasets/data_ingestion.py) scripts
+The [dataset_creation.py](https://github.com/clearml/clearml/blob/master/examples/datasets/dataset_creation.py) and
+[data_ingestion.py](https://github.com/clearml/clearml/blob/master/examples/datasets/data_ingestion.py) scripts
together demonstrate how to use ClearML's [`Dataset`](../../references/sdk/dataset.md) class to create a dataset and
subsequently ingest the data.
## Dataset Creation
-The [dataset_creation.py](https://github.com/allegroai/clearml/blob/master/examples/datasets/dataset_creation.py) script
+The [dataset_creation.py](https://github.com/clearml/clearml/blob/master/examples/datasets/dataset_creation.py) script
demonstrates how to do the following:
* Create a dataset and add files to it
* Upload the dataset to the ClearML Server
@@ -85,7 +85,7 @@ In the panel's **CONTENT** tab, you can see a table summarizing version contents
Now that a new dataset is registered, you can consume it!
-The [data_ingestion.py](https://github.com/allegroai/clearml/blob/master/examples/datasets/data_ingestion.py) script
+The [data_ingestion.py](https://github.com/clearml/clearml/blob/master/examples/datasets/data_ingestion.py) script
demonstrates data ingestion using the dataset created in the first script.
The following script gets the dataset and uses [`Dataset.get_local_copy()`](../../references/sdk/dataset.md#get_local_copy)
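That ingestion pattern can be sketched in a few lines. This is an illustration only; the dataset project and name below are placeholders, not values taken from the example script, and it assumes `clearml` is installed with server credentials configured.

```python
def fetch_dataset_copy(dataset_project: str, dataset_name: str) -> str:
    """Return the path of a local, read-only cached copy of a ClearML dataset (sketch)."""
    # Requires `pip install clearml` and configured server credentials
    from clearml import Dataset

    dataset = Dataset.get(dataset_project=dataset_project, dataset_name=dataset_name)
    # get_local_copy() downloads the dataset files (or reuses the local cache)
    return dataset.get_local_copy()

if __name__ == "__main__":
    print(fetch_dataset_copy("Datasets Examples", "demo dataset"))
```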

View File

@@ -5,7 +5,7 @@ title: Data Management from CLI
In this example we'll create a simple dataset and demonstrate basic actions on it, using the `clearml-data` CLI.
## Prerequisites
-1. First, make sure that you have cloned the [clearml](https://github.com/allegroai/clearml) repository. It contains all
+1. First, make sure that you have cloned the [clearml](https://github.com/clearml/clearml) repository. It contains all
the needed files.
1. Open terminal and change directory to the cloned repository's examples folder:

View File

@@ -57,7 +57,7 @@ ClearML's `optimization` module includes classes that support hyperparameter opt
controller class
* Optimization search strategy classes including [Optuna](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md), [HpBandSter](../references/sdk/hpo_hpbandster_bandster_optimizerbohb.md),
[GridSearch](../references/sdk/hpo_optimization_gridsearch.md), [RandomSearch](../references/sdk/hpo_optimization_randomsearch.md),
-and a base [SearchStrategy](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L310)
+and a base [SearchStrategy](https://github.com/clearml/clearml/blob/master/clearml/automation/optimization.py#L310)
that can be customized
See the [HyperParameterOptimizer SDK reference page](../references/sdk/hpo_optimization_hyperparameteroptimizer.md).
@@ -96,22 +96,22 @@ configurations, and other execution details.
See [reference page](../references/sdk/automation_job_clearmljob.md).
### AutoScaler
-The `AutoScaler` class facilitates implementing resource budgeting. See class methods [here](https://github.com/allegroai/clearml/blob/master/clearml/automation/auto_scaler.py).
-ClearML also provides a class specifically for AWS autoscaling. See [code](https://github.com/allegroai/clearml/blob/master/clearml/automation/aws_auto_scaler.py#L22)
-and [example script](https://github.com/allegroai/clearml/blob/master/examples/services/aws-autoscaler/aws_autoscaler.py).
+The `AutoScaler` class facilitates implementing resource budgeting. See class methods [here](https://github.com/clearml/clearml/blob/master/clearml/automation/auto_scaler.py).
+ClearML also provides a class specifically for AWS autoscaling. See [code](https://github.com/clearml/clearml/blob/master/clearml/automation/aws_auto_scaler.py#L22)
+and [example script](https://github.com/clearml/clearml/blob/master/examples/services/aws-autoscaler/aws_autoscaler.py).
### TaskScheduler
-The `TaskScheduler` class supports methods for scheduling periodic execution (like cron jobs). See the [code](https://github.com/allegroai/clearml/blob/master/clearml/automation/scheduler.py#L481)
-and [example](https://github.com/allegroai/clearml/blob/master/examples/scheduler/cron_example.py).
+The `TaskScheduler` class supports methods for scheduling periodic execution (like cron jobs). See the [code](https://github.com/clearml/clearml/blob/master/clearml/automation/scheduler.py#L481)
+and [example](https://github.com/clearml/clearml/blob/master/examples/scheduler/cron_example.py).
### TriggerScheduler
The `TriggerScheduler` class facilitates triggering task execution when specific events occur in the system
-(such as model publication, dataset creation, task failure). See [code](https://github.com/allegroai/clearml/blob/master/clearml/automation/trigger.py#L148)
-and [usage example](https://github.com/allegroai/clearml/blob/master/examples/scheduler/trigger_example.py).
+(such as model publication, dataset creation, task failure). See [code](https://github.com/clearml/clearml/blob/master/clearml/automation/trigger.py#L148)
+and [usage example](https://github.com/clearml/clearml/blob/master/examples/scheduler/trigger_example.py).
## Examples
-The `clearml` GitHub repository includes an [examples folder](https://github.com/allegroai/clearml/tree/master/examples)
+The `clearml` GitHub repository includes an [examples folder](https://github.com/clearml/clearml/tree/master/examples)
with example scripts demonstrating how to use the various functionalities of the ClearML SDK.
These examples are preloaded in the [ClearML Hosted Service](https://app.clear.ml), and can be viewed, cloned,

View File

@@ -209,7 +209,7 @@ For example:
task = Task.create(
project_name='example',
task_name='task template',
-repo='https://github.com/allegroai/clearml.git',
+repo='https://github.com/clearml/clearml.git',
branch='master',
script='examples/reporting/html_reporting.py',
working_directory='.',
@@ -423,7 +423,7 @@ different number of epochs and using a new base model:
cloned_task.set_parameters({'epochs':7, 'lr': 0.5})
# Override git repo information
-cloned_task.set_repo(repo="https://github.com/allegroai/clearml.git", branch="my_branch_name")
+cloned_task.set_repo(repo="https://github.com/clearml/clearml.git", branch="my_branch_name")
# Remove input model and set a new one
cloned_task.remove_input_models(models_to_remove=["<model_id>"])
cloned_task.set_input_model(model_id="<new_input_model_id>")
@@ -441,7 +441,7 @@ Task.enqueue(
)
```
-See enqueue [example](https://github.com/allegroai/clearml/blob/master/examples/automation/programmatic_orchestration.py).
+See enqueue [example](https://github.com/clearml/clearml/blob/master/examples/automation/programmatic_orchestration.py).
## Advanced Flows
@@ -720,7 +720,7 @@ preprocess_task = Task.get_task(task_id='the_preprocessing_task_id')
local_csv = preprocess_task.artifacts['data'].get_local_copy()
```
-See more details in the [Using Artifacts example](https://github.com/allegroai/clearml/blob/master/examples/reporting/using_artifacts_example.py).
+See more details in the [Using Artifacts example](https://github.com/clearml/clearml/blob/master/examples/reporting/using_artifacts_example.py).
## Models
The following is an overview of working with models through a `Task` object. You can also work directly with model

View File

@@ -39,7 +39,7 @@ solution.
## Components
-![ClearML Serving](https://github.com/allegroai/clearml-serving/raw/main/docs/design_diagram.png?raw=true)
+![ClearML Serving](https://github.com/clearml/clearml-serving/raw/main/docs/design_diagram.png?raw=true)
* **CLI** - Secure configuration interface for on-line model upgrade/deployment on running Serving Services

View File

@@ -37,7 +37,7 @@ The following page goes over how to set up and upgrade `clearml-serving`.
1. Clone the `clearml-serving` repository:
```bash
-git clone https://github.com/allegroai/clearml-serving.git
+git clone https://github.com/clearml/clearml-serving.git
```
1. Edit the environment variables file (docker/example.env) with your clearml-server credentials and Serving Service UID.

View File

@@ -3,7 +3,7 @@ title: Tutorial
---
In this tutorial, you will go over the model lifecycle -- from training to serving -- in the following steps:
-* Training a model using the [sklearn example script](https://github.com/allegroai/clearml-serving/blob/main/examples/sklearn/train_model.py)
+* Training a model using the [sklearn example script](https://github.com/clearml/clearml-serving/blob/main/examples/sklearn/train_model.py)
* Serving the model using **ClearML Serving**
* Spinning the inference container
@@ -22,7 +22,7 @@ Before executing the steps below, make sure you have completed `clearml-serving`
Train a model using the example script. Start from the root directory of your local `clearml-serving` repository.
1. Create a Python virtual environment
1. Install the script requirements: `pip3 install -r examples/sklearn/requirements.txt`
-1. Execute the [training script](https://github.com/allegroai/clearml-serving/blob/main/examples/sklearn/train_model.py): `python3 examples/sklearn/train_model.py`.
+1. Execute the [training script](https://github.com/clearml/clearml-serving/blob/main/examples/sklearn/train_model.py): `python3 examples/sklearn/train_model.py`.
During execution, ClearML automatically registers the sklearn model and uploads it to the model repository.
For information about explicit model registration, see [Registering and Deploying New Models Manually](#registering-and-deploying-new-models-manually).
@@ -50,7 +50,7 @@ and downloaded in real time when updated.
### Step 3: Spin Inference Container
Spin the Inference Container:
-1. Customize container [Dockerfile](https://github.com/allegroai/clearml-serving/blob/main/clearml_serving/serving/Dockerfile) if needed
+1. Customize container [Dockerfile](https://github.com/clearml/clearml-serving/blob/main/clearml_serving/serving/Dockerfile) if needed
1. Build container:
```bash
@@ -76,7 +76,7 @@ everything is cached, responses will return almost immediately.
:::note
Review the model repository in the ClearML web UI, under the "serving examples" Project on your ClearML
-account/server ([free hosted](https://app.clear.ml) or [self-deployed](https://github.com/allegroai/clearml-server)).
+account/server ([free hosted](https://app.clear.ml) or [self-deployed](https://github.com/clearml/clearml-server)).
Inference services status, console outputs and machine metrics are available in the ClearML UI in the Serving Service
project (default: "DevOps" project).
@@ -207,7 +207,7 @@ Example:
### Model Monitoring and Performance Metrics
-![Grafana Screenshot](https://github.com/allegroai/clearml-serving/raw/main/docs/grafana_screenshot.png)
+![Grafana Screenshot](https://github.com/clearml/clearml-serving/raw/main/docs/grafana_screenshot.png)
ClearML serving instances send serving statistics (count/latency) automatically to Prometheus and Grafana can be used
to visualize and create live dashboards.
@@ -232,7 +232,7 @@ that you will be able to visualize on Grafana.
:::info time-series values
You can also log time-series values with `--variable-value x2` or discrete results (e.g. classifications strings) with
`--variable-enum animal=cat,dog,sheep`. Additional custom variables can be added in the preprocess and postprocess with
-a call to `collect_custom_statistics_fn({'new_var': 1.337})`. See [preprocess_template.py](https://github.com/allegroai/clearml-serving/blob/main/clearml_serving/preprocess/preprocess_template.py).
+a call to `collect_custom_statistics_fn({'new_var': 1.337})`. See [preprocess_template.py](https://github.com/clearml/clearml-serving/blob/main/clearml_serving/preprocess/preprocess_template.py).
:::
With the new metrics logged, you can create a visualization dashboard over the latency of the calls, and the output distribution.
@@ -258,10 +258,10 @@ You can also specify per-endpoint log frequency with the `clearml-serving` CLI.
See examples of ClearML Serving with other supported frameworks:
-* [scikit-learn](https://github.com/allegroai/clearml-serving/blob/main/examples/sklearn/readme.md) - Random data
-* [scikit-learn Model Ensemble](https://github.com/allegroai/clearml-serving/blob/main/examples/ensemble/readme.md) - Random data
-* [XGBoost](https://github.com/allegroai/clearml-serving/blob/main/examples/xgboost/readme.md) - Iris dataset
-* [LightGBM](https://github.com/allegroai/clearml-serving/blob/main/examples/lightgbm/readme.md) - Iris dataset
-* [PyTorch](https://github.com/allegroai/clearml-serving/blob/main/examples/pytorch/readme.md) - MNIST dataset
-* [TensorFlow/Keras](https://github.com/allegroai/clearml-serving/blob/main/examples/keras/readme.md) - MNIST dataset
-* [Model Pipeline](https://github.com/allegroai/clearml-serving/blob/main/examples/pipeline/readme.md) - Random data
+* [scikit-learn](https://github.com/clearml/clearml-serving/blob/main/examples/sklearn/readme.md) - Random data
+* [scikit-learn Model Ensemble](https://github.com/clearml/clearml-serving/blob/main/examples/ensemble/readme.md) - Random data
+* [XGBoost](https://github.com/clearml/clearml-serving/blob/main/examples/xgboost/readme.md) - Iris dataset
+* [LightGBM](https://github.com/clearml/clearml-serving/blob/main/examples/lightgbm/readme.md) - Iris dataset
+* [PyTorch](https://github.com/clearml/clearml-serving/blob/main/examples/pytorch/readme.md) - MNIST dataset
+* [TensorFlow/Keras](https://github.com/clearml/clearml-serving/blob/main/examples/keras/readme.md) - MNIST dataset
+* [Model Pipeline](https://github.com/clearml/clearml-serving/blob/main/examples/pipeline/readme.md) - Random data

View File

@@ -100,5 +100,5 @@ underutilized nodes. See [charts](https://github.com/kubernetes/autoscaler/tree/
:::important Enterprise features
The ClearML Enterprise plan supports K8S servicing multiple ClearML queues, as well as providing a pod template for each
-queue for describing the resources for each pod to use. See [ClearML Helm Charts](https://github.com/allegroai/clearml-helm-charts/tree/main).
+queue for describing the resources for each pod to use. See [ClearML Helm Charts](https://github.com/clearml/clearml-helm-charts/tree/main).
:::

View File

@@ -4,7 +4,7 @@ title: Community Resources
## Join the ClearML Conversation
-For feature requests or bug reports, see **ClearML** [GitHub issues](https://github.com/allegroai/clearml/issues).
+For feature requests or bug reports, see **ClearML** [GitHub issues](https://github.com/clearml/clearml/issues).
If you have any questions, post on the **ClearML** [Slack channel](https://joinslack.clear.ml).
@@ -28,8 +28,8 @@ Firstly, thank you for taking the time to contribute!
Contributions come in many forms:
-* Reporting [issues](https://github.com/allegroai/clearml/issues) you've come upon
-* Participating in issue discussions in the [issue tracker](https://github.com/allegroai/clearml/issues) and the
+* Reporting [issues](https://github.com/clearml/clearml/issues) you've come upon
+* Participating in issue discussions in the [issue tracker](https://github.com/clearml/clearml/issues) and the
[ClearML Community Slack space](https://joinslack.clear.ml)
* Suggesting new features or enhancements
* Implementing new features or fixing outstanding issues
@@ -40,7 +40,7 @@ The list above is primarily guidelines, not rules. Use your best judgment and fe
By following these guidelines, you help maintainers and the community understand your report, reproduce the behavior, and find related reports.
-Before reporting an issue, please check whether it already appears [here](https://github.com/allegroai/clearml/issues). If
+Before reporting an issue, please check whether it already appears [here](https://github.com/clearml/clearml/issues). If
it does, join the ongoing discussion instead.
:::note
@@ -85,7 +85,7 @@ Enhancement suggestions are tracked as GitHub issues. After you determine which
Before you submit a new PR:
-* Verify that the work you plan to merge addresses an existing [issue](https://github.com/allegroai/clearml/issues) (if not, open a new one)
+* Verify that the work you plan to merge addresses an existing [issue](https://github.com/clearml/clearml/issues) (if not, open a new one)
* Check related discussions in the [ClearML Slack community](https://joinslack.clear.ml)
(or start your own discussion on the ``#clearml-dev`` channel)
* Make sure your code conforms to the ClearML coding standards by running:

View File

@@ -14,7 +14,7 @@ This reference page is organized by configuration file section:
* [files](#files-section) - Define auto-generated files to apply into local file system
-See an [example configuration file](https://github.com/allegroai/clearml-agent/blob/master/docs/clearml.conf)
+See an [example configuration file](https://github.com/clearml/clearml-agent/blob/master/docs/clearml.conf)
in the ClearML Agent GitHub repository.
:::info

View File

@@ -101,7 +101,7 @@ The ClearML Server uses the following configuration files:
* `services.conf`
When starting up, the ClearML Server will look for these configuration files, in the `/opt/clearml/config` directory
-(this path can be modified using the `CLEARML_CONFIG_DIR` environment variable). The default configuration files are in the [clearml-server](https://github.com/allegroai/clearml-server/tree/master/apiserver/config/default) repository.
+(this path can be modified using the `CLEARML_CONFIG_DIR` environment variable). The default configuration files are in the [clearml-server](https://github.com/clearml/clearml-server/tree/master/apiserver/config/default) repository.
If you want to modify server configuration, and the relevant configuration file doesn't exist, you can create the file,
and input the relevant modified configuration.

View File

@@ -72,10 +72,10 @@ and ClearML Server needs to be installed.
1. Download the migration package archive:
```
-curl -L -O https://github.com/allegroai/clearml-server/releases/download/0.16.0/trains-server-0.16.0-migration.zip
+curl -L -O https://github.com/clearml/clearml-server/releases/download/0.16.0/trains-server-0.16.0-migration.zip
```
-If the file needs to be downloaded manually, use this direct link: [trains-server-0.16.0-migration.zip](https://github.com/allegroai/clearml-server/releases/download/0.16.0/trains-server-0.16.0-migration.zip).
+If the file needs to be downloaded manually, use this direct link: [trains-server-0.16.0-migration.zip](https://github.com/clearml/clearml-server/releases/download/0.16.0/trains-server-0.16.0-migration.zip).
1. Extract the archive:
@@ -109,7 +109,7 @@ and ClearML Server needs to be installed.
1. Clone the `trains-server-k8s` repository and change to the new `trains-server-k8s/upgrade-elastic` directory:
```
-git clone https://github.com/allegroai/clearml-server-k8s.git && cd clearml-server-k8s/upgrade-elastic
+git clone https://github.com/clearml/clearml-server-k8s.git && cd clearml-server-k8s/upgrade-elastic
```
1. Create the `upgrade-elastic` namespace and deployments:
@@ -170,7 +170,7 @@ If the migration script does not complete successfully, the migration script pri
:::important
For help in resolving migration issues, check the **ClearML** [Slack channel](https://joinslack.clear.ml),
-[GitHub issues](https://github.com/allegroai/clearml-server/issues), and the **ClearML Server** sections of the [FAQ](../faq.md).
+[GitHub issues](https://github.com/clearml/clearml-server/issues), and the **ClearML Server** sections of the [FAQ](../faq.md).
:::
### Upgrading to ClearML Server v.1.2 or Newer

View File

@@ -21,7 +21,7 @@ Ensure that the `helm` binary is in the PATH of your shell.
## Deployment
You will create a multi-node Kubernetes cluster using Helm, and then install ClearML in your cluster. For deployment
-instructions with up-to-date Helm charts, see the [clearml-helm-charts repository](https://github.com/allegroai/clearml-helm-charts/tree/main/charts/clearml#local-environment)
+instructions with up-to-date Helm charts, see the [clearml-helm-charts repository](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml#local-environment)
:::warning Server Access
By default, ClearML Server launches with unrestricted access. To restrict ClearML Server access, follow the

View File

@@ -11,7 +11,7 @@ In v1.2, the MongoDB subsystem of ClearML Server has been upgraded from version
necessitates the migration of the database contents to be compatible with the new version.
:::note Kubernetes installations
-[ClearML's Helm chart](https://github.com/allegroai/clearml-helm-charts/tree/main/charts/clearml) is already running
+[ClearML's Helm chart](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml) is already running
mongodb version 4.4. If your ClearML server had been deployed with this chart (with the default mongodb bitnami chart) -
You can stop reading here, as no migration is required.
:::
@@ -49,7 +49,7 @@ To avoid data corruption, shut down your ClearML server before applying the migr
### Migrating by Script
A migration script is available to automatically run this process for all supported OSs.
-[Download the script](https://github.com/allegroai/clearml-server/releases/download/1.2.0/clearml-server-1.2.0-migration.py) and run it on your ClearML server.
+[Download the script](https://github.com/clearml/clearml-server/releases/download/1.2.0/clearml-server-1.2.0-migration.py) and run it on your ClearML server.
Run `clearml-server-1.2.0-migration.py -help` to see execution options.
Note the script will create a backup archive of your data in the original directory.

View File

@@ -39,7 +39,7 @@ with the ClearML SDK.
However, this also means that the **server must be secured** by either preventing any external access, or by changing
defaults so that the server's credentials are not publicly known.
-The ClearML Server default secrets can be found [here](https://github.com/allegroai/clearml-server/blob/master/apiserver/config/default/secure.conf), and can be changed using the `secure.conf` configuration file or using environment variables
+The ClearML Server default secrets can be found [here](https://github.com/clearml/clearml-server/blob/master/apiserver/config/default/secure.conf), and can be changed using the `secure.conf` configuration file or using environment variables
(see [ClearML Server Feature Configurations](clearml_server_config.md#clearml-server-feature-configurations)).
Specifically, the relevant settings are:
@@ -101,6 +101,6 @@ services:
:::important
When generating new user keys and secrets, make sure to use sufficiently long strings (we use 30 chars for keys and 50-60
-chars for secrets). See [here](https://github.com/allegroai/clearml-server/blob/master/apiserver/service_repo/auth/utils.py)
+chars for secrets). See [here](https://github.com/clearml/clearml-server/blob/master/apiserver/service_repo/auth/utils.py)
for Python example code to generate these strings.
:::
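One possible way to generate strings of that length is with Python's standard `secrets` module. This is a generic sketch following the length guideline above, not the server's own helper code:

```python
import secrets
import string

def generate_credential(length: int) -> str:
    """Return a cryptographically random alphanumeric string."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Guideline above: ~30 chars for keys, 50-60 chars for secrets
access_key = generate_credential(30)
secret_key = generate_credential(55)
print(access_key, secret_key)
```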

View File

@@ -7,7 +7,7 @@ title: Google Cloud Platform
MongoDB major version was upgraded from `v5.x` to `v6.x`. Please note that if your current ClearML Server version is older than
`v1.17` (where MongoDB `v5.x` was first used), you'll need to first upgrade to ClearML Server v1.17.
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/allegroai/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose.yml). Once successfully upgraded,
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/clearml/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose.yml). Once successfully upgraded,
you can proceed to upgrade to v2.x.
</Collapsible>


@@ -16,7 +16,7 @@ helm upgrade clearml allegroai/clearml
helm upgrade clearml allegroai/clearml --version <CURRENT CHART VERSION> -f custom_values.yaml
```
See the [clearml-helm-charts repository](https://github.com/allegroai/clearml-helm-charts/tree/main/charts/clearml#local-environment)
See the [clearml-helm-charts repository](https://github.com/clearml/clearml-helm-charts/tree/main/charts/clearml#local-environment)
to view the up-to-date charts.
:::tip


@@ -7,7 +7,7 @@ title: Linux or macOS
MongoDB major version was upgraded from `v5.x` to `6.x`. Please note that if your current ClearML Server version is older than
`v1.17` (where MongoDB `v5.x` was first used), you'll need to first upgrade to ClearML Server v1.17.
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/allegroai/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose.yml). Once successfully upgraded,
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/clearml/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose.yml). Once successfully upgraded,
you can proceed to upgrade to v2.x.
</Collapsible>


@@ -7,7 +7,7 @@ title: Windows
MongoDB major version was upgraded from `v5.x` to `6.x`. Please note that if your current ClearML Server version is older than
`v1.17` (where MongoDB `v5.x` was first used), you'll need to first upgrade to ClearML Server v1.17.
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/allegroai/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose-win10.yml). Once successfully upgraded,
First upgrade to ClearML Server v1.17 following the procedure below and using [this `docker-compose` file](https://github.com/clearml/clearml-server/blob/2976ce69cc91550a3614996e8a8d8cd799af2efd/upgrade/1_17_to_2_0/docker-compose-win10.yml). Once successfully upgraded,
you can proceed to upgrade to v2.x.
</Collapsible>


@@ -368,7 +368,7 @@ Your firewall may be preventing the connection. Try one of the following solutio
```
pip install -U clearml
```
1. Create a new `clearml.conf` configuration file (see a [sample configuration file](https://github.com/allegroai/clearml/blob/master/docs/clearml.conf)), containing:
1. Create a new `clearml.conf` configuration file (see a [sample configuration file](https://github.com/clearml/clearml/blob/master/docs/clearml.conf)), containing:
```
api { verify_certificate = False }
@@ -419,7 +419,7 @@ Conda and the [typing](https://pypi.org/project/typing/) package may have some c
However, [since Python 3.5](https://docs.python.org/3.5/library/typing.html), the `typing` package is part of the standard library.
To resolve the error, uninstall `typing` and rerun your script. If this does not fix the issue, create a [new ClearML issue](https://github.com/allegroai/clearml/issues/new), including the full error, and your environment details.
To resolve the error, uninstall `typing` and rerun your script. If this does not fix the issue, create a [new ClearML issue](https://github.com/clearml/clearml/issues/new), including the full error, and your environment details.
<a id="delete_exp"></a>


@@ -6,7 +6,7 @@ Two major components of MLOps/LLMOps are experiment reproducibility, and the abi
coupled with execution queues, address both these needs.
A ClearML worker is instantiated by launching a ClearML Agent, which is the base for **Automation** in ClearML and can be leveraged to build automated pipelines, launch custom services
(e.g. a [monitor and alert service](https://github.com/allegroai/clearml/tree/master/examples/services/monitoring)) and more.
(e.g. a [monitor and alert service](https://github.com/clearml/clearml/tree/master/examples/services/monitoring)) and more.
## What Does a ClearML Agent Do?
The ClearML agent allows users to execute code on any machine it's installed on, thus facilitating the


@@ -61,7 +61,7 @@ optimization.
documentation and a [code example](../guides/frameworks/pytorch/notebooks/image/hyperparameter_search.md).
* **Random** uniform sampling of hyperparameters - [`automation.RandomSearch`](../references/sdk/hpo_optimization_randomsearch.md).
* **Full grid** sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../references/sdk/hpo_optimization_gridsearch.md).
* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/allegroai/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
* **Custom** - [`automation.optimization.SearchStrategy`](https://github.com/clearml/clearml/blob/master/clearml/automation/optimization.py#L268) - Use a custom class and inherit from the ClearML automation base strategy class.
## Defining a Hyperparameter Optimization Search Example


@@ -113,7 +113,7 @@ they are attached to, and then retrieving the artifact with one of its following
* `get_local_copy()` - caches the files for later use and returns a path to the cached file.
* `get()` - use for Python objects; returns the deserialized Python object.
See more details in the [Using Artifacts example](https://github.com/allegroai/clearml/blob/master/examples/reporting/using_artifacts_example.py).
See more details in the [Using Artifacts example](https://github.com/clearml/clearml/blob/master/examples/reporting/using_artifacts_example.py).
## Task Types
Tasks have a *type* attribute, which denotes their purpose. This helps to further


@@ -89,7 +89,7 @@ Calling `get_local_copy()` returns a local cached copy of the artifact. Therefor
need to download the artifact again.
Calling `get()` gets a deserialized pickled object.
Check out the [artifacts retrieval](https://github.com/allegroai/clearml/blob/master/examples/reporting/artifacts_retrieval.py) example code.
Check out the [artifacts retrieval](https://github.com/clearml/clearml/blob/master/examples/reporting/artifacts_retrieval.py) example code.
### Models
@@ -112,10 +112,10 @@ Now, whenever the framework (TensorFlow/Keras/PyTorch etc.) stores a snapshot, t
Loading models by a framework is also logged by the system; these models appear in an experiment's **Artifacts** tab,
under the "Input Models" section.
Check out model snapshots examples for [TensorFlow](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorflow/tensorflow_mnist.py),
[PyTorch](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py),
[Keras](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py),
[scikit-learn](https://github.com/allegroai/clearml/blob/master/examples/frameworks/scikit-learn/sklearn_joblib_example.py).
Check out model snapshots examples for [TensorFlow](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorflow/tensorflow_mnist.py),
[PyTorch](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py),
[Keras](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py),
[scikit-learn](https://github.com/clearml/clearml/blob/master/examples/frameworks/scikit-learn/sklearn_joblib_example.py).
#### Loading Models
Loading a previously trained model is quite similar to loading artifacts.


@@ -35,19 +35,19 @@ training, and deploying models at every scale on any AI infrastructure.
<table>
<tbody>
<tr>
<td><a href="https://github.com/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_1_Experiment_Management.ipynb"><b>Step 1</b></a> - Experiment Management</td>
<td><a href="https://github.com/clearml/clearml/blob/master/docs/tutorials/Getting_Started_1_Experiment_Management.ipynb"><b>Step 1</b></a> - Experiment Management</td>
<td className="align-center"><a className="no-ext-icon" target="_blank" href="https://colab.research.google.com/github/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_1_Experiment_Management.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a></td>
</tr>
<tr>
<td><a href="https://github.com/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_2_Setting_Up_Agent.ipynb"><b>Step 2</b></a> - Remote Execution Agent Setup</td>
<td><a href="https://github.com/clearml/clearml/blob/master/docs/tutorials/Getting_Started_2_Setting_Up_Agent.ipynb"><b>Step 2</b></a> - Remote Execution Agent Setup</td>
<td className="align-center"><a className="no-ext-icon" target="_blank" href="https://colab.research.google.com/github/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_2_Setting_Up_Agent.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a></td>
</tr>
<tr>
<td><a href="https://github.com/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_3_Remote_Execution.ipynb"><b>Step 3</b></a> - Remotely Execute Tasks</td>
<td><a href="https://github.com/clearml/clearml/blob/master/docs/tutorials/Getting_Started_3_Remote_Execution.ipynb"><b>Step 3</b></a> - Remotely Execute Tasks</td>
<td className="align-center"><a className="no-ext-icon" target="_blank" href="https://colab.research.google.com/github/allegroai/clearml/blob/master/docs/tutorials/Getting_Started_3_Remote_Execution.ipynb">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a></td>


@@ -39,7 +39,7 @@ required Python packages, and execute and monitor the process.
```
:::note
If you've already created credentials, you can copy-paste the default agent section from [here](https://github.com/allegroai/clearml-agent/blob/master/docs/clearml.conf#L15) (optional; if the section is not provided, the default values are used)
If you've already created credentials, you can copy-paste the default agent section from [here](https://github.com/clearml/clearml-agent/blob/master/docs/clearml.conf#L15) (optional; if the section is not provided, the default values are used)
:::
1. Start the agent's daemon and assign it to a [queue](../../fundamentals/agents_and_queues.md#what-is-a-queue):


@@ -32,6 +32,6 @@ It's essentially a toolbox stuffed with everything you'll need to go from experi
Doesn't matter if you're starting small or already in production, there's always a ClearML tool that can make your life easier.
Start for free at [app.clear.ml](https://app.clear.ml) or host your own server from our [GitHub page](https://github.com/allegroai/clearml-server).
Start for free at [app.clear.ml](https://app.clear.ml) or host your own server from our [GitHub page](https://github.com/clearml/clearml-server).
</Collapsible>


@@ -2,7 +2,7 @@
title: Remote Execution
---
The [execute_remotely_example](https://github.com/allegroai/clearml/blob/master/examples/advanced/execute_remotely_example.py)
The [execute_remotely_example](https://github.com/clearml/clearml/blob/master/examples/advanced/execute_remotely_example.py)
script demonstrates the use of the [`Task.execute_remotely()`](../../references/sdk/task.md#execute_remotely) method.
:::note


@@ -2,7 +2,7 @@
title: Multiple Tasks in Single Process
---
The [multiple_tasks_single_process](https://github.com/allegroai/clearml/blob/master/examples/advanced/multiple_tasks_single_process.py)
The [multiple_tasks_single_process](https://github.com/clearml/clearml/blob/master/examples/advanced/multiple_tasks_single_process.py)
script demonstrates the capability to log a single script in multiple ClearML tasks.
In order to log a script in multiple tasks, each task needs to be initialized using [`Task.init()`](../../references/sdk/task.md#taskinit)


@@ -2,13 +2,13 @@
title: Manual Random Parameter Search
---
The [manual_random_param_search_example.py](https://github.com/allegroai/clearml/blob/master/examples/automation/manual_random_param_search_example.py)
The [manual_random_param_search_example.py](https://github.com/clearml/clearml/blob/master/examples/automation/manual_random_param_search_example.py)
script demonstrates a random parameter search by automating the execution of a task multiple times, each time with
a different set of random hyperparameters.
This example accomplishes the automated random parameter search by doing the following:
1. Creating a template Task named `Keras HP optimization base`. To create it, run the [base_template_keras_simple.py](https://github.com/allegroai/clearml/blob/master/examples/optimization/hyper-parameter-optimization/base_template_keras_simple.py)
1. Creating a template Task named `Keras HP optimization base`. To create it, run the [base_template_keras_simple.py](https://github.com/clearml/clearml/blob/master/examples/optimization/hyper-parameter-optimization/base_template_keras_simple.py)
script. This task must be executed first, so it will be stored in the server, and then it can be accessed, cloned,
and modified by another Task.
1. Creating a parameter dictionary, which is connected to the Task by calling [`Task.connect()`](../../references/sdk/task.md#connect)


@@ -2,7 +2,7 @@
title: Programmatic Orchestration
---
The [programmatic_orchestration.py](https://github.com/allegroai/clearml/blob/master/examples/automation/programmatic_orchestration.py)
The [programmatic_orchestration.py](https://github.com/clearml/clearml/blob/master/examples/automation/programmatic_orchestration.py)
example demonstrates:
1. Creating an instance of a Task from a template Task.
1. Customizing that instance by changing the value of a parameter
@@ -11,7 +11,7 @@ example demonstrates:
This example accomplishes a task pipe by doing the following:
1. Creating the template Task which is named `Toy Base Task`. It must be stored in ClearML Server before instances of
it can be created. To create it, run another ClearML example script, [toy_base_task.py](https://github.com/allegroai/clearml/blob/master/examples/automation/toy_base_task.py).
it can be created. To create it, run another ClearML example script, [toy_base_task.py](https://github.com/clearml/clearml/blob/master/examples/automation/toy_base_task.py).
The template Task has a parameter dictionary, which is connected to the Task: `{'Example_Param': 1}`.
1. Back in `programmatic_orchestration.py`, creating a parameter dictionary, which is connected to the Task by calling [`Task.connect`](../../references/sdk/task.md#connect)
so that the parameters are logged by ClearML. The dictionary contains the name of the parameter from the template

View File

@@ -2,7 +2,7 @@
title: ClearML Task Tutorial
---
In this tutorial, you will use `clearml-task` to execute [a script](https://github.com/allegroai/events/blob/master/webinar-0620/keras_mnist.py)
In this tutorial, you will use `clearml-task` to execute [a script](https://github.com/clearml/events/blob/master/webinar-0620/keras_mnist.py)
on a remote or local machine, from a remote repository and your local machine.
### Prerequisites
@@ -13,13 +13,13 @@ on a remote or local machine, from a remote repository and your local machine.
### Executing Code from a Remote Repository
``` bash
clearml-task --project keras_examples --name remote_test --repo https://github.com/allegroai/events.git --branch master --script /webinar-0620/keras_mnist.py --args batch_size=64 epochs=1 --queue default
clearml-task --project keras_examples --name remote_test --repo https://github.com/clearml/events.git --branch master --script /webinar-0620/keras_mnist.py --args batch_size=64 epochs=1 --queue default
```
This sets the following arguments:
* `--project keras_examples --name remote_test` - The project and task names
* `--repo https://github.com/allegroai/events.git` - The repository's URL. By default, `clearml-task` uses the latest
* `--repo https://github.com/clearml/events.git` - The repository's URL. By default, `clearml-task` uses the latest
commit from the master branch
* `--branch master` - The repository branch
* `--script /webinar-0620/keras_mnist.py` - The script to be executed
@@ -48,8 +48,8 @@ Execution log at: https://app.clear.ml/projects/552d5399112d47029c146d5248570295
### Executing a Local Script
For this example, use a local version of [this script](https://github.com/allegroai/events/blob/master/webinar-0620/keras_mnist.py).
1. Clone the [allegroai/events](https://github.com/allegroai/events) repository
For this example, use a local version of [this script](https://github.com/clearml/events/blob/master/webinar-0620/keras_mnist.py).
1. Clone the [clearml/events](https://github.com/clearml/events) repository
1. Go to the root folder of the cloned repository
1. Run the following command:


@@ -4,13 +4,13 @@ title: Executable Task Containers
This tutorial demonstrates using [`clearml-agent`](../../clearml_agent.md)'s [`build`](../../clearml_agent/clearml_agent_ref.md#build)
command to package a task into an executable container. In this example, you will build a container image that, when
run, will automatically execute the [keras_tensorboard.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py)
run, will automatically execute the [keras_tensorboard.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py)
script.
## Prerequisites
* [`clearml-agent`](../../clearml_agent/clearml_agent_setup.md#installation) installed and configured
* [`clearml`](../../getting_started/ds/ds_first_steps.md#install-clearml) installed and configured
* [clearml](https://github.com/allegroai/clearml) repo cloned (`git clone https://github.com/allegroai/clearml.git`)
* [clearml](https://github.com/clearml/clearml) repo cloned (`git clone https://github.com/clearml/clearml.git`)
## Creating the ClearML Task
1. Set up the task's execution environment:


@@ -12,7 +12,7 @@ be used when running optimization tasks.
## Prerequisites
* [`clearml-agent`](../../clearml_agent/clearml_agent_setup.md#installation) installed and configured
* [`clearml`](../../getting_started/ds/ds_first_steps.md#install-clearml) installed and configured
* [clearml](https://github.com/allegroai/clearml) repo cloned (`git clone https://github.com/allegroai/clearml.git`)
* [clearml](https://github.com/clearml/clearml) repo cloned (`git clone https://github.com/clearml/clearml.git`)
## Creating the ClearML Task
1. Set up the task's execution environment:


@@ -2,7 +2,7 @@
title: PyTorch Distributed
---
The [pytorch_distributed_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/pytorch_distributed_example.py)
The [pytorch_distributed_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_distributed_example.py)
script demonstrates integrating ClearML into code that uses the [PyTorch Distributed Communications Package](https://pytorch.org/docs/stable/distributed.html)
(`torch.distributed`).


@@ -2,7 +2,7 @@
title: Subprocess
---
The [subprocess_example.py](https://github.com/allegroai/clearml/blob/master/examples/distributed/subprocess_example.py)
The [subprocess_example.py](https://github.com/clearml/clearml/blob/master/examples/distributed/subprocess_example.py)
script demonstrates multiple subprocesses interacting and reporting to a main Task. The following happens in the script:
* This script initializes a main Task and spawns subprocesses, each for an instance of that Task.
* Each Task in a subprocess references the main Task by calling [`Task.current_task()`](../../references/sdk/task.md#taskcurrent_task),


@@ -1,7 +1,7 @@
---
title: AutoKeras
---
The [autokeras_imdb_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/autokeras/autokeras_imdb_example.py) example
The [autokeras_imdb_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/autokeras/autokeras_imdb_example.py) example
script demonstrates the integration of ClearML into code, which uses [autokeras](https://github.com/keras-team/autokeras).
The example does the following:


@@ -2,7 +2,7 @@
title: CatBoost
---
The [catboost_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/catboost/catboost_example.py)
The [catboost_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/catboost/catboost_example.py)
example demonstrates the integration of ClearML into code that uses [CatBoost](https://catboost.ai).
The example script does the following:


@@ -1,11 +1,11 @@
---
title: Fast.ai
---
The [fastai_with_tensorboard_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/fastai/legacy/fastai_with_tensorboard_example.py)
The [fastai_with_tensorboard_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/fastai/legacy/fastai_with_tensorboard_example.py)
example demonstrates the integration of ClearML into code that uses FastAI v1 and TensorBoard.
:::note FastAI V2
The ClearML repository also includes [examples using FastAI v2](https://github.com/allegroai/clearml/tree/master/examples/frameworks/fastai).
The ClearML repository also includes [examples using FastAI v2](https://github.com/clearml/clearml/tree/master/examples/frameworks/fastai).
:::


@@ -2,7 +2,7 @@
title: Transformers
---
The [Hugging Face Transformers example](https://github.com/allegroai/clearml/blob/master/examples/frameworks/huggingface/transformers.ipynb)
The [Hugging Face Transformers example](https://github.com/clearml/clearml/blob/master/examples/frameworks/huggingface/transformers.ipynb)
demonstrates how to integrate ClearML into your Transformer's [Trainer](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/trainer)
code. The Hugging Face Trainer automatically uses the built-in [`ClearMLCallback`](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/callback#transformers.integrations.ClearMLCallback)
if the `clearml` package is already installed, to log Transformers models, parameters, scalars, and more.


@@ -2,7 +2,7 @@
title: Keras with Matplotlib - Jupyter Notebook
---
The [jupyter.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/jupyter.ipynb) example
The [jupyter.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/jupyter.ipynb) example
demonstrates ClearML's automatic logging of code running in a Jupyter Notebook that uses Keras and Matplotlib.
The example does the following:


@@ -3,11 +3,11 @@ title: Keras with TensorBoard
---
The example below demonstrates the integration of ClearML into code which uses Keras and TensorBoard.
View it in [script](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py)
or in [Jupyter Notebook](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/jupyter_keras_TB_example.ipynb).
View it in [script](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py)
or in [Jupyter Notebook](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/jupyter_keras_TB_example.ipynb).
:::note
The example in [Jupyter Notebook](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/jupyter_keras_TB_example.ipynb)
The example in [Jupyter Notebook](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/jupyter_keras_TB_example.ipynb)
includes a clickable icon to open the notebook in Google Colab.
:::


@@ -2,7 +2,7 @@
title: LightGBM
---
The [lightgbm_example](https://github.com/allegroai/clearml/blob/master/examples/frameworks/lightgbm/lightgbm_example.py)
The [lightgbm_example](https://github.com/clearml/clearml/blob/master/examples/frameworks/lightgbm/lightgbm_example.py)
script demonstrates the integration of ClearML into code that uses LightGBM.
The example script does the following:


@@ -5,11 +5,11 @@ title: Matplotlib
The example below demonstrates integrating ClearML into code that uses `matplotlib` to plot scatter diagrams, and
show images. ClearML automatically logs the diagrams and images.
View the example in [script](https://github.com/allegroai/clearml/blob/master/examples/frameworks/matplotlib/matplotlib_example.py)
or in [Jupyter Notebook](https://github.com/allegroai/clearml/blob/master/examples/frameworks/matplotlib/jupyter_matplotlib_example.ipynb).
View the example in [script](https://github.com/clearml/clearml/blob/master/examples/frameworks/matplotlib/matplotlib_example.py)
or in [Jupyter Notebook](https://github.com/clearml/clearml/blob/master/examples/frameworks/matplotlib/jupyter_matplotlib_example.ipynb).
:::note
The example in [Jupyter Notebook](https://github.com/allegroai/clearml/blob/master/examples/frameworks/matplotlib/jupyter_matplotlib_example.ipynb)
The example in [Jupyter Notebook](https://github.com/clearml/clearml/blob/master/examples/frameworks/matplotlib/jupyter_matplotlib_example.ipynb)
includes a clickable icon to open the notebook in Google Colab.
:::


@@ -2,7 +2,7 @@
title: MegEngine
---
The [megengine_mnist.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/megengine/megengine_mnist.py)
The [megengine_mnist.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/megengine/megengine_mnist.py)
example demonstrates the integration of ClearML into code that uses [MegEngine](https://github.com/MegEngine/MegEngine)
and [TensorBoardX](https://github.com/lanpa/tensorboardX). ClearML automatically captures models saved with `megengine`.


@@ -2,7 +2,7 @@
title: PyTorch Model Updating
---
The [pytorch_model_update.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/pytorch_model_update.py)
The [pytorch_model_update.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_model_update.py)
example demonstrates training a model and logging it using the [OutputModel](../../../references/sdk/model_outputmodel.md)
class.


@@ -2,7 +2,7 @@
title: Audio Classification - Jupyter Notebooks
---
The [audio_classification_UrbanSound8K.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/audio/audio_classifier_UrbanSound8K.ipynb) example script demonstrates integrating ClearML into a Jupyter Notebook which uses PyTorch, TensorBoard, and TorchVision to train a neural network on the UrbanSound8K dataset for audio classification. The example calls TensorBoard methods in training and testing to report scalars, audio debug samples, and spectrogram visualizations. The spectrogram visualizations are plotted by calling Matplotlib methods. The example also demonstrates connecting parameters to a Task and logging them. When the script runs, it creates a task named `audio classification UrbanSound8K` in the `Audio Example` project.
The [audio_classification_UrbanSound8K.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/audio/audio_classifier_UrbanSound8K.ipynb) example script demonstrates integrating ClearML into a Jupyter Notebook which uses PyTorch, TensorBoard, and TorchVision to train a neural network on the UrbanSound8K dataset for audio classification. The example calls TensorBoard methods in training and testing to report scalars, audio debug samples, and spectrogram visualizations. The spectrogram visualizations are plotted by calling Matplotlib methods. The example also demonstrates connecting parameters to a Task and logging them. When the script runs, it creates a task named `audio classification UrbanSound8K` in the `Audio Example` project.
## Scalars


@@ -2,7 +2,7 @@
title: Audio Preprocessing - Jupyter Notebook
---
The example [audio_preprocessing_example.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/audio/audio_preprocessing_example.ipynb)
The example [audio_preprocessing_example.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/audio/audio_preprocessing_example.ipynb)
demonstrates integrating ClearML into a Jupyter Notebook which uses PyTorch and preprocesses audio samples. ClearML automatically logs spectrogram visualizations reported by calling Matplotlib methods, and audio samples reported by calling TensorBoard methods. The example also demonstrates connecting parameters to a Task and logging them. When the script runs, it creates a task named `data pre-processing` in the `Audio Example` project.
## Plots


@@ -2,7 +2,7 @@
title: Image Hyperparameter Optimization - Jupyter Notebook
---
[hyperparameter_search.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/hyperparameter_search.ipynb)
[hyperparameter_search.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/hyperparameter_search.ipynb)
demonstrates using ClearML's [HyperParameterOptimizer](../../../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
class to perform automated hyperparameter optimization (HPO).
@@ -12,7 +12,7 @@ The example maximizes total accuracy by finding an optimal number of epochs, bat
automatically logs the optimization's top performing tasks.
The task whose hyperparameters are optimized is named `image_classification_CIFAR10`. It is created by running another
ClearML example, [image_classification_CIFAR10.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/image_classification_CIFAR10.ipynb),
ClearML example, [image_classification_CIFAR10.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/image_classification_CIFAR10.ipynb),
which must run before `hyperparameter_search.ipynb`.
The optimizer Task, `Hyperparameter Optimization`, and the tasks appear individually in the [ClearML Web UI](../../../../../webapp/webapp_overview.md).


@@ -2,7 +2,7 @@
title: Image Classification - Jupyter Notebook
---
The example [image_classification_CIFAR10.ipynb](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/image_classification_CIFAR10.ipynb)
The example [image_classification_CIFAR10.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/image/image_classification_CIFAR10.ipynb)
demonstrates integrating ClearML into a Jupyter Notebook, which uses PyTorch, TensorBoard, and TorchVision to train a
neural network on the CIFAR10 dataset for image classification. ClearML automatically logs the example script's
calls to TensorBoard methods in training and testing which report scalars and image debug samples, as well as the model


@@ -2,9 +2,9 @@
title: Tabular Data Downloading and Preprocessing - Jupyter Notebook
---
The [download_and_preprocessing.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/download_and_preprocessing.ipynb) example demonstrates ClearML storing preprocessed tabular data as artifacts, and explicitly reporting the tabular data in the **ClearML Web UI**. When the script runs, it creates a task named `tabular preprocessing` in the `Table Example` project.
This tabular data is prepared for another script, [train_tabular_predictor.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/train_tabular_predictor.ipynb), which trains a network with it.
## Artifacts


@ -9,17 +9,17 @@ class.
The pipeline uses four Tasks (each Task is created using a different notebook):
* The pipeline controller Task ([tabular_ml_pipeline.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/tabular_ml_pipeline.ipynb))
* A data preprocessing Task ([preprocessing_and_encoding.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/preprocessing_and_encoding.ipynb))
* A training Task ([train_tabular_predictor.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/train_tabular_predictor.ipynb))
* A better model comparison Task ([pick_best_model.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/pick_best_model.ipynb))
The `PipelineController` class includes functionality to create a pipeline controller, add steps to the pipeline, pass data from one step to another, control the dependencies of a step beginning only after other steps complete, run the pipeline, wait for it to complete, and cleanup afterwards.
In this pipeline example, the data preprocessing Task and training Task are each added to the pipeline twice (each is in two steps). When the pipeline runs, the data preprocessing Task and training Task are cloned twice, and the newly cloned Tasks execute. The Task they are cloned from, called the base Task, does not execute. The pipeline controller passes different data to each cloned Task by overriding parameters. In this way, the same Task can run more than once in the pipeline, but with different data.
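The clone-and-override pattern described above can be sketched in plain Python (no ClearML server needed; `preprocess` and the parameter names here are illustrative, not part of the ClearML API):

```python
# Illustrative sketch of reusing one base step twice with different
# parameter overrides, as the pipeline controller does when it clones
# the base Task. Plain Python only; all names are hypothetical.
def preprocess(base_config, overrides):
    cfg = {**base_config, **overrides}          # clone the base, apply overrides
    return "processed {} with test_size={}".format(cfg["dataset"], cfg["test_size"])

base = {"dataset": "iris", "test_size": 0.2}    # the "base Task" configuration
runs = [
    preprocess(base, {"test_size": 0.1}),       # first cloned step
    preprocess(base, {"dataset": "wine"}),      # second cloned step, different data
]
```

Each call leaves `base` untouched, mirroring how the base Task itself never executes.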
:::note Download Data
The data download Task is not a step in the pipeline, see [download_and_split](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/download_and_split.ipynb).
:::
## Pipeline Controller and Steps
@ -246,17 +246,17 @@ By hovering over a step or path between nodes, you can view information about it
**To run the pipeline:**
1. Download the data by running the notebook [download_and_split.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/download_and_split.ipynb).
1. Run the script for each of the steps, if the script has not run once before.
* [preprocessing_and_encoding.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/preprocessing_and_encoding.ipynb)
* [train_tabular_predictor.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/train_tabular_predictor.ipynb)
* [pick_best_model.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/pick_best_model.ipynb).
1. Run the pipeline controller in one of the following two ways:
* Run the notebook [tabular_ml_pipeline.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/table/tabular_ml_pipeline.ipynb).
* Remotely execute the Task - If the Task `tabular training pipeline` which is associated with the project `Tabular Example` already exists in ClearML Server, clone it and enqueue it to execute.


@ -2,7 +2,7 @@
title: Text Classification - Jupyter Notebook
---
The example [text_classification_AG_NEWS.ipynb](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/notebooks/text/text_classification_AG_NEWS.ipynb)
demonstrates using Jupyter Notebook for ClearML, and the integration of ClearML into code which trains a network
to classify text in the `torchtext` [AG_NEWS](https://pytorch.org/text/stable/datasets.html#ag-news) dataset, and then applies the model to predict the classification of sample text.


@ -2,7 +2,7 @@
title: PyTorch Abseil
---
The [pytorch_abseil.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_abseil.py)
example demonstrates the integration of ClearML into code that uses PyTorch and [`absl.flags`](https://abseil.io/docs/python/guides/flags).
The example script does the following:


@ -2,7 +2,7 @@
title: PyTorch Distributed
---
The [pytorch_distributed_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_distributed_example.py)
script demonstrates integrating ClearML into code that uses the [PyTorch Distributed Communications Package](https://pytorch.org/docs/stable/distributed.html)
(`torch.distributed`).


@ -2,7 +2,7 @@
title: PyTorch with Matplotlib
---
The [pytorch_matplotlib.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_matplotlib.py)
example demonstrates the integration of ClearML into code that uses PyTorch and Matplotlib.
The example does the following:


@ -2,7 +2,7 @@
title: PyTorch MNIST
---
The [pytorch_mnist.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py) example
demonstrates the integration of ClearML into code that uses PyTorch.
The example script does the following:


@ -2,7 +2,7 @@
title: PyTorch with TensorBoard
---
The [pytorch_tensorboard.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_tensorboard.py)
example demonstrates the integration of ClearML into code that uses PyTorch and TensorBoard.
The example does the following:


@ -2,7 +2,7 @@
title: PyTorch TensorBoardX
---
The [pytorch_tensorboardX.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorboardx/pytorch_tensorboardX.py)
example demonstrates the integration of ClearML into code that uses PyTorch and TensorBoardX.
The example does the following:


@ -2,7 +2,7 @@
title: PyTorch TensorBoard Toy
---
The [tensorboard_toy_pytorch.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/tensorboard_toy_pytorch.py)
example demonstrates the integration of ClearML into code, which creates a TensorBoard `SummaryWriter` object to log
debug sample images. When the script runs, it creates a task named `pytorch tensorboard toy example`, which is
associated with the `examples` project.


@ -2,7 +2,7 @@
title: PyTorch Ignite TensorboardLogger
---
The [cifar_ignite.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/ignite/cifar_ignite.py) example
script integrates ClearML into code that uses [PyTorch Ignite](https://github.com/pytorch/ignite).
The example script does the following:


@ -2,7 +2,7 @@
title: PyTorch Lightning
---
The [pytorch-lightning](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch-lightning/pytorch_lightning_example.py)
script demonstrates the integration of ClearML into code that uses [PyTorch Lightning](https://www.pytorchlightning.ai/).
The example script does the following:


@ -2,7 +2,7 @@
title: scikit-learn with Joblib
---
The [sklearn_joblib_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/scikit-learn/sklearn_joblib_example.py)
demonstrates the integration of ClearML into code that uses `scikit-learn` and `joblib` to store a model and model snapshots,
and `matplotlib` to create a scatter diagram. When the script runs, it creates a task named
`scikit-learn joblib example` in the `examples` project.


@ -2,7 +2,7 @@
title: scikit-learn with Matplotlib
---
The [sklearn_matplotlib_example.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/scikit-learn/sklearn_matplotlib_example.py)
script demonstrates the integration of ClearML into code that uses `scikit-learn` and `matplotlib`.
The example does the following:


@ -2,7 +2,7 @@
title: TensorBoardX with PyTorch
---
The [pytorch_tensorboardX.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorboardx/pytorch_tensorboardX.py)
example demonstrates the integration of ClearML into code that uses PyTorch and TensorBoardX.
The script does the following:


@ -2,7 +2,7 @@
title: TensorBoardX Video
---
The [moviepy_tensorboardx.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorboardx/moviepy_tensorboardx.py)
example demonstrates the integration of ClearML into code, which creates a TensorBoardX `SummaryWriter` object to log
video data.


@ -8,12 +8,12 @@ instructions.
:::
Integrate ClearML into code that uses [Keras Tuner](https://www.tensorflow.org/tutorials/keras/keras_tuner). By
specifying `ClearMLTunerLogger` (see [kerastuner.py](https://github.com/clearml/clearml/blob/master/clearml/external/kerastuner.py))
as the Keras Tuner logger, ClearML automatically logs scalars and hyperparameter optimization.
## ClearMLTunerLogger
Take a look at the [keras_tuner_cifar.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/kerastuner/keras_tuner_cifar.py)
example script, which demonstrates the integration of ClearML into code that uses Keras Tuner.
The script does the following:


@ -2,7 +2,7 @@
title: TensorBoard PR Curve
---
The [tensorboard_pr_curve.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorflow/tensorboard_pr_curve.py)
example demonstrates the integration of ClearML into code that uses TensorFlow and TensorBoard.
The example script does the following:


@ -2,7 +2,7 @@
title: TensorBoard Toy
---
The [tensorboard_toy.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorflow/tensorboard_toy.py)
example demonstrates ClearML's automatic logging of TensorBoard scalars, histograms, images, and text, as well as
all other console output and TensorFlow Definitions.


@ -2,7 +2,7 @@
title: TensorFlow MNIST
---
The [tensorflow_mnist.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/tensorflow/tensorflow_mnist.py)
example demonstrates the integration of ClearML into code that uses TensorFlow and Keras to train a neural network on
the Keras built-in [MNIST](https://www.tensorflow.org/api_docs/python/tf/keras/datasets/mnist) handwritten digits dataset.


@ -2,7 +2,7 @@
title: XGBoost Metrics
---
The [xgboost_metrics.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/xgboost/xgboost_metrics.py)
example demonstrates the integration of ClearML into code that uses XGBoost to train a network on the scikit-learn [iris](https://scikit-learn.org/stable/modules/generated/sklearn.datasets.load_iris.html#sklearn.datasets.load_iris)
classification dataset. ClearML automatically captures models and scalars logged with XGBoost.


@ -2,7 +2,7 @@
title: XGBoost and scikit-learn
---
The [xgboost_sample.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/xgboost/xgboost_sample.py)
example demonstrates integrating ClearML into code that uses [XGBoost](https://xgboost.readthedocs.io/en/stable/).
The example does the following:


@ -20,7 +20,7 @@ and running, users can send Tasks to be executed on Google Colab's hardware.
1. Run the first cell, which installs all the necessary packages:
```
!pip install git+https://github.com/clearml/clearml
!pip install clearml-agent
```
1. Run the second cell, which exports this environment variable:


@ -17,7 +17,7 @@ private credentials (assuming the entire code base, including `.git` already exi
## Installation
1. Download the latest plugin version from the [Releases page](https://github.com/clearml/clearml-pycharm-plugin/releases).
1. Install the plugin in PyCharm from local disk:


@ -93,13 +93,13 @@ Now, let's execute some code in the remote session!
1. Open up a new Notebook.
1. In the first cell of the notebook, clone the [ClearML repository](https://github.com/clearml/clearml):
```
!git clone https://github.com/clearml/clearml.git
```
1. In the second cell of the notebook, run this [script](https://github.com/clearml/clearml/blob/master/examples/frameworks/keras/keras_tensorboard.py)
from the cloned repository:
```


@ -7,7 +7,7 @@ slug: /guides
To help learn and use ClearML, we provide example scripts that demonstrate how to use ClearML's various features.
Example scripts are in the [examples](https://github.com/clearml/clearml/tree/master/examples) folder of the GitHub `clearml`
repository. They are also preloaded in the **ClearML Server**.
Each examples folder in the GitHub ``clearml`` repository contains a ``requirements.txt`` file for example scripts in that folder.


@ -2,7 +2,7 @@
title: Hyperparameter Optimization
---
The [hyper_parameter_optimizer.py](https://github.com/clearml/clearml/blob/master/examples/optimization/hyper-parameter-optimization/hyper_parameter_optimizer.py)
example script demonstrates hyperparameter optimization (HPO), which is automated by using ClearML.
## Set the Search Strategy for Optimization
@ -24,7 +24,7 @@ The following search strategies can be used:
* Random uniform sampling of hyperparameter strategy - [`automation.RandomSearch`](../../../references/sdk/hpo_optimization_randomsearch.md)
* Full grid sampling strategy of every hyperparameter combination - [`automation.GridSearch`](../../../references/sdk/hpo_optimization_gridsearch.md).
* Custom - Use a custom class and inherit from the ClearML automation base strategy class, [`SearchStrategy`](https://github.com/clearml/clearml/blob/master/clearml/automation/optimization.py#L310)
The search strategy class that is chosen will be passed to the [`automation.HyperParameterOptimizer`](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
object later.
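As a concept illustration of the first strategy listed above, here is a minimal random uniform search in plain Python (no ClearML required; the real `automation.RandomSearch` adds Task cloning, budgets, and early stopping on top of this idea):

```python
import random

# Minimal sketch of random uniform sampling over a hyperparameter space.
# `space` maps each hyperparameter name to a (low, high) range, and
# `objective` scores a sampled configuration (lower is better here).
def random_search(space, objective, trials=20, seed=0):
    rng = random.Random(seed)
    best_score, best_params = None, None
    for _ in range(trials):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = objective(params)
        if best_score is None or score < best_score:
            best_score, best_params = score, params
    return best_score, best_params

score, params = random_search({"lr": (1e-4, 1e-1)}, lambda p: abs(p["lr"] - 0.01))
```

A real optimizer would launch each sampled configuration as a cloned Task rather than calling the objective in-process.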


@ -2,20 +2,20 @@
title: Pipeline from Tasks
---
The [pipeline_from_tasks.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_tasks.py)
example demonstrates a simple pipeline, where each step is a [ClearML Task](../../fundamentals/task.md).
The pipeline is implemented using the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md)
class. Steps are added to a PipelineController object, which launches and monitors the steps when executed.
This example incorporates four tasks, each of which is created using a different script:
* **Controller Task** ([pipeline_from_tasks.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_tasks.py)) -
Implements the pipeline controller, adds the steps (tasks) to the pipeline, and runs the pipeline.
* **Step 1** ([step1_dataset_artifact.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step1_dataset_artifact.py)) -
Downloads data and stores the data as an artifact.
* **Step 2** ([step2_data_processing.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step2_data_processing.py)) -
Loads the stored data (from Step 1), processes it, and stores the processed data as artifacts.
* **Step 3** ([step3_train_model.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step3_train_model.py)) -
Loads the processed data (from Step 2) and trains a network.
Loads the processed data (from Step 2) and trains a network.
When the controller task is executed, it clones the step tasks, and enqueues the newly cloned tasks for execution. Note
@ -100,7 +100,7 @@ The sections below describe in more detail what happens in the controller task a
## Step 1 - Downloading the Data
The pipeline's first step ([step1_dataset_artifact.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step1_dataset_artifact.py))
does the following:
1. Download data using [`StorageManager.get_local_copy()`](../../references/sdk/storage.md#storagemanagerget_local_copy):
@ -108,7 +108,7 @@ does the following:
```python
# simulate local dataset, download one, so we have something local
local_iris_pkl = StorageManager.get_local_copy(
remote_url='https://github.com/clearml/events/raw/master/odsc20-east/generic/iris_dataset.pkl'
)
```
1. Store the data as an artifact named `dataset` using [`Task.upload_artifact()`](../../references/sdk/task.md#upload_artifact):
@ -119,7 +119,7 @@ does the following:
## Step 2 - Processing the Data
The pipeline's second step ([step2_data_processing.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step2_data_processing.py))
does the following:
1. Connect its configuration parameters with the ClearML task:
@ -154,7 +154,7 @@ does the following:
## Step 3 - Training the Network
The pipeline's third step ([step3_train_model.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/step3_train_model.py))
does the following:
1. Connect its configuration parameters with the ClearML task. This allows the [pipeline controller](#the-pipeline-controller)
to override the `dataset_task_id` value as the pipeline is run.


@ -2,7 +2,7 @@
title: Pipeline from Decorators
---
The [pipeline_from_decorator.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_decorator.py)
example demonstrates the creation of a pipeline in ClearML using the [`PipelineDecorator`](../../references/sdk/automation_controller_pipelinecontroller.md#class-automationcontrollerpipelinedecorator)
class.
@ -58,7 +58,7 @@ PipelineDecorator.set_default_execution_queue('default')
# PipelineDecorator.debug_pipeline()
executing_pipeline(
pickle_url='https://github.com/clearml/events/raw/master/odsc20-east/generic/iris_dataset.pkl',
)
```


@ -2,7 +2,7 @@
title: Pipeline from Functions
---
The [pipeline_from_functions.py](https://github.com/clearml/clearml/blob/master/examples/pipeline/pipeline_from_functions.py)
example script demonstrates the creation of a pipeline using the [PipelineController](../../references/sdk/automation_controller_pipelinecontroller.md)
class.
@ -45,7 +45,7 @@ the function will be automatically logged as required packages for the pipeline
pipe.add_parameter(
name='url',
description='url to pickle file',
default='https://github.com/clearml/events/raw/master/odsc20-east/generic/iris_dataset.pkl'
)
```


@ -2,7 +2,7 @@
title: 3D Plots Reporting
---
The [3d_plots_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/3d_plots_reporting.py)
example demonstrates reporting a series as a surface plot and as a 3D scatter plot.
When the script runs, it creates a task named `3D plot reporting` in the `examples` project.


@ -2,7 +2,7 @@
title: Artifacts Reporting
---
The [artifacts.py](https://github.com/clearml/clearml/blob/master/examples/reporting/artifacts.py) example demonstrates
uploading objects (other than models) to storage as task artifacts.
These artifacts include:


@ -2,7 +2,7 @@
title: Using Logger - Jupyter Notebook
---
The [jupyter_logging_example.ipynb](https://github.com/clearml/clearml/blob/master/examples/reporting/jupyter_logging_example.ipynb)
script demonstrates the integration of ClearML's explicit reporting module, `Logger`, in a Jupyter Notebook. All ClearML
explicit reporting works with Jupyter Notebook.


@ -4,7 +4,7 @@ title: Explicit Reporting Tutorial
In this tutorial, learn how to extend ClearML's automagical capturing of inputs and outputs with explicit reporting.
In this example, you will add the following to the [pytorch_mnist.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py)
example script from ClearML's GitHub repo:
* Setting an output destination for model checkpoints (snapshots).
@ -14,12 +14,12 @@ example script from ClearML's GitHub repo:
## Prerequisites
* The [clearml](https://github.com/clearml/clearml) repository is cloned.
* The `clearml` package is installed.
## Before Starting
Make a copy of [pytorch_mnist.py](https://github.com/clearml/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py)
to add explicit reporting to it.
```bash


@ -2,7 +2,7 @@
title: HTML Reporting
---
The [html_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py) example
demonstrates reporting local HTML files and HTML by URL using [`Logger.report_media()`](../../references/sdk/logger.md#report_media).
ClearML reports these HTML debug samples in the **ClearML Web UI** **>** task's **DEBUG SAMPLES** tab.
@ -15,7 +15,7 @@ When the script runs, it creates a task named `html samples reporting` in the `e
Report HTML by URL using [`Logger.report_media()`](../../references/sdk/logger.md#report_media)'s `url` parameter.
See the example script's [`report_html_url`](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py#L16)
function, which reports the ClearML documentation's home page.
```python
@ -37,7 +37,7 @@ Report the following using `Logger.report_media()`'s `local_path` parameter:
### Interactive HTML
See the example script's [`report_html_periodic_table`](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py#L26) function, which reports a file created from Bokeh sample data.
```python
Logger.current_logger().report_media(
title="html",
@ -49,7 +49,7 @@ Logger.current_logger().report_media(
### Bokeh GroupBy HTML
See the example script's [`report_html_groupby`](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py#L117) function, which reports a Pandas GroupBy with nested HTML, created from Bokeh sample data.
```python
Logger.current_logger().report_media(
title="html",
@ -61,7 +61,7 @@ Logger.current_logger().report_media(
### Bokeh Graph HTML
See the example script's [`report_html_graph`](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py#L162) function, which reports a Bokeh plot created from Bokeh sample data.
```python
Logger.current_logger().report_media(
@ -74,7 +74,7 @@ Logger.current_logger().report_media(
### Bokeh Image HTML
See the example script's [`report_html_image`](https://github.com/clearml/clearml/blob/master/examples/reporting/html_reporting.py#L195) function, which reports an image created from Bokeh sample data.
```python
Logger.current_logger().report_media(


@ -2,7 +2,7 @@
title: Hyperparameters Reporting
---
-The [hyper_parameters.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/hyper_parameters.py) example
+The [hyper_parameters.py](https://github.com/clearml/clearml/blob/master/examples/reporting/hyper_parameters.py) example
script demonstrates:
* ClearML's automatic logging of `argparse` command line options and TensorFlow Definitions
* Logging user-defined hyperparameters with a parameter dictionary and connecting the dictionary to a Task.

View File

@@ -2,7 +2,7 @@
title: Image Reporting
---
-The [image_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/image_reporting.py) example
+The [image_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/image_reporting.py) example
demonstrates reporting (uploading) images in several formats, including:
* NumPy arrays
* uint8

View File

@@ -2,7 +2,7 @@
title: Manual Matplotlib Reporting
---
-The [matplotlib_manual_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/matplotlib_manual_reporting.py)
+The [matplotlib_manual_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/matplotlib_manual_reporting.py)
example demonstrates using ClearML to log plots and images generated by Matplotlib and Seaborn.
## Plots

View File

@@ -2,7 +2,7 @@
title: Media Reporting
---
-The [media_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/media_reporting.py) example
+The [media_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/media_reporting.py) example
demonstrates reporting (uploading) images, audio, and video. Use [`Logger.report_media()`](../../references/sdk/logger.md#report_media)
to upload from:
* Local path

View File

@@ -2,7 +2,7 @@
title: Model Reporting
---
-The [model_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/model_reporting.py) example
+The [model_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/model_reporting.py) example
demonstrates logging a model using the [OutputModel](../../references/sdk/model_outputmodel.md)
class.

View File

@@ -2,7 +2,7 @@
title: Tables Reporting (Pandas and CSV Files)
---
-The [pandas_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/pandas_reporting.py) example demonstrates reporting tabular data from Pandas DataFrames and CSV files as tables.
+The [pandas_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/pandas_reporting.py) example demonstrates reporting tabular data from Pandas DataFrames and CSV files as tables.
ClearML reports these tables, and displays them in the **ClearML Web UI** **>** task's **PLOTS**
tab.

View File

@@ -2,7 +2,7 @@
title: Plotly Reporting
---
-The [plotly_reporting.py](https://github.com/allegroai/clearml/blob/master/examples/reporting/plotly_reporting.py) example
+The [plotly_reporting.py](https://github.com/clearml/clearml/blob/master/examples/reporting/plotly_reporting.py) example
demonstrates ClearML's Plotly integration and reporting.
Report Plotly plots in ClearML by calling the [`Logger.report_plotly()`](../../references/sdk/logger.md#report_plotly) method, and passing a complex

Some files were not shown because too many files have changed in this diff.
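The changes above are a repository-wide link migration from the `allegroai` GitHub organization to `clearml`. A change of this scale (190 files) is typically scripted rather than hand-edited; a minimal sketch using `grep` and `sed` is shown below. The directory and file names here are illustrative only and do not come from the commit.

```shell
# Hypothetical sketch: rewrite GitHub org URLs from allegroai to clearml
# across a docs tree, as this commit does. Demo paths are illustrative.
set -eu
demo_dir="/tmp/clearml_link_demo"
mkdir -p "$demo_dir"
printf 'See https://github.com/allegroai/clearml/blob/master/README.md\n' > "$demo_dir/page.md"

# Find files containing the old org path, then rewrite the path in place.
# `sed -i.bak` keeps a backup and works with both GNU and BSD sed.
grep -rl 'github.com/allegroai/' "$demo_dir" \
  | xargs sed -i.bak 's|github.com/allegroai/|github.com/clearml/|g'

cat "$demo_dir/page.md"
```

Using a delimiter other than `/` in the `sed` expression (here `|`) avoids escaping the slashes in the URL path.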