Update documentation

This commit is contained in:
allegroai 2021-05-19 01:31:01 +03:00
parent 039d62cdca
commit 3e1916c620
61 changed files with 220 additions and 397 deletions

View File

@ -53,14 +53,15 @@ To run a session inside a Docker container, use the `--docker` flag and enter th
session.
### Passing requirements
`clearml-session` can download required Python packages. If the code you are going to execute in the remote session, has
required packages, they can be specified. If there is a `requirement.txt` file, the file can be attached to the
command using `--requirements </file/location.txt>`. Alternatively, packages can be manually entered, using `--packages "<package_name>"`
(for example `--packages "keras" "clearml"`).
`clearml-session` can download required Python packages.
A `requirements.txt` file can be attached to the command using `--requirements </file/location.txt>`.
Alternatively, packages can be manually specified using `--packages "<package_name>"`
(for example `--packages "keras" "clearml"`) and they'll be automatically installed.
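For example (hypothetical invocations; the flags are as described above, and the file path and package names are placeholders):

```shell
# Attach a pip requirements file to the remote session
clearml-session --requirements ./requirements.txt

# Or list the packages explicitly; they are installed on the remote machine
clearml-session --packages "keras" "clearml"
```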
### Passing Git credentials
To send local .git-credentials file to the interactive session, add a `--git-credentials` flag and set it to `True`.
This way, git references can be tracked, including untracked changes.
This is helpful when working on private Git repositories, and allows for seamless cloning and tracking of Git references,
including untracked changes.
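As a sketch of the flag described above (other session arguments omitted):

```shell
# Forward the local .git-credentials file to the remote interactive session
clearml-session --git-credentials True
```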
### Re-launching and shutting down sessions
If a `clearml-session` was launched locally and is still running on a remote machine, users can easily reconnect to it.

View File

@ -1,3 +0,0 @@
---
title: Remote Pycharm Debugging
---

View File

@ -503,4 +503,14 @@ clearml-agent daemon --services-mode --queue services --create-queue --docker <d
Do not enqueue training or inference Tasks into the services queue. They will put an unnecessary load on the server.
:::
### Setting Server Credentials
A self-hosted [ClearML Server](deploying_clearml/clearml_server.md) comes with a services queue by default.
By default, the server is open and does not require a username and password, but it can be [password protected](deploying_clearml/clearml_server_security.md#user-access-security).
If it is password protected, the services agent needs to be configured with server credentials (associated with a user).
To do that, set these environment variables on the ClearML Server machine with the appropriate credentials:
```
CLEARML_API_ACCESS_KEY
CLEARML_API_SECRET_KEY
```
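For example, on the ClearML Server machine (the key values below are placeholders; use credentials generated for the designated user):

```shell
# Credentials associated with a ClearML user account (placeholder values)
export CLEARML_API_ACCESS_KEY="<access_key>"
export CLEARML_API_SECRET_KEY="<secret_key>"
```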

View File

@ -11,16 +11,14 @@ ClearML Data Management solves two important challenges:
**We believe Data is not code**. It should not be stored in a git tree, because progress on datasets is not always linear.
Moreover, it can be difficult and inefficient to find on a git tree the commit associated with a certain version of a dataset.
The data usage in experiments needs to have high observability and needs to be understood, not just by data scientists.
`clearml-data` allows to easily create new flexible datasets, from which users can add and remove files. These datasets can
be retrieved simply from any machine with physical or network access to the data. Additionally, datasets can be set up to
inherit from other datasets, so data lineages can be created, and users can track when and how their data changes.
A `clearml-data` dataset is a collection of files, stored on a central storage location (S3 / GS / Azure / network storage).
Datasets can be set up to inherit from other datasets, so data lineages can be created,
and users can track when and how their data changes.<br/>
Dataset changes are stored using differential storage, meaning a version stores only the change-set from its parent datasets.
`clearml-data` utilizes existing object storage like S3/GS/Azure and even plain file system shares.
Datasets are stored in a binary differential format, allowing storage optimization and networking. Local copies
of datasets are always cached, so the same data never needs to be downloaded twice.
Local copies of datasets are always cached, so the same data never needs to be downloaded twice.
When a dataset is pulled, it automatically pulls all parent datasets and merges them into one output folder to work with.
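The differential-storage model above can be illustrated with a small self-contained sketch (plain Python, not the ClearML implementation): each version records only the files added, modified, or removed relative to its parent, and pulling a version replays the chain into one merged view.

```python
import hashlib


def file_hash(content: bytes) -> str:
    """Content hash used to detect modified files."""
    return hashlib.sha256(content).hexdigest()


def change_set(parent: dict, child: dict) -> dict:
    """Store only what changed relative to the parent version."""
    upsert = {
        name: data for name, data in child.items()
        if name not in parent or file_hash(parent[name]) != file_hash(data)
    }
    removed = [name for name in parent if name not in child]
    return {"upsert": upsert, "remove": removed}


def merge(parent: dict, delta: dict) -> dict:
    """Replay a change-set on top of the parent's merged view."""
    merged = dict(parent)
    merged.update(delta["upsert"])
    for name in delta["remove"]:
        merged.pop(name, None)
    return merged


# Version 1 (parent) and version 2 (child) of a tiny dataset
v1 = {"train.csv": b"a,b\n1,2\n", "labels.txt": b"cat\ndog\n"}
v2 = {"train.csv": b"a,b\n1,2\n3,4\n", "readme.md": b"v2\n"}

delta = change_set(v1, v2)   # only the changed/added files are stored
assert set(delta["upsert"]) == {"train.csv", "readme.md"}
assert delta["remove"] == ["labels.txt"]
assert merge(v1, delta) == v2  # "pulling" v2 reproduces the full dataset
```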
ClearML-data offers two interfaces:
- `clearml-data` - CLI utility for creating, uploading, and managing datasets.
@ -249,10 +247,11 @@ Datasets can be searched by project, name, ID, and tags.
### Python API
All API commands should be imported with<br/> `from clearml import Dataset`
All API commands should be imported with<br/>
`from clearml import Dataset`
#### `Dataset.get(dataset_id='<DS_ID>').get_local_copy()`
#### `Dataset.get(dataset_id=DS_ID).get_local_copy()`
Returns a path to the dataset in the cache, downloading it if it is not already cached.
@ -265,7 +264,7 @@ Returns a path to dataset in cache, and downloads it if it is not already in cac
<br/>
#### `Dataset.get(dataset_id='<DS_ID>').get_mutable_local_copy()`
#### `Dataset.get(dataset_id=DS_ID).get_mutable_local_copy()`
Downloads the dataset to a specific folder (non-cached). If the folder already has contents, specify whether to overwrite
its contents with the dataset contents.

View File

@ -32,7 +32,7 @@ title: FAQ
**Graphs and Logs**
* [The first log lines are missing from the experiment log tab. Where did they go?](#first-log-lines-missing)
* [The first log lines are missing from the experiment console tab. Where did they go?](#first-log-lines-missing)
* [Can I create a graph comparing hyperparameters vs model accuracy?](#compare-graph-parameters)
* [I want to add more graphs, not just with TensorBoard. Is this supported?](#more-graph-types)
* [How can I report more than one scatter 2D series on the same plot?](#multiple-scatter2D)
@ -357,12 +357,12 @@ values are `True`, `False`, and a dictionary for fine-grain control. See [Task.i
## Graphs and Logs
**The first log lines are missing from the experiment log tab. Where did they go?** <a id="first-log-lines-missing"></a>
**The first log lines are missing from the experiment console tab. Where did they go?** <a id="first-log-lines-missing"></a>
Due to speed/optimization issues, we opted to display only the last several hundred log lines.
You can always download the full log as a file using the **ClearML Web UI**. In the **ClearML Web UI** **>** experiment
info panel *>* **RESULTS** tab **>** **LOG** sub-tab, use the **Download full log** feature.
info panel **>** **RESULTS** tab **>** **CONSOLE** sub-tab, use the **Download full log** feature.
<br/>

View File

@ -62,8 +62,8 @@ All the hyperparameters appear in **CONFIGURATIONS** **>** **HYPER PARAMETERS**.
![image](../../img/examples_pytorch_distributed_example_01a.png)
## Log
## Console
Output to the console, including the text messages printed from the main Task object and each subprocess appear in **RESULTS** **>** **LOG**.
Output to the console, including the text messages printed from the main Task object and each subprocess appear in **RESULTS** **>** **CONSOLE**.
![image](../../img/examples_pytorch_distributed_example_06.png)

View File

@ -9,7 +9,7 @@ script demonstrates multiple subprocesses interacting and reporting to a main Ta
which always returns the main Task.
* The Task in each subprocess reports the following to the main Task:
* Hyperparameters - Additional, different hyperparameters.
* Log - Text logged to the console as the Task in each subprocess executes.
* Console - Text logged to the console as the Task in each subprocess executes.
* When the script runs, it creates an experiment named `Popen example` which is associated with the `examples` project.
## Hyperparameters
@ -28,8 +28,8 @@ Parameter dictionaries appear in **General**.
![image](../../img/examples_subprocess_example_01a.png)
## Log
## Console
Output to the console, including the text messages from the Task in each subprocess, appear in **RESULTS** **>** **LOG**.
Output to the console, including the text messages from the Task in each subprocess, appear in **RESULTS** **>** **CONSOLE**.
![image](../../img/examples_subprocess_example_02.png)

View File

@ -18,9 +18,9 @@ which are titled **:monitor: machine**.
![image](../../../img/examples_keras_16.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_keras_15.png)

View File

@ -23,6 +23,6 @@ Histograms output to TensorBoard. They appear in **RESULTS** **>** **PLOTS**.
## Logs
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_reporting_fastai_03.png)

View File

@ -36,7 +36,7 @@ Histograms for layer density appear in **RESULTS** **>** **PLOTS**.
## Log
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/keras_colab_01.png)

View File

@ -63,9 +63,9 @@ The TensorFlow Definitions appear in the **TF_DEFINE** subsection.
![image](../../../img/examples_keras_jupyter_21.png)
## Log
## Console
Text printed to the console for training appears in **RESULTS** **>** **LOG**.
Text printed to the console for training appears in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_keras_jupyter_07.png)

View File

@ -40,9 +40,9 @@ TensorFlow Definitions appear in **TF_DEFINE**.
![image](../../../img/examples_keras_00a.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_keras_03.png)

View File

@ -1,66 +0,0 @@
---
title: Manual Model Upload
---
The [manual_model_upload.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/keras/manual_model_upload.py)
example demonstrates **ClearML**'s tracking of a manually configured model created with Keras, including:
* Model checkpoints (snapshots)
* Hyperparameters
* Console output.
When the script runs, it creates an experiment named `Model configuration and upload`, which is associated with the `examples` project.
Configure **ClearML** for model checkpoint (snapshot) storage in any of the following ways ([debug sample](../../../references/sdk/logger.md#set_default_upload_destination)
storage is different):
* In the configuration file, set [default_output_uri](../../../configs/clearml_conf.md#sdkdevelopment).
* In code, when [initializing a Task](../../../references/sdk/task.md#taskinit), use the `output_uri` parameter.
* In the **ClearML Web UI**, when [modifying an experiment](../../../webapp/webapp_exp_tuning.md#output-destination).
## Configuration
This example shows two ways to connect a configuration, using the [Task.connect_configuration](../../../references/sdk/task.md#connect_configuration)
method.
* Connect a configuration file by providing the file's path. **ClearML Server** stores a copy of the file.
```python
# Connect a local configuration file
config_file = os.path.join('..', '..', 'reporting', 'data_samples', 'sample.json')
config_file = task.connect_configuration(config_file)
```
* Create a configuration dictionary and provide the dictionary.
```python
model_config_dict = {
'value': 13.37,
'dict': {'sub_value': 'string', 'sub_integer': 11},
'list_of_ints': [1, 2, 3, 4],
}
model_config_dict = task.connect_configuration(model_config_dict)
```
If the configuration changes, **ClearML** tracks it.
```python
model_config_dict['new value'] = 10
model_config_dict['value'] *= model_config_dict['new value']
```
The configuration appears in **CONFIGURATIONS** **>** **CONFIGURATION OBJECTS**.
![image](../../../img/examples_manual_model_upload_01.png)
## Artifacts
Model artifacts associated with the experiment appear in the experiment info panel (in the **EXPERIMENTS** tab), and in the model info panel (in the **MODELS** tab).
The experiment info panel shows model tracking, including the model name and design:
![image](../../../img/examples_manual_model_upload_02.png)
The model info panel contains the model details, including the model URL, framework, and snapshot locations.
![image](../../../img/examples_manual_model_upload_03.png)

View File

@ -1,79 +0,0 @@
---
title: Manual Model Upload
---
The [manual_model_upload.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/manual_model_upload.py)
example demonstrates **ClearML**'s tracking of a manually configured model created with PyTorch, including model checkpoints
(snapshots), and output to the console. When the script runs, it creates an experiment named `Model configuration and upload`,
which is associated with the `examples` project.
Configure **ClearML** for model checkpoint (snapshot) storage in any of the following ways ([debug sample](../../../references/sdk/logger.md#set_default_upload_destination) storage is different):
* In the configuration file, set [default_output_uri](../../../configs/clearml_conf.md#sdkdevelopment).
* In code, when [initializing a Task](../../../references/sdk/task.md#taskinit), use the `output_uri` parameter.
* In the **ClearML Web UI**, when [modifying an experiment](../../../webapp/webapp_exp_tuning.md#output-destination).
## Configuration
This example shows two ways to connect a configuration, using the [Task.connect_configuration](../../../references/sdk/task.md#connect_configuration)
method.
* Connect a configuration file by providing the file's path. **ClearML Server** stores a copy of the file.
```python
# Connect a local configuration file
config_file = os.path.join('..', '..', 'reporting', 'data_samples', 'sample.json')
config_file = task.connect_configuration(config_file)
```
* Create a configuration dictionary and plug it into the method.
```python
model_config_dict = {
'value': 13.37,
'dict': {'sub_value': 'string', 'sub_integer': 11},
'list_of_ints': [1, 2, 3, 4],
}
model_config_dict = task.connect_configuration(model_config_dict)
```
If the configuration changes, **ClearML** tracks it.
```python
model_config_dict['new value'] = 10
model_config_dict['value'] *= model_config_dict['new value']
```
## Artifacts
Model artifacts associated with the experiment appear in the info panel of the **EXPERIMENTS** tab and in the info panel
in the **MODELS** tab.
The model info panel contains model details, including:
* Model design (which is also in the experiment info panel)
* Label enumeration
* Model URL
* Framework
* Snapshot locations.
### General model information
![image](../../../img/examples_pytorch_manual_model_upload_03.png)
### Model design
![image](../../../img/examples_pytorch_manual_model_upload_04.png)
### Label enumeration
Connect a label enumeration dictionary by calling the [Task.connect_label_enumeration](../../../references/sdk/task.md#connect_label_enumeration)
method.
```python
# store the label enumeration of the training model
labels = {'background': 0, 'cat': 1, 'dog': 2}
task.connect_label_enumeration(labels)
```
![image](../../../img/examples_pytorch_manual_model_upload_05.png)

View File

@ -44,8 +44,8 @@ TensorFlow Definitions appear in the **TF_DEFINE** subsection.
![image](../../../../../img/examples_audio_classification_UrbanSound8K_01a.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../../../img/examples_audio_classification_UrbanSound8K_02.png)

View File

@ -71,9 +71,9 @@ These hyperparameters are those in the optimizer Task, where the `HyperParameter
![image](../../../../../img/examples_hyperparameter_search_01.png)
### Log
### Console
All console output from `Hyper-Parameter Optimization` appears in **RESULTS** tab, **LOG** sub-tab.
All console output from `Hyper-Parameter Optimization` appears in **RESULTS** tab, **CONSOLE** sub-tab.
![image](../../../../../img/examples_hyperparameter_search_03.png)

View File

@ -43,8 +43,8 @@ TensorFlow Definitions appear in the **TF_DEFINE** subsection.
![image](../../../../../img/examples_image_classification_CIFAR10_01a.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../../../img/examples_image_classification_CIFAR10_04.png)

View File

@ -46,8 +46,8 @@ Parameter dictionaries appear in the **General** subsection.
![image](../../../../../img/download_and_preprocessing_01.png)
## Log
## Console
Output to the console appears in **RESULTS** **>** **LOG**.
Output to the console appears in **RESULTS** **>** **CONSOLE**.
![image](../../../../../img/download_and_preprocessing_06.png)

View File

@ -29,9 +29,9 @@ Parameter dictionaries appear in the **General** subsection.
![image](../../../../../img/text_classification_AG_NEWS_01a.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../../../img/text_classification_AG_NEWS_02.png)

View File

@ -71,6 +71,6 @@ Task.current_task().connect(param)
## Log
Output to the console, including the text messages printed from the main Task object and each subprocess, appears in **RESULTS** **>** **LOG**.
Output to the console, including the text messages printed from the main Task object and each subprocess, appears in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_pytorch_distributed_example_06.png)

View File

@ -41,9 +41,9 @@ page **>** **RESULTS** **>** **SCALARS**.
![image](../../../img/examples_pytorch_mnist_01.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_pytorch_mnist_06.png)

View File

@ -33,9 +33,9 @@ These scalars, along with the resource utilization plots, which are titled **:mo
![image](../../../img/examples_pytorch_tensorboard_01.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_pytorch_tensorboard_06.png)

View File

@ -31,7 +31,7 @@ appear in the experiment's page in the **web UI**, under **RESULTS** **>** **SCA
## Log
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_pytorch_tensorboardx_02.png)

View File

@ -27,9 +27,9 @@ The loss and accuracy metric scalar plots appear in the experiment's page in the
![image](../../../img/examples_pytorch_tensorboardx_01.png)
## Log
## Console
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **LOG**.
Text printed to the console for training progress, as well as all other console output, appear in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_pytorch_tensorboardx_02.png)

View File

@ -1,86 +0,0 @@
---
title: Manual Model Upload
---
The [manual_model_upload.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorflow/manual_model_upload.py)
example demonstrates **ClearML**'s tracking of a manually configured model created with TensorFlow, including:
* Model checkpoints (snapshots)
* Hyperparameters
* Output to the console.
When the script runs, it creates an experiment named `Model configuration and upload`, which is associated with the `examples` project.
Configure **ClearML** for model checkpoint (snapshot) storage in any of the following ways ([debug sample](../../../references/sdk/logger.md#set_default_upload_destination)
storage is different):
* In the configuration file, set [default_output_uri](../../../configs/clearml_conf.md#sdkdevelopment).
* In code, when [initializing a Task](../../../references/sdk/task.md#taskinit), use the `output_uri` parameter.
* In the **ClearML Web UI**, when [modifying an experiment](../../../webapp/webapp_exp_tuning.md#output-destination).
## Configuration
This example shows two ways to connect a configuration, using the [Task.connect_configuration](../../../references/sdk/task.md#connect_configuration) method:
* Connect a configuration file by providing the file's path. **ClearML Server** stores a copy of the file.
```python
# Connect a local configuration file
config_file = os.path.join('..', '..', 'reporting', 'data_samples', 'sample.json')
config_file = task.connect_configuration(config_file)
```
* Create a configuration dictionary and provide the dictionary.
```python
model_config_dict = {
'value': 13.37,
'dict': {'sub_value': 'string', 'sub_integer': 11},
'list_of_ints': [1, 2, 3, 4],
}
model_config_dict = task.connect_configuration(model_config_dict)
```
If the configuration changes, **ClearML** tracks it.
```python
model_config_dict['new value'] = 10
model_config_dict['value'] *= model_config_dict['new value']
```
The configuration appears in the experiment's page in the **ClearML web UI**, under **CONFIGURATIONS** **>**
**CONFIGURATION OBJECTS**.
![image](../../../img/examples_manual_model_upload_01.png)
The output model's configuration appears in **ARTIFACTS** **>** **Output Model**.
## Artifacts
Model artifacts associated with the experiment appear in the info panel of the **EXPERIMENTS** tab and in the info panel
of the **MODELS** tab.
The experiment info panel shows model tracking, including the model name and design (in this case, no design was stored).
![image](../../../img/examples_manual_model_upload_30.png)
The model info panel contains the model details, including:
* Model design
* Label enumeration
* Model URL
* Framework
* Snapshot locations.
### General model information
![image](../../../img/examples_pytorch_manual_model_upload_03.png)
### Label enumeration
Connect a label enumeration dictionary by calling the [Task.connect_label_enumeration](../../../references/sdk/task.md#connect_label_enumeration) method.
```python
# store the label enumeration of the training model
labels = {'background': 0, 'cat': 1, 'dog': 2}
task.connect_label_enumeration(labels)
```
![image](../../../img/examples_pytorch_manual_model_upload_05.png)

View File

@ -31,8 +31,8 @@ In the **ClearML Web UI**, the PR Curve summaries appear in the experiment's pag
![image](../../../img/examples_tensorboard_pr_curve_04.png)
## Log
## Console
All other console output appears in **RESULTS** **>** **LOG**.
All other console output appears in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_tensorboard_pr_curve_05.png)

View File

@ -29,9 +29,9 @@ The loss and accuracy metric scalar plots appear in the experiment's page in the
![image](../../../img/examples_tensorflow_mnist_01.png)
## Log
## Console
All console output appears in **RESULTS** **>** **LOG**.
All console output appears in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_tensorflow_mnist_05.png)

View File

@ -31,9 +31,9 @@ The feature importance plot and tree plot appear in the project's page in the **
![image](../../../img/examples_xgboost_sample_06.png)
## Log
## Console
All other console output appears in **RESULTS** **>** **LOG**.
All other console output appears in **RESULTS** **>** **CONSOLE**.
![image](../../../img/examples_xgboost_sample_05.png)

View File

@ -14,7 +14,7 @@ The **ClearML PyCharm plugin** enables syncing a local execution configuration t
**To install the ClearML PyCharm plugin, do the following:**
1. Download the latest plugin version from the [Releases page](https://github.com/allegroai/trains-pycharm-plugin/releases).
1. Download the latest plugin version from the [Releases page](https://github.com/allegroai/clearml-pycharm-plugin/releases).
1. Install the plugin in PyCharm from local disk:

View File

@ -239,7 +239,7 @@ After extending the Python experiment script, run it and view the results in the
including the file path, size, hash, metadata, and a preview.
1. In the **OTHER** section, click **Loss**. The uploaded numpy array appears, including its related information.
1. Click the **RESULTS** tab.
1. Click the **LOG** sub-tab, and see the debugging message showing the Pandas DataFrame sample.
1. Click the **CONSOLE** sub-tab, and see the debugging message showing the Pandas DataFrame sample.
1. Click the **SCALARS** sub-tab, and see the scalar plots for epoch logging loss.
1. Click the **PLOTS** sub-tab, and see the confusion matrix and histogram.

View File

@ -54,7 +54,7 @@ It shows the server links are:
![image](../../img/examples_execute_jupyter_notebook_server_02.png)
The Jupyter Notebook Server console output appears in **RESULTS** **>** **LOG**, including log entries for the notebooks run
The Jupyter Notebook Server console output appears in **RESULTS** **>** **CONSOLE**, including log entries for the notebooks run
on the server.
To test the Jupyter Notebook, we ran a notebook named audio_preprocessing_example.ipynb. The log shows it was saved:

Binary file not shown (image, 101 KiB).

View File

@ -2,43 +2,17 @@
title: Storage
---
import ImageSwitcher from '/ImageSwitcher.js';
ClearML is able to interface with the most popular storage solutions in the market for storing model checkpoints, artifacts
and charts.
Supported storage mediums include:
<div className="supported-storages">
<div>
<img src="/icons/ico-local-and-shared.svg" alt="Storage icon" />
Local and shared folders
</div>
<div>
<img src="/icons/ico-aws-s3.svg" alt="Storage icon" />
S3 buckets
</div>
<div>
<img src="/icons/ico-google-cloud-storage.svg" alt="Storage icon" />
Google Cloud Storage
</div>
<div>
<img src="/icons/ico-azure-storage.svg" alt="Storage icon" />
Azure Storage
</div>
<div>
<img src="/icons/ico-nas.svg" alt="Storage icon" />
http(s)
</div>
<div>
<img src="/icons/ico-minio.svg" alt="Storage icon" />
minio
</div>
<div>
<img src="/icons/ico-ceph.svg" alt="Storage icon" />
ceph
</div>
</div>
---
<ImageSwitcher alt="ClearML Supported Storage"
lightImageSrc="/icons/ClearML_Supported_Storage--on-light.png"
darkImageSrc="/icons/ClearML_Supported_Storage--on-dark.png"
/>
:::note
Once an object has been uploaded to a storage medium, each machine that uses the object must have access to it.

View File

@ -0,0 +1,5 @@
---
title: API Definitions
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: REST API
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: PipelineController
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: ClearmlJob
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: Dataset
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: OptimizerBOHB
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: GridSearch
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: HyperParameterOptimizer
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: RandomSearch
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: OptimizerOptuna
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: DiscreteParameterRange
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: ParameterSet
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: UniformIntegerParameterRange
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: UniformParameterRange
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: Logger
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: InputModel
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: Model
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: OutputModel
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: AwsAutoScaler
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: Monitor
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: StorageManager
---
**AutoGenerated PlaceHolder**

View File

@ -0,0 +1,5 @@
---
title: Task
---
**AutoGenerated PlaceHolder**

View File

@ -9,7 +9,7 @@ including:
* [Configuration](#configuration) - Hyperparameters, user properties, and configuration objects.
* [Artifacts](#artifacts) - Input model, output model, model snapshot locations, other artifacts.
* [General information](#general-information) - Information about the experiment, for example: the experiment's start, creation, and last update times and dates, the user who created the experiment, and its description.
* [Logs](#log) - stdout, stderr, output to the console from libraries, and **ClearML** explicit reporting.
* [Console](#console) - stdout, stderr, output to the console from libraries, and **ClearML** explicit reporting.
* [Scalars](#scalars) - Metric plots.
* [Plots](#other-plots) - Other plots and data, for example: Matplotlib, Plotly, and **ClearML** explicit reporting.
* [Debug samples](#debug-samples) - Images, audio, video, and HTML.
@ -332,9 +332,9 @@ General experiment details appear in the **INFO** tab. This includes information
### Log
### Console
The complete experiment log containing everything printed to stdout and stderr appears in the **LOG** tab. The full log
The complete experiment log containing everything printed to stdout and stderr appears in the **CONSOLE** tab. The full log
is downloadable. To view the end of the log, click **Jump to end**.
<details className="cml-expansion-panel screenshot">

View File

@ -186,13 +186,13 @@ module.exports = {
sidebarPath: require.resolve('./sidebars.js'),
// Please change this to your repo.
editUrl:
'https://github.com/allegroai/clearml_docs/edit/master/website/',
'https://github.com/allegroai/clearml-docs/edit/main/',
},
API: {
sidebarPath: require.resolve('./sidebars.js'),
// Please change this to your repo.
editUrl:
'https://github.com/allegroai/clearml_docs/edit/master/website/',
'https://github.com/allegroai/clearml-docs/edit/main/',
},
blog: {
blogTitle: 'ClearML Tutorials',
@ -202,7 +202,7 @@ module.exports = {
showReadingTime: true,
// Please change this to your repo.
editUrl:
'https://github.com/allegroai/clearml_docs/edit/master/website/tutorials/',
'https://github.com/allegroai/clearml-docs/edit/main/tutorials/',
},
theme: {
customCss: require.resolve('./src/css/custom.css'),

View File

@ -63,12 +63,11 @@ module.exports = {
{'Autokeras': ['guides/frameworks/autokeras/integration_autokeras', 'guides/frameworks/autokeras/autokeras_imdb_example']},
{'FastAI': ['guides/frameworks/fastai/fastai_with_tensorboard']},
{
'Keras': ['guides/frameworks/keras/allegro_clearml_keras_tb_example', 'guides/frameworks/keras/jupyter', 'guides/frameworks/keras/keras_tensorboard',
'guides/frameworks/keras/manual_model_upload']
'Keras': ['guides/frameworks/keras/allegro_clearml_keras_tb_example', 'guides/frameworks/keras/jupyter', 'guides/frameworks/keras/keras_tensorboard']
},
{'Matplotlib': ['guides/frameworks/matplotlib/allegro_clearml_matplotlib_example', 'guides/frameworks/matplotlib/matplotlib_example']},
{
'Pytorch': ['guides/frameworks/pytorch/manual_model_upload', 'guides/frameworks/pytorch/pytorch_distributed_example', 'guides/frameworks/pytorch/pytorch_matplotlib',
'Pytorch': ['guides/frameworks/pytorch/pytorch_distributed_example', 'guides/frameworks/pytorch/pytorch_matplotlib',
'guides/frameworks/pytorch/pytorch_mnist', 'guides/frameworks/pytorch/pytorch_tensorboard', 'guides/frameworks/pytorch/pytorch_tensorboardx',
'guides/frameworks/pytorch/tensorboard_toy_pytorch']
},
@@ -84,12 +83,12 @@ module.exports = {
{'Scikit-Learn': ['guides/frameworks/scikit-learn/sklearn_joblib_example', 'guides/frameworks/scikit-learn/sklearn_matplotlib_example']},
{'TensorboardX': ['guides/frameworks/tensorboardx/tensorboardx']},
{
'Tensorflow': ['guides/frameworks/tensorflow/manual_model_upload', 'guides/frameworks/tensorflow/tensorboard_pr_curve', 'guides/frameworks/tensorflow/tensorboard_toy',
'Tensorflow': ['guides/frameworks/tensorflow/tensorboard_pr_curve', 'guides/frameworks/tensorflow/tensorboard_toy',
'guides/frameworks/tensorflow/tensorflow_mnist', 'guides/frameworks/tensorflow/integration_keras_tuner']
},
{'XGboost': ['guides/frameworks/xgboost/xgboost_sample']}
]},
{'IDEs': ['guides/ide/remote_jupyter_tutorial']},
{'IDEs': ['guides/ide/remote_jupyter_tutorial', 'guides/ide/integration_pycharm']},
{'Optimization': ['guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt']},
{'Pipelines': ['guides/pipeline/pipeline_controller']},

View File

@@ -181,17 +181,17 @@ html[data-theme="light"] .hero .button.button--primary{
/* header social icons */
.header-ico--github {
background-image: url(/img/ico-github.svg);
background-image: url(docs/latest/img/ico-github.svg);
margin-right: calc(var(--ifm-navbar-item-padding-horizontal) * 2);
}
.header-ico--twitter {
background-image: url(/img/ico-twitter.svg);
background-image: url(docs/latest/img/ico-twitter.svg);
}
.header-ico--youtube {
background-image: url(/img/ico-youtube.svg);
background-image: url(docs/latest/img/ico-youtube.svg);
}
.header-ico--slack {
background-image: url(/img/ico-slack.svg);
background-image: url(docs/latest/img/ico-slack.svg);
}
.header-ico {
width:24px;
@@ -255,7 +255,7 @@ html[data-theme="light"] .hero .button.button--primary{
html[data-theme="light"] [class^="sidebarLogo"] {
background: url(/img/logo--on-light.svg) 1rem center no-repeat;
background: url(docs/latest/img/logo--on-light.svg) 1rem center no-repeat;
background-size: 6rem;
}
html[data-theme="light"] [class^="sidebarLogo"] > img {
@@ -283,7 +283,7 @@ html[data-theme="dark"] .navbar-sidebar {
background-color: #141722;
}
html[data-theme="light"] .navbar-sidebar .navbar__brand {
background: url(/img/logo--on-light.svg) 0 center no-repeat;
background: url(docs/latest/img/logo--on-light.svg) 0 center no-repeat;
background-size: 5.5rem;
}
html[data-theme="light"] .navbar-sidebar .navbar__logo {
@@ -315,16 +315,16 @@ html[data-theme="dark"] .navbar-sidebar .menu__link.header-ico--slack:before {
.navbar-sidebar .menu__link.header-ico--slack:hover {
background: url(/img/ico-slack.svg) no-repeat center;
background: url(docs/latest/img/ico-slack.svg) no-repeat center;
}
.navbar-sidebar .menu__link.header-ico--twitter:hover {
background: url(/img/ico-twitter.svg) no-repeat center;
background: url(docs/latest/img/ico-twitter.svg) no-repeat center;
}
.navbar-sidebar .menu__link.header-ico--github:hover {
background: url(/img/ico-github.svg) no-repeat center;
background: url(docs/latest/img/ico-github.svg) no-repeat center;
}
.navbar-sidebar .menu__link.header-ico--youtube:hover {
background: url(/img/ico-youtube.svg) no-repeat center;
background: url(docs/latest/img/ico-youtube.svg) no-repeat center;
}
.menu__link.header-ico {
@@ -414,22 +414,22 @@ html[data-theme="dark"] .footer__copyright {
/* social links icons */
.footer__link-item[href*="slack"] {
padding-left: 1.4rem;
background: url(/img/ico-slack.svg) no-repeat left center;
background: url(docs/latest/img/ico-slack.svg) no-repeat left center;
background-size: 1rem;
}
.footer__link-item[href*="youtube"] {
padding-left: 1.4rem;
background: url(/img/ico-youtube.svg) no-repeat left center;
background: url(docs/latest/img/ico-youtube.svg) no-repeat left center;
background-size: 1rem;
}
.footer__link-item[href*="twitter"] {
padding-left: 1.4rem;
background: url(/img/ico-twitter.svg) no-repeat left center;
background: url(docs/latest/img/ico-twitter.svg) no-repeat left center;
background-size: 1rem;
}
.footer__link-item[href*="stackoverflow"] {
padding-left: 1.4rem;
background: url(/img/ico-stackoverflow.svg) no-repeat left center;
background: url(docs/latest/img/ico-stackoverflow.svg) no-repeat left center;
background-size: 1rem;
}
@@ -442,8 +442,8 @@ html[data-theme="light"] .footer__link-item[href*="stackoverflow"] {
/* ===MARKDOWN=== */
.markdown img[src*="png"],
.markdown img[src*="gif"] {
.markdown img.medium-zoom-image[src*="png"],
.markdown img.medium-zoom-image[src*="gif"] {
border:1px solid #2c3246;
}
.getting-started-buttons > .col {
@@ -582,7 +582,7 @@ html[data-theme="light"] .icon {
height: 24px;
transform: rotate(0);
transition: 0.25s;
background: url(/icons/ico-chevron-down.svg) no-repeat center;
background: url(docs/latest/icons/ico-chevron-down.svg) no-repeat center;
}
/* expansion content */
@@ -594,23 +594,23 @@ html[data-theme="light"] .icon {
/* icon types */
/* -> info */
.cml-expansion-panel.info .cml-expansion-panel-summary:before {
background-image: url(/icons/ico-info-circle.svg);
background-image: url(docs/latest/icons/ico-info-circle.svg);
}
/* -> tips */
.cml-expansion-panel.tips .cml-expansion-panel-summary:before {
background-image: url(/icons/ico-tips.svg);
background-image: url(docs/latest/icons/ico-tips.svg);
}
/* -> alert */
.cml-expansion-panel.alert .cml-expansion-panel-summary:before {
background-image: url(/icons/ico-alert.svg);
background-image: url(docs/latest/icons/ico-alert.svg);
}
/* -> screenshot */
.cml-expansion-panel.screenshot .cml-expansion-panel-summary:before {
background-image: url(/icons/ico-image.svg);
background-image: url(docs/latest/icons/ico-image.svg);
}
/* -> configuration */
.cml-expansion-panel.configuration .cml-expansion-panel-summary:before {
background-image: url(/icons/ico-config.svg);
background-image: url(docs/latest/icons/ico-config.svg);
}
/* light mode */
@@ -701,48 +701,12 @@ html[data-theme="dark"] .alert--info {
/* show visual identification for external links */
.markdown a[href^="https://"] {
background: url(/icons/ico-open-in-new.svg) center right no-repeat;
background: url(docs/latest/icons/ico-open-in-new.svg) center right no-repeat;
background-size: 0.9rem;
margin-right: 0.4rem;
padding-right: 1rem;
}
/* supported storages */
.supported-storages {
margin: 1rem auto;
padding: 0;
display: flex;
flex-wrap: wrap;
justify-content: center;
gap: 0.5rem;
.markdown .img-swt {
margin-bottom: 2.4rem;
}
.supported-storages > div {
padding: 1.6rem 1rem 0;
width: 8rem;
min-height: 8rem;
text-align: center;
font-size: 0.875rem;
font-weight: 500;
line-height: 1.2;
display: flex;
flex-direction: column;
align-items: center;
background-color: var(--ifm-toc-background-color);
border-radius: 1rem;
}
html[data-theme="light"] .supported-storages > div {
background-color: var(--ifm-color-secondary);
}
.supported-storages img {
width: 48px;
height: 48px;
margin-bottom: 0.5rem;
}
@media only screen and (min-width: 767px) {
.supported-storages {
width: 80%;
}
}

Binary file added (not shown) — 28 KiB
Binary file added (not shown) — 27 KiB