mirror of
https://github.com/clearml/clearml-docs
synced 2025-06-26 18:17:44 +00:00
Refactor integrations section (#600)
docs/integrations/click.md (new file, 45 lines)
@@ -0,0 +1,45 @@
---
title: Click
---

[`click`](https://click.palletsprojects.com) is a Python package for creating command-line interfaces. ClearML integrates
seamlessly with `click` and automatically logs its command-line parameters.

All you have to do is add two lines of code:

```python
from clearml import Task

task = Task.init(task_name="<task_name>", project_name="<project_name>")
```

For example:

```python
import click
from clearml import Task


@click.command()
@click.option('--count', default=1, help='Number of greetings.')
@click.option('--name', prompt='Your name', help='The person to greet.')
def hello(count, name):
    task = Task.init(project_name='examples', task_name='Click single command')

    for x in range(count):
        click.echo("Hello {}!".format(name))


if __name__ == '__main__':
    hello()
```

When this code is executed, ClearML logs your command-line arguments, which you can view in the
[WebApp](../webapp/webapp_overview.md), in the experiment's **Configuration > Hyperparameters > Args** section.

In the UI, you can clone the task multiple times and set the clones' parameter values for execution by the [ClearML Agent](../clearml_agent.md).
When a clone is executed, the executing agent uses the new parameter values as if they were set on the command line.
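Conceptually, the stored Args behave like a mapping of parameter names to values, and values edited on a clone take precedence over the CLI defaults. A plain-Python sketch of that override semantics (the dictionaries here are purely illustrative, not ClearML's internal representation):

```python
# Illustrative only: how UI-edited values take precedence over CLI defaults.
cli_defaults = {"count": 1, "name": "Your name"}   # values from @click.option
ui_overrides = {"count": 3}                        # values edited on the cloned task

# The executing agent effectively runs the command with the merged values,
# as if they had been passed on the command line.
effective = {**cli_defaults, **ui_overrides}
print(effective)  # {'count': 3, 'name': 'Your name'}
```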

See [code examples](https://github.com/allegroai/clearml/blob/master/examples/frameworks/click) demonstrating how to integrate
ClearML with code that uses `click`.
docs/integrations/hydra.md (new file, 52 lines)
@@ -0,0 +1,52 @@
---
title: Hydra
---

[Hydra](https://github.com/facebookresearch/hydra) is a Python framework for managing experiment parameters. ClearML integrates seamlessly
with Hydra and automatically logs the `OmegaConf` object, which holds all the configuration files, as well as
values overridden at runtime.

All you have to do is add two lines of code:

```python
from clearml import Task

task = Task.init(task_name="<task_name>", project_name="<project_name>")
```

ClearML logs the OmegaConf as a blob, which can be viewed in the
[WebApp](../webapp/webapp_overview.md), in the experiment's **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** section.

## Modifying Hydra Values

In the UI, you can clone a task multiple times and modify it for execution by the [ClearML Agent](../clearml_agent.md).
The agent executes the code with the modifications you made in the UI, even overriding hardcoded values.

Clone your experiment, then modify your Hydra parameters via the UI in one of the following ways:
* Modify the OmegaConf directly:
  1. In the experiment's **CONFIGURATION > HYPERPARAMETERS > HYDRA** section, set `_allow_omegaconf_edit_` to `True`
  1. In the experiment's **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** section, modify the OmegaConf values
* Add an experiment hyperparameter:
  1. In the experiment's **CONFIGURATION > HYPERPARAMETERS > HYDRA** section, make sure `_allow_omegaconf_edit_` is set
     to `False`
  1. In the same section, click `Edit`, which gives you the option to add parameters. Input the parameters from the OmegaConf
     that you want to modify using dot notation. For example, if your OmegaConf looks like this:

     ```
     dataset:
       user: root
       main:
         number: 80
     ```

     Specify the `number` parameter with `dataset.main.number`, then set its new value.
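A dot-notation path is simply a walk through the nested configuration keys. A plain-Python sketch of that lookup (the `resolve` helper is hypothetical and for illustration only; ClearML performs this mapping internally):

```python
def resolve(cfg, dotted_path):
    # Walk nested dict keys: "dataset.main.number" -> cfg["dataset"]["main"]["number"]
    node = cfg
    for key in dotted_path.split("."):
        node = node[key]
    return node

# The OmegaConf from the example above, as a plain dict
cfg = {"dataset": {"user": "root", "main": {"number": 80}}}
print(resolve(cfg, "dataset.main.number"))  # 80
```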

Enqueue the customized experiment for execution. The task will use the new values during execution. If you use the
second option mentioned above, notice that the OmegaConf in **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** changes
according to your added parameters.

See a code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/hydra/hydra_example.py).
@@ -1,38 +0,0 @@
---
title: Integrations
---

ClearML integrates with many frameworks and tools out of the box! <br/>

Just follow the [getting started](/getting_started/ds/ds_first_steps.md) to automatically capture metrics, models and artifacts or check out examples for each library.

**Frameworks**
- [PyTorch](https://github.com/allegroai/clearml/tree/master/examples/frameworks/pytorch)
- [PyTorch Lightning](https://github.com/allegroai/clearml/tree/master/examples/frameworks/pytorch-lightning)
- [PyTorch Ignite](https://github.com/allegroai/clearml/tree/master/examples/frameworks/ignite)
- [TensorFlow](https://github.com/allegroai/clearml/tree/master/examples/frameworks/tensorflow)
- [Keras](https://github.com/allegroai/clearml/tree/master/examples/frameworks/keras)
- [scikit-learn](https://github.com/allegroai/clearml/tree/master/examples/frameworks/scikit-learn)
- [FastAI](https://github.com/allegroai/clearml/tree/master/examples/frameworks/fastai)
- [LightGBM](https://github.com/allegroai/clearml/tree/master/examples/frameworks/lightgbm)
- [XGBoost](https://github.com/allegroai/clearml/tree/master/examples/frameworks/xgboost)
- [MegEngine](https://github.com/allegroai/clearml/tree/master/examples/frameworks/megengine)
- [CatBoost](https://github.com/allegroai/clearml/tree/master/examples/frameworks/catboost)
- [OpenMMLab](https://github.com/allegroai/clearml/tree/master/examples/frameworks/openmmlab)
- [Hydra](https://github.com/allegroai/clearml/tree/master/examples/frameworks/hydra)
- [Python Fire](https://github.com/allegroai/clearml/tree/master/examples/frameworks/fire)
- [click](https://github.com/allegroai/clearml/tree/master/examples/frameworks/click)

**HPO**
- [Optuna](https://github.com/allegroai/clearml/tree/master/examples/optimization/hyper-parameter-optimization)
- [Keras Tuner](https://github.com/allegroai/clearml/tree/master/examples/frameworks/kerastuner)
- [AutoKeras](https://github.com/allegroai/clearml/tree/master/examples/frameworks/autokeras)

**Plotting**
- [Tensorboard](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorflow/tensorboard_toy.py)
- [TensorboardX](https://github.com/allegroai/clearml/tree/master/examples/frameworks/tensorboardx)
- [matplotlib](https://github.com/allegroai/clearml/tree/master/examples/frameworks/matplotlib)
docs/integrations/openmmv.md (new file, 44 lines)
@@ -0,0 +1,44 @@
---
title: OpenMMLab
---

[OpenMMLab](https://github.com/open-mmlab) is a computer vision framework. You can integrate ClearML into your
code using the `mmcv` package's [`ClearMLLoggerHook`](https://mmcv.readthedocs.io/en/master/_modules/mmcv/runner/hooks/logger/clearml.html)
class. This class is used to create a ClearML Task and to automatically log metrics.

For example, the following code sets up the configuration for logging metrics periodically to ClearML, and then registers
the ClearML hook to a [runner](https://mmcv.readthedocs.io/en/v1.3.8/runner.html?highlight=register_training_hooks#epochbasedrunner),
which manages training in `mmcv`:

```python
log_config = dict(
    interval=100,
    hooks=[
        dict(
            type='ClearMLLoggerHook',
            init_kwargs=dict(
                project_name='examples',
                task_name='OpenMMLab cifar10',
                output_uri=True
            )
        ),
    ]
)

# register hooks to runner and those hooks will be invoked automatically
runner.register_training_hooks(
    lr_config=lr_config,
    optimizer_config=optimizer_config,
    checkpoint_config=checkpoint_config,
    log_config=log_config  # ClearMLLogger hook
)
```

The `init_kwargs` dictionary can include any parameter from [`Task.init()`](../references/sdk/task.md#taskinit).

This creates a [ClearML Task](../fundamentals/task.md) `OpenMMLab cifar10` in the `examples` project.
You can view the captured metrics in the experiment's **Scalars** tab in the [WebApp](../webapp/webapp_overview.md).

See an OpenMMLab code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/openmmlab/openmmlab_cifar10.py).
docs/integrations/optuna.md (new file, 44 lines)
@@ -0,0 +1,44 @@
---
title: Optuna
---

[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../fundamentals/hpo.md) framework
that makes use of different samplers, such as grid search, random search, Bayesian, and evolutionary algorithms. You can integrate
Optuna into ClearML's automated hyperparameter optimization.

The [HyperParameterOptimizer](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class contains ClearML's
hyperparameter optimization modules. Its modular design enables using different optimizers, including existing software
frameworks like Optuna, enabling simple, accurate, and fast hyperparameter optimization. The Optuna optimizer
([`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md)) allows you to simultaneously
optimize many hyperparameters efficiently by relying on early stopping (pruning) and smart resource allocation.
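Early stopping (pruning) means abandoning trials whose intermediate results lag behind their peers, so compute is spent on promising configurations. A toy plain-Python sketch of a median-based pruning rule (illustrative only; this is not Optuna's or ClearML's implementation):

```python
import statistics

def should_prune(current_value, completed_values):
    # Prune when this trial's intermediate value (e.g. validation accuracy)
    # is worse than the median value other trials reached at the same step.
    if not completed_values:
        return False
    return current_value < statistics.median(completed_values)

print(should_prune(0.61, [0.70, 0.72, 0.68]))  # True: lagging behind peers
print(should_prune(0.74, [0.70, 0.72, 0.68]))  # False: keep training
```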

To use Optuna in ClearML's hyperparameter optimization, you must first install it. Then, when you instantiate `HyperParameterOptimizer`,
pass `OptimizerOptuna` as the `optimizer_class` argument:

```python
from clearml.automation import (
    DiscreteParameterRange, HyperParameterOptimizer, UniformIntegerParameterRange
)
from clearml.automation.optuna import OptimizerOptuna

an_optimizer = HyperParameterOptimizer(
    # This is the experiment we want to optimize
    base_task_id=args['template_task_id'],
    hyper_parameters=[
        UniformIntegerParameterRange('layer_1', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('batch_size', values=[96, 128, 160]),
        DiscreteParameterRange('epochs', values=[30]),
    ],
    objective_metric_title='validation',
    objective_metric_series='accuracy',
    objective_metric_sign='max',
    max_number_of_concurrent_tasks=2,
    optimizer_class=OptimizerOptuna,  # use Optuna as the search strategy
    execution_queue='1xGPU',
    total_max_jobs=10,
)
```

See the Hyperparameter Optimization [tutorial](../guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md).
docs/integrations/python_fire.md (new file, 28 lines)
@@ -0,0 +1,28 @@
---
title: Python Fire
---

[Python Fire](https://github.com/google/python-fire) is a Python package for creating command-line interfaces.
ClearML integrates seamlessly with `fire` and automatically logs its command-line parameters.

All you have to do is add two lines of code:

```python
from clearml import Task

task = Task.init(task_name="<task_name>", project_name="<project_name>")
```

When the code runs, ClearML logs your command-line arguments, which you can view in the [WebApp](../webapp/webapp_overview.md), in the experiment's
**Configuration > Hyperparameters > Args** section.

In the UI, you can clone the task multiple times and set the clones' parameter values for execution by the [ClearML Agent](../clearml_agent.md).
When a clone is executed, the executing agent uses the new parameter values as if they were set on the command line.

See [code examples](https://github.com/allegroai/clearml/blob/master/examples/frameworks/fire) demonstrating how to integrate
ClearML with code that uses `fire`.
docs/integrations/seaborn.md (new file, 20 lines)
@@ -0,0 +1,20 @@
---
title: Seaborn
---

[seaborn](https://seaborn.pydata.org/) is a Python library for data visualization.
ClearML automatically captures plots created using `seaborn`. All you have to do is add two
lines of code to your script:

```python
from clearml import Task

task = Task.init(task_name="<task_name>", project_name="<project_name>")
```

This will create a [ClearML Task](../fundamentals/task.md) that captures your script's information, including Git details,
uncommitted code, Python environment, your `seaborn` plots, and more. View the seaborn plots in the [WebApp](../webapp/webapp_overview.md),
in the experiment's **Plots** tab.

View a code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/matplotlib/matplotlib_example.py).