Mirror of https://github.com/clearml/clearml-docs, synced 2025-04-14 20:53:13 +00:00
Refactor integrations section (#600)
This commit is contained in:
parent
d4873c9712
commit
814328c6e8
@@ -180,7 +180,9 @@ or check these pages out:

- Develop on remote machines with [ClearML Session](../../apps/clearml_session.md)
- Structure your work and put it into [Pipelines](../../pipelines/pipelines.md)
- Improve your experiments with [Hyperparameter Optimization](../../fundamentals/hpo.md)
- Check out ClearML's integrations with [external libraries](../../integrations/libraries.md).
- Check out ClearML's integrations with your favorite ML frameworks like [TensorFlow](../../guides/frameworks/tensorflow/tensorflow_mnist.md),
  [PyTorch](../../guides/frameworks/pytorch/pytorch_mnist.md), [Keras](../../guides/frameworks/keras/keras_tensorboard.md),
  and more

## YouTube Playlist
@@ -1,5 +1,6 @@
---
title: AutoKeras Integration
title: AutoKeras
displayed_sidebar: mainSidebar
---

Integrate ClearML into code that uses [autokeras](https://github.com/keras-team/autokeras). Initialize a ClearML
Task in your code, and ClearML automatically logs scalars, plots, and images reported to TensorBoard, Matplotlib, Plotly,
@@ -1,5 +1,6 @@
---
title: CatBoost
displayed_sidebar: mainSidebar
---

The [catboost_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/catboost/catboost_example.py)
@@ -1,5 +1,6 @@
---
title: FastAI
displayed_sidebar: mainSidebar
---

The [fastai_with_tensorboard_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/fastai/legacy/fastai_with_tensorboard_example.py)
example demonstrates the integration of ClearML into code that uses FastAI v1 and TensorBoard.
@@ -1,5 +1,6 @@
---
title: Keras with TensorBoard
title: Keras
displayed_sidebar: mainSidebar
---

The example below demonstrates the integration of ClearML into code which uses Keras and TensorBoard.
@@ -1,5 +1,6 @@
---
title: LightGBM
displayed_sidebar: mainSidebar
---

The [lightgbm_example](https://github.com/allegroai/clearml/blob/master/examples/frameworks/lightgbm/lightgbm_example.py)
@@ -1,5 +1,6 @@
---
title: Matplotlib
displayed_sidebar: mainSidebar
---

The example below demonstrates integrating ClearML into code that uses `matplotlib` to plot scatter diagrams, and
@@ -1,5 +1,6 @@
---
title: MegEngine
displayed_sidebar: mainSidebar
---

The [megengine_mnist.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/megengine/megengine_mnist.py)
@@ -1,5 +1,6 @@
---
title: PyTorch MNIST
title: PyTorch
displayed_sidebar: mainSidebar
---

The [pytorch_mnist.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/pytorch_mnist.py) example
@@ -1,5 +1,6 @@
---
title: PyTorch with TensorBoard
title: TensorBoard
displayed_sidebar: mainSidebar
---

The [pytorch_tensorboard.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch/pytorch_tensorboard.py)
@@ -1,5 +1,6 @@
---
title: PyTorch Ignite TensorboardLogger
displayed_sidebar: mainSidebar
---

The [cifar_ignite.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/ignite/cifar_ignite.py) example
@@ -1,5 +1,6 @@
---
title: PyTorch Ignite ClearMLLogger
displayed_sidebar: mainSidebar
---

The `ignite` repository contains the [mnist_with_clearml_logger.py](https://github.com/pytorch/ignite/blob/master/examples/contrib/mnist/mnist_with_clearml_logger.py)
@@ -1,5 +1,6 @@
---
title: PyTorch Lightning
displayed_sidebar: mainSidebar
---

The [pytorch-lightning](https://github.com/allegroai/clearml/blob/master/examples/frameworks/pytorch-lightning/pytorch_lightning_example.py)
@@ -1,5 +1,6 @@
---
title: scikit-learn with Joblib
title: Scikit-Learn
displayed_sidebar: mainSidebar
---

The [sklearn_joblib_example.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/scikit-learn/sklearn_joblib_example.py)
@@ -1,5 +1,6 @@
---
title: TensorBoardX with PyTorch
title: TensorBoardX
displayed_sidebar: mainSidebar
---

The [pytorch_tensorboardX.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorboardx/pytorch_tensorboardX.py)
@@ -1,5 +1,6 @@
---
title: Keras Tuner Integration
title: Keras Tuner
displayed_sidebar: mainSidebar
---

Integrate ClearML into code that uses [Keras Tuner](https://www.tensorflow.org/tutorials/keras/keras_tuner). By
@@ -1,5 +1,6 @@
---
title: TensorFlow MNIST
title: TensorFlow
displayed_sidebar: mainSidebar
---

The [tensorflow_mnist.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorflow/tensorflow_mnist.py)
@@ -1,5 +1,6 @@
---
title: XGBoost Metric Reporting
title: XGBoost
displayed_sidebar: mainSidebar
---

The [xgboost_metrics.py](https://github.com/allegroai/clearml/blob/master/examples/frameworks/xgboost/xgboost_metrics.py)
BIN docs/img/integrations_click_configs.png (new file, 30 KiB)
BIN docs/img/integrations_fire_params.png (new file, 44 KiB)
BIN docs/img/integrations_hydra_configs.png (new file, 54 KiB)
BIN docs/img/integrations_seaborn_plots.png (new file, 102 KiB)
BIN docs/img/itegration_openmmlab_scalars.png (new file, 60 KiB)
docs/integrations/click.md (new file, 45 lines)
@@ -0,0 +1,45 @@
---
title: Click
---

[`click`](https://click.palletsprojects.com) is a Python package for creating command-line interfaces. ClearML integrates
seamlessly with `click` and automatically logs its command-line parameters.

All you have to do is add two lines of code:

```python
from clearml import Task
task = Task.init(task_name="<task_name>", project_name="<project_name>")
```

For example:

```python
import click
from clearml import Task


@click.command()
@click.option('--count', default=1, help='Number of greetings.')
@click.option('--name', prompt='Your name', help='The person to greet.')
def hello(count, name):
    task = Task.init(project_name='examples', task_name='Click single command')

    for x in range(count):
        click.echo("Hello {}!".format(name))


if __name__ == '__main__':
    hello()
```

When this code is executed, ClearML logs your command-line arguments, which you can view in the
[WebApp](../webapp/webapp_overview.md), in the experiment's **Configuration > Hyperparameters > Args** section.

![](../img/integrations_click_configs.png)

In the UI, you can clone the task multiple times and set the clones' parameter values for execution by the [ClearML Agent](../clearml_agent.md).
When the clone is executed, the executing agent will use the new parameter values as if they were set on the command line.

See [code examples](https://github.com/allegroai/clearml/blob/master/examples/frameworks/click) demonstrating integrating
ClearML with code that uses `click`.
docs/integrations/hydra.md (new file, 52 lines)
@@ -0,0 +1,52 @@
---
title: Hydra
---

[Hydra](https://github.com/facebookresearch/hydra) is a Python framework for managing experiment parameters. ClearML integrates seamlessly
with Hydra and automatically logs the `OmegaConf` object, which holds all the configuration files, as well as
values overridden during runtime.

All you have to do is add two lines of code:

```python
from clearml import Task
task = Task.init(task_name="<task_name>", project_name="<project_name>")
```
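
For instance, here is a minimal sketch of a Hydra script with ClearML (the config file name, keys, and task name are illustrative, not taken from the ClearML example):

```python
import hydra
from omegaconf import DictConfig, OmegaConf
from clearml import Task


@hydra.main(config_path=".", config_name="config")
def main(cfg: DictConfig):
    # Initializing a task inside the @hydra.main-decorated function is enough
    # for ClearML to log the composed OmegaConf and any command-line overrides
    task = Task.init(task_name="hydra example", project_name="examples")
    print(OmegaConf.to_yaml(cfg))


if __name__ == "__main__":
    main()
```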

ClearML logs the OmegaConf as a blob, which can be viewed in the
[WebApp](../webapp/webapp_overview.md), in the experiment's **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** section.

![](../img/integrations_hydra_configs.png)

## Modifying Hydra Values

In the UI, you can clone a task multiple times and modify it for execution by the [ClearML Agent](../clearml_agent.md).
The agent executes the code with the modifications you made in the UI, even overriding hardcoded values.

Clone your experiment, then modify your Hydra parameters via the UI in one of the following ways:
* Modify the OmegaConf directly:
  1. In the experiment’s **CONFIGURATION > HYPERPARAMETERS > HYDRA** section, set `_allow_omegaconf_edit_` to `True`
  1. In the experiment’s **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** section, modify the OmegaConf values
* Add an experiment hyperparameter:
  1. In the experiment’s **CONFIGURATION > HYPERPARAMETERS > HYDRA** section, make sure `_allow_omegaconf_edit_` is set
     to `False`
  1. In the same section, click `Edit`, which gives you the option to add parameters. Input parameters from the OmegaConf
     that you want to modify using dot notation. For example, if your OmegaConf looks like this:

     ```
     dataset:
       user: root
       main:
         number: 80
     ```

     Specify the `number` parameter with `dataset.main.number`, then set its new value.

Enqueue the customized experiment for execution. The task will use the new values during execution. If you use the
second option mentioned above, notice that the OmegaConf in **CONFIGURATION > CONFIGURATION OBJECTS > OmegaConf** changes
according to your added parameters.

See the code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/hydra/hydra_example.py).
@@ -1,38 +0,0 @@
---
title: Integrations
---

ClearML integrates with many frameworks and tools out of the box! <br/>

Just follow the [getting started](/getting_started/ds/ds_first_steps.md) to automatically capture metrics, models, and artifacts, or check out examples for each library.

![Integration tools](../img/integration_tools.png)

**Frameworks**
- [PyTorch](https://github.com/allegroai/clearml/tree/master/examples/frameworks/pytorch)
- [PyTorch Lightning](https://github.com/allegroai/clearml/tree/master/examples/frameworks/pytorch-lightning)
- [PyTorch Ignite](https://github.com/allegroai/clearml/tree/master/examples/frameworks/ignite)
- [TensorFlow](https://github.com/allegroai/clearml/tree/master/examples/frameworks/tensorflow)
- [Keras](https://github.com/allegroai/clearml/tree/master/examples/frameworks/keras)
- [scikit-learn](https://github.com/allegroai/clearml/tree/master/examples/frameworks/scikit-learn)
- [FastAI](https://github.com/allegroai/clearml/tree/master/examples/frameworks/fastai)
- [LightGBM](https://github.com/allegroai/clearml/tree/master/examples/frameworks/lightgbm)
- [XGBoost](https://github.com/allegroai/clearml/tree/master/examples/frameworks/xgboost)
- [MegEngine](https://github.com/allegroai/clearml/tree/master/examples/frameworks/megengine)
- [CatBoost](https://github.com/allegroai/clearml/tree/master/examples/frameworks/catboost)
- [OpenMMLab](https://github.com/allegroai/clearml/tree/master/examples/frameworks/openmmlab)
- [Hydra](https://github.com/allegroai/clearml/tree/master/examples/frameworks/hydra)
- [Python Fire](https://github.com/allegroai/clearml/tree/master/examples/frameworks/fire)
- [click](https://github.com/allegroai/clearml/tree/master/examples/frameworks/click)

**HPO**
- [Optuna](https://github.com/allegroai/clearml/tree/master/examples/optimization/hyper-parameter-optimization)
- [Keras Tuner](https://github.com/allegroai/clearml/tree/master/examples/frameworks/kerastuner)
- [AutoKeras](https://github.com/allegroai/clearml/tree/master/examples/frameworks/autokeras)

**Plotting**
- [Tensorboard](https://github.com/allegroai/clearml/blob/master/examples/frameworks/tensorflow/tensorboard_toy.py)
- [TensorboardX](https://github.com/allegroai/clearml/tree/master/examples/frameworks/tensorboardx)
- [matplotlib](https://github.com/allegroai/clearml/tree/master/examples/frameworks/matplotlib)
docs/integrations/openmmv.md (new file, 44 lines)
@@ -0,0 +1,44 @@
---
title: OpenMMLab
---

[OpenMMLab](https://github.com/open-mmlab) is a computer vision framework. You can integrate ClearML into your
code using the `mmcv` package's [`ClearMLLoggerHook`](https://mmcv.readthedocs.io/en/master/_modules/mmcv/runner/hooks/logger/clearml.html)
class. This class is used to create a ClearML Task and to automatically log metrics.

For example, the following code sets up the configuration for logging metrics periodically to ClearML, and then registers
the ClearML hook to a [runner](https://mmcv.readthedocs.io/en/v1.3.8/runner.html?highlight=register_training_hooks#epochbasedrunner),
which manages training in `mmcv`:

```python
log_config = dict(
    interval=100,
    hooks=[
        dict(
            type='ClearMLLoggerHook',
            init_kwargs=dict(
                project_name='examples',
                task_name='OpenMMLab cifar10',
                output_uri=True
            )
        ),
    ]
)

# register hooks to the runner; they will be invoked automatically during training
runner.register_training_hooks(
    lr_config=lr_config,
    optimizer_config=optimizer_config,
    checkpoint_config=checkpoint_config,
    log_config=log_config  # ClearMLLoggerHook
)
```

The `init_kwargs` dictionary can include any parameter from [`Task.init()`](../references/sdk/task.md#taskinit).

This creates a [ClearML Task](../fundamentals/task.md) named `OpenMMLab cifar10` in the `examples` project.
You can view the captured metrics in the experiment's **Scalars** tab in the [WebApp](../webapp/webapp_overview.md).

![](../img/itegration_openmmlab_scalars.png)

See the OpenMMLab code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/openmmlab/openmmlab_cifar10.py).
docs/integrations/optuna.md (new file, 44 lines)
@@ -0,0 +1,44 @@
---
title: Optuna
---

[Optuna](https://optuna.readthedocs.io/en/latest) is a [hyperparameter optimization](../fundamentals/hpo.md) framework,
which makes use of different samplers such as grid search, random, Bayesian, and evolutionary algorithms. You can integrate
Optuna into ClearML's automated hyperparameter optimization.

The [HyperParameterOptimizer](../references/sdk/hpo_optimization_hyperparameteroptimizer.md) class contains ClearML’s
hyperparameter optimization modules. Its modular design enables using different optimizers, including existing software
frameworks like Optuna, for simple, accurate, and fast hyperparameter optimization. The Optuna optimizer
([`automation.optuna.OptimizerOptuna`](../references/sdk/hpo_optuna_optuna_optimizeroptuna.md)) allows you to
simultaneously and efficiently optimize many hyperparameters by relying on early stopping (pruning)
and smart resource allocation.

To use Optuna in ClearML's hyperparameter optimization, you must first install it. When you instantiate `HyperParameterOptimizer`,
pass `OptimizerOptuna` as the `optimizer_class` argument:

```python
from clearml.automation import (
    DiscreteParameterRange, HyperParameterOptimizer, UniformIntegerParameterRange
)
from clearml.automation.optuna import OptimizerOptuna

an_optimizer = HyperParameterOptimizer(
    # This is the experiment we want to optimize
    base_task_id=args['template_task_id'],
    hyper_parameters=[
        UniformIntegerParameterRange('layer_1', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('batch_size', values=[96, 128, 160]),
        DiscreteParameterRange('epochs', values=[30]),
    ],
    objective_metric_title='validation',
    objective_metric_series='accuracy',
    objective_metric_sign='max',
    max_number_of_concurrent_tasks=2,
    optimizer_class=OptimizerOptuna,  # use Optuna as the search strategy
    execution_queue='1xGPU',
    total_max_jobs=10,
)
```
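
Once configured, the optimizer is launched like any other `HyperParameterOptimizer`; a minimal sketch (assuming a ClearML Agent is listening on the `1xGPU` queue):

```python
# start the optimization process
an_optimizer.start()
# wait for it to finish, then make sure background jobs are stopped
an_optimizer.wait()
an_optimizer.stop()
```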

See the Hyperparameter Optimization [tutorial](../guides/optimization/hyper-parameter-optimization/examples_hyperparam_opt.md).
docs/integrations/python_fire.md (new file, 28 lines)
@@ -0,0 +1,28 @@
---
title: Python Fire
---

[Python Fire](https://github.com/google/python-fire) is a Python package for creating command-line interfaces.
ClearML integrates seamlessly with `fire` and automatically logs its command-line parameters.

All you have to do is add two lines of code:

```python
from clearml import Task
task = Task.init(task_name="<task_name>", project_name="<project_name>")
```
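
For example, here is a minimal sketch (the function, its parameters, and the task name are illustrative, not taken from the ClearML fire examples):

```python
import fire
from clearml import Task


def greet(name="World", count=1):
    # ClearML picks up the parameters passed to this command on the command line
    task = Task.init(project_name="examples", task_name="Fire single command")

    for _ in range(count):
        print("Hello {}!".format(name))


if __name__ == "__main__":
    fire.Fire(greet)
```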

When the code runs, ClearML logs your command-line arguments, which you can view in the [WebApp](../webapp/webapp_overview.md), in the experiment's
**Configuration > Hyperparameters > Args** section.

![](../img/integrations_fire_params.png)

In the UI, you can clone the task multiple times and set the clones' parameter values for execution by the [ClearML Agent](../clearml_agent.md).
When the clone is executed, the executing agent will use the new parameter values as if they were set on the command line.

See [code examples](https://github.com/allegroai/clearml/blob/master/examples/frameworks/fire) demonstrating integrating
ClearML with code that uses `fire`.
docs/integrations/seaborn.md (new file, 20 lines)
@@ -0,0 +1,20 @@
---
title: Seaborn
---

[seaborn](https://seaborn.pydata.org/) is a Python library for data visualization.
ClearML automatically captures plots created using `seaborn`. All you have to do is add two
lines of code to your script:

```python
from clearml import Task
task = Task.init(task_name="<task_name>", project_name="<project_name>")
```
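
For example, here is a minimal sketch (the data, plot, and task name are illustrative, not taken from the ClearML example):

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns
from clearml import Task

# Initializing a task is enough for ClearML to capture the plot shown below
task = Task.init(project_name="examples", task_name="Seaborn example")

df = pd.DataFrame({"x": np.random.randn(200), "y": np.random.randn(200)})
sns.scatterplot(data=df, x="x", y="y")
plt.show()  # the displayed figure is logged to the task's Plots tab
```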

This will create a [ClearML Task](../fundamentals/task.md) that captures your script's information, including Git details,
uncommitted code, Python environment, your `seaborn` plots, and more. View the seaborn plots in the [WebApp](../webapp/webapp_overview.md),
in the experiment's **Plots** tab.

![](../img/integrations_seaborn_plots.png)

View the code example [here](https://github.com/allegroai/clearml/blob/master/examples/frameworks/matplotlib/matplotlib_example.py).
sidebars.js (15 lines changed)
@@ -58,7 +58,20 @@ module.exports = {
'model_registry',
{'ClearML Serving':['clearml_serving/clearml_serving', 'clearml_serving/clearml_serving_setup', 'clearml_serving/clearml_serving_cli', 'clearml_serving/clearml_serving_tutorial']},
{'CLI Tools': ['apps/clearml_session', 'apps/clearml_task', 'apps/clearml_param_search']},
'integrations/libraries',
{'Integrations': [
    'guides/frameworks/autokeras/integration_autokeras',
    'guides/frameworks/catboost/catboost', 'integrations/click', 'guides/frameworks/fastai/fastai_with_tensorboard',
    'integrations/hydra',
    'guides/frameworks/keras/keras_tensorboard', 'guides/frameworks/tensorflow/integration_keras_tuner',
    'guides/frameworks/lightgbm/lightgbm_example', 'guides/frameworks/matplotlib/matplotlib_example',
    'guides/frameworks/megengine/megengine_mnist', 'integrations/openmmv', 'integrations/optuna',
    'integrations/python_fire', 'guides/frameworks/pytorch/pytorch_mnist',
    {'PyTorch Ignite':['guides/frameworks/pytorch_ignite/integration_pytorch_ignite', 'guides/frameworks/pytorch_ignite/pytorch_ignite_mnist']},
    'guides/frameworks/pytorch_lightning/pytorch_lightning_example', 'guides/frameworks/scikit-learn/sklearn_joblib_example',
    'guides/frameworks/pytorch/pytorch_tensorboard', 'guides/frameworks/tensorboardx/tensorboardx', 'guides/frameworks/tensorflow/tensorflow_mnist',
    'integrations/seaborn', 'guides/frameworks/xgboost/xgboost_metrics'
  ]
},
'integrations/storage',
{'WebApp': ['webapp/webapp_overview', 'webapp/webapp_home',
{