mirror of https://github.com/clearml/clearml-docs
synced 2025-04-06 22:26:36 +00:00

Small edits (#691)

This commit is contained in:
parent e6257d2843
commit a8be5b50c8
@@ -642,7 +642,6 @@ logger.report_scatter2d(
     xaxis="title x",
     yaxis="title y"
 )
-
 ```
 
 ## GIT and Storage
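For context, the hunk above shows only the tail of a `report_scatter2d` call. A minimal, self-contained sketch of reporting a 2D scatter with axis titles might look like the following (the project/task names, series name, and random data are placeholders, not taken from the docs page):

```python
import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="2D scatter report")  # placeholder names
logger = task.get_logger()

# report_scatter2d expects an Nx2 array of (x, y) points
scatter2d = np.hstack(
    (np.atleast_2d(np.arange(0, 10)).T, np.random.randint(10, size=(10, 1)))
)
logger.report_scatter2d(
    title="example_scatter",
    series="series_xy",
    iteration=0,
    scatter=scatter2d,
    xaxis="title x",
    yaxis="title y",
)
```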
@@ -18,8 +18,7 @@ ClearML automatically captures scalars logged by CatBoost. These scalars can be
 ![…](…)
 
 ## Hyperparameters
-ClearML automatically logs command line options defined with argparse. They appear in **CONFIGURATIONS > HYPER
-PARAMETERS > Args**.
+ClearML automatically logs command line options defined with argparse. They appear in **CONFIGURATIONS > HYPERPARAMETERS > Args**.
 
 ![…](…)
 
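As a quick illustration of the argparse auto-logging described in the hunk above, a minimal sketch (the script, project, and argument names are made up for the example) is:

```python
from argparse import ArgumentParser
from clearml import Task

parser = ArgumentParser()
parser.add_argument("--iterations", type=int, default=100)       # hypothetical CLI options
parser.add_argument("--learning-rate", type=float, default=0.1)

# Once a Task exists, parsed command line options are captured automatically
task = Task.init(project_name="examples", task_name="CatBoost example")  # placeholder names
args = parser.parse_args()
# args now also appear in the UI under CONFIGURATIONS > HYPERPARAMETERS > Args
```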
@@ -13,7 +13,7 @@ The example script does the following:
 
 ## Scalars
 
-The scalars logged in the experiment can be visualized in a plot, which appears in the ClearML web UI, in the **experiment's page > SCALARS**.
+The scalars logged in the experiment can be visualized in a plot, which appears in the ClearML web UI, in the experiment's **SCALARS** tab.
 
 ![…](…)
 
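Besides viewing them in the **SCALARS** tab, reported scalars can also be fetched back through the SDK; a rough sketch (the task ID is a placeholder) is:

```python
from clearml import Task

task = Task.get_task(task_id="<your_task_id>")  # placeholder ID
scalars = task.get_reported_scalars()
# returns e.g. {"loss": {"series_name": {"x": [...], "y": [...]}}, ...}
for title, series_dict in scalars.items():
    for series, points in series_dict.items():
        print(title, series, len(points["x"]), "points")
```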
@@ -28,7 +28,8 @@ on `Task.current_task` (the main Task). The dictionary contains the `dist.rank`
 
 ```python
 Task.current_task().upload_artifact(
-    'temp {:02d}'.format(dist.get_rank()), artifact_object={'worker_rank': dist.get_rank()})
+    'temp {:02d}'.format(dist.get_rank()), artifact_object={'worker_rank': dist.get_rank()}
+)
 ```
 
 All of these artifacts appear in the main Task, **ARTIFACTS** **>** **OTHER**.
@@ -43,7 +44,8 @@ same title (`loss`), but a different series name (containing the subprocess' `ra
 
 ```python
 Task.current_task().get_logger().report_scalar(
-    'loss', 'worker {:02d}'.format(dist.get_rank()), value=loss.item(), iteration=i)
+    'loss', 'worker {:02d}'.format(dist.get_rank()), value=loss.item(), iteration=i
+)
 ```
 
 The single scalar plot for loss appears in **SCALARS**.
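Putting the two snippets from the hunks above together, a worker subprocess in the distributed example roughly does the following; the sketch assumes `i` and `loss` are the training loop's iteration counter and loss tensor, and omits the training details:

```python
import torch.distributed as dist
from clearml import Task

def worker_step(i, loss):
    # Every subprocess reports through the single main Task
    task = Task.current_task()
    # One artifact per worker, keyed by the worker's rank
    task.upload_artifact(
        'temp {:02d}'.format(dist.get_rank()),
        artifact_object={'worker_rank': dist.get_rank()},
    )
    # All workers report to the same scalar title, each with its own rank-based series
    task.get_logger().report_scalar(
        'loss', 'worker {:02d}'.format(dist.get_rank()), value=loss.item(), iteration=i
    )
```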
@@ -70,25 +70,29 @@ clearml_logger.attach(
 * Log metrics for training:
 
   ```python
-  clearml_logger.attach(train_evaluator,
+  clearml_logger.attach(
+      train_evaluator,
       log_handler=OutputHandler(
           tag="training",
           metric_names=["nll", "accuracy"],
           global_step_transform=global_step_from_engine(trainer)
       ),
-      event_name=Events.EPOCH_COMPLETED)
+      event_name=Events.EPOCH_COMPLETED
+  )
   ```
 
 * Log metrics for validation:
 
   ```python
-  clearml_logger.attach(evaluator,
+  clearml_logger.attach(
+      evaluator,
       log_handler=OutputHandler(
           tag="validation",
           metric_names=["nll", "accuracy"],
           global_step_transform=global_step_from_engine(trainer)
       ),
-      event_name=Events.EPOCH_COMPLETED)
+      event_name=Events.EPOCH_COMPLETED
+  )
   ```
 
 To log optimizer parameters, use the `attach_opt_params_handler` method:
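For reference, a hedged sketch of the optimizer-parameter logging mentioned in the last context line above (it assumes an existing Ignite `trainer` engine and a PyTorch `optimizer`; the helper name is made up):

```python
from ignite.engine import Events
from ignite.contrib.handlers.clearml_logger import ClearMLLogger

def attach_lr_logging(clearml_logger: ClearMLLogger, trainer, optimizer):
    # Log the optimizer's learning rate at the start of every iteration
    clearml_logger.attach_opt_params_handler(
        trainer,
        event_name=Events.ITERATION_STARTED,
        optimizer=optimizer,
        param_name="lr",
    )
```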
@@ -29,7 +29,8 @@ tuner = kt.Hyperband(
     logger=ClearMLTunerLogger(),
     objective='val_accuracy',
     max_epochs=10,
-    hyperband_iterations=6)
+    hyperband_iterations=6
+)
 ```
 
 When the script runs, it logs:
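The tuner definition reformatted above comes from the Keras Tuner integration; a rough, self-contained sketch (the `build_model` function and project name are placeholders, and the exact import path/casing of the tuner logger may differ between clearml versions) might be:

```python
import keras_tuner as kt
from clearml.external.kerastuner import ClearMLTunerLogger  # import path may vary by clearml version

def build_model(hp):
    ...  # placeholder: return a compiled Keras model built from hp.Int / hp.Choice, etc.

tuner = kt.Hyperband(
    build_model,
    project_name="kt_example",      # Keras Tuner's own results directory (placeholder)
    logger=ClearMLTunerLogger(),    # streams tuner results to ClearML
    objective='val_accuracy',
    max_epochs=10,
    hyperband_iterations=6,
)
```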
@@ -23,7 +23,7 @@ The following search strategies can be used:
 documentation.
 
 * Random uniform sampling of hyperparameter strategy - [automation.RandomSearch](../../../references/sdk/hpo_optimization_randomsearch.md)
-* Full grid sampling strategy of every hyperparameter combination - Grid search [automation.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md).
+* Full grid sampling strategy of every hyperparameter combination - [automation.GridSearch](../../../references/sdk/hpo_optimization_gridsearch.md).
 * Custom - Use a custom class and inherit from the ClearML automation base strategy class, automation.optimization.SearchStrategy.
 
 The search strategy class that is chosen will be passed to the [automation.HyperParameterOptimizer](../../../references/sdk/hpo_optimization_hyperparameteroptimizer.md)
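The chosen strategy class from the list in the hunk above is handed to `HyperParameterOptimizer` through its `optimizer_class` argument; a minimal sketch (the base task ID, parameter range, and metric names are illustrative only) is:

```python
from clearml.automation import GridSearch, HyperParameterOptimizer, UniformIntegerParameterRange

an_optimizer = HyperParameterOptimizer(
    base_task_id="<template_task_id>",   # placeholder: the experiment to optimize
    hyper_parameters=[
        UniformIntegerParameterRange("General/epochs", min_value=2, max_value=12, step_size=2),
    ],
    objective_metric_title="accuracy",
    objective_metric_series="total",
    objective_metric_sign="max",
    optimizer_class=GridSearch,          # the chosen search strategy class
)
```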
@@ -73,8 +73,8 @@ can be [reproduced](../../../webapp/webapp_exp_reproducing.md) and [tuned](../..
 
 Set the Task type to `optimizer`, and create a new experiment (and Task object) each time the optimizer runs (`reuse_last_task_id=False`).
 
-When the code runs, it creates an experiment named **Automatic Hyper-Parameter Optimization** that is associated with
-the project **Hyper-Parameter Optimization**, which can be seen in the **ClearML Web UI**.
+When the code runs, it creates an experiment named **Automatic Hyper-Parameter Optimization** in
+the **Hyper-Parameter Optimization** project, which can be seen in the **ClearML Web UI**.
 
 ```python
 # Connecting CLEARML
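The code block that the hunk above truncates continues with a `Task.init` call along these lines, a sketch built only from the parameters named in the prose (`optimizer` task type, `reuse_last_task_id=False`, and the experiment/project names):

```python
from clearml import Task

# Connecting ClearML with the current process
task = Task.init(
    project_name="Hyper-Parameter Optimization",
    task_name="Automatic Hyper-Parameter Optimization",
    task_type=Task.TaskTypes.optimizer,
    reuse_last_task_id=False,
)
```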
@@ -174,7 +174,6 @@ Specify the remaining parameters, including the time limit per Task (minutes), p
     max_iteration_per_job=30,
 
 ) # done creating HyperParameterOptimizer
-
 ```
 
 ## Running as a Service
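After the `HyperParameterOptimizer` construction that the hunk above closes, the optimizer is typically given a time budget and launched; a hedged sketch (the exact arguments and values are assumptions, check the SDK reference) is:

```python
# `an_optimizer` is the HyperParameterOptimizer created above
an_optimizer.set_time_limit(in_minutes=120.0)   # overall time budget for the optimization
an_optimizer.start()                            # launch (start_locally() runs without agents)
an_optimizer.wait()                             # block until the time limit or completion
top_experiments = an_optimizer.get_top_experiments(top_k=3)
an_optimizer.stop()
```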
@@ -56,7 +56,6 @@ Logger.current_logger().report_media(
     iteration=iteration,
     local_path="bar_pandas_groupby_nested.html",
 )
-
 ```
 
 ### Bokeh Graph HTML
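For context, `report_media` with a `local_path` is how HTML files (including the Bokeh graph in the next subsection) get attached to the task; a minimal sketch for an arbitrary locally saved HTML file (file name, titles, and project/task names are placeholders) is:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="HTML reporting")  # placeholder names
task.get_logger().report_media(
    title="html",
    series="my chart",
    iteration=0,
    local_path="my_chart.html",   # any local HTML file, e.g. one produced by bokeh.io.save()
)
```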
@@ -74,7 +74,6 @@ parameters['new_param'] = 'this is new'
 
 # changing the value of a parameter (new value will be stored instead of previous one)
 parameters['float'] = '9.9'
-
 ```
 
 Parameters from dictionaries connected to Tasks appear in **HYPERPARAMETERS** **>** **General**.
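The dictionary mutated in the hunk above is one that was connected to the Task earlier in the script; a condensed sketch of that pattern (the dictionary contents and project/task names are illustrative) is:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="hyperparameters example")  # placeholder names

parameters = {"float": 2.2, "string": "my string"}
parameters = task.connect(parameters)

# adding a parameter after the connect call still gets logged
parameters["new_param"] = "this is new"

# changing the value of a parameter (the new value is stored instead of the previous one)
parameters["float"] = "9.9"
```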
@@ -319,7 +319,6 @@ frame.meta['road_hazard'] = 'yes'
 # update the SingeFrame
 frames.append(frame)
 myDatasetVersion.update_frames(frames)
-
 ```
 
 