mirror of
https://github.com/clearml/clearml-docs
synced 2025-01-30 22:18:02 +00:00
Add HuggingFace Transformers example (#716)
This commit is contained in:
parent
e844872d4e
commit
dbf6c85a81
53
docs/guides/frameworks/huggingface/transformers.md
Normal file
@@ -0,0 +1,53 @@
---
title: Transformers
---

The [HuggingFace Transformers example](https://github.com/allegroai/clearml/blob/master/examples/frameworks/huggingface/transformers.ipynb)
demonstrates how to integrate ClearML into your Transformers PyTorch [Trainer](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/trainer)
code. When ClearML is installed in an environment, the Trainer by default uses the built-in [`ClearMLCallback`](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/callback#transformers.integrations.ClearMLCallback),
so ClearML automatically logs Transformers models, parameters, scalars, and more.

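Since automatic logging hinges on the `clearml` package being importable, you can verify that condition yourself before training. A minimal sketch (the `clearml_available` helper is just an illustration, mirroring a standard importability check):

```python
import importlib.util

# The Trainer attaches ClearMLCallback only when the clearml package is
# importable in the current environment; this mirrors that check.
def clearml_available() -> bool:
    return importlib.util.find_spec("clearml") is not None

if clearml_available():
    print("clearml installed: ClearMLCallback will log automatically")
else:
    print("clearml not installed: no automatic ClearML logging")
```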
When the example runs, it creates a ClearML task called `Trainer` in the `HuggingFace Transformers` project. To change
the task's name or project, use the `CLEARML_TASK` and `CLEARML_PROJECT` environment variables respectively.

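For example, the overrides can be set from Python before the Trainer is created. A minimal sketch (the project and task names here are placeholder values, not ones from the example):

```python
import os

# Placeholder names; ClearMLCallback reads these variables when the task
# is initialized, so set them before training starts.
os.environ["CLEARML_PROJECT"] = "My Transformers Project"
os.environ["CLEARML_TASK"] = "My Trainer Run"
```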
For more information about integrating ClearML into your Transformers code, see [HuggingFace Transformers](../../../integrations/transformers.md).

## WebApp

### Hyperparameters

ClearML automatically captures all the PyTorch trainer [parameters](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/trainer#transformers.TrainingArguments).
Notice in the code example that only a few of the `TrainingArguments` are explicitly set:

```python
training_args = TrainingArguments(
    output_dir="path/to/save/folder/",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=2,
)
```

ClearML captures the arguments specified in the preceding code, as well as the rest of the `TrainingArguments` and their
default values.

View the parameters in the experiment's **CONFIGURATION** tab **>** **Hyperparameters** section.

![Transformers params](../../../img/examples_transformers_params.png)

### Models

For ClearML to log the models created during training, the `CLEARML_LOG_MODEL` environment variable must be set to `True`.

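This can be done in the shell before launching the script, or from Python before training starts. A sketch, assuming the variable is read as a string when the callback initializes:

```python
import os

# CLEARML_LOG_MODEL must be set before the Trainer begins saving
# checkpoints; the value is read as the string "True".
os.environ["CLEARML_LOG_MODEL"] = "True"
```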
ClearML automatically captures the model snapshots created by the Trainer, and saves them as artifacts. View the snapshots in the
experiment's **ARTIFACTS** tab.

![Transformers models](../../../img/examples_transformers_artifacts.png)

### Scalars

ClearML automatically captures the Trainer's scalars, which can be viewed in the experiment's **SCALARS** tab.

![Transformers scalars](../../../img/integrations_transformers_scalars.png)
BIN
docs/img/examples_transformers_artifacts.png
Normal file
Binary file not shown.
Size: 38 KiB
BIN
docs/img/examples_transformers_params.png
Normal file
Binary file not shown.
Size: 93 KiB
@@ -152,7 +152,8 @@ module.exports = {
         'guides/frameworks/autokeras/autokeras_imdb_example',
         'guides/frameworks/catboost/catboost',
         'guides/frameworks/fastai/fastai_with_tensorboard',
+        {'HuggingFace': ['guides/frameworks/huggingface/transformers']},
         {'Keras': ['guides/frameworks/keras/jupyter', 'guides/frameworks/keras/keras_tensorboard']},
         'guides/frameworks/lightgbm/lightgbm_example',
         'guides/frameworks/matplotlib/matplotlib_example',
         'guides/frameworks/megengine/megengine_mnist',