HuggingFace's [Transformers](https://huggingface.co/docs/transformers/index) is a popular deep learning framework. You can
seamlessly integrate ClearML into your Transformers PyTorch [Trainer](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/trainer)
code using the built-in [`ClearMLCallback`](https://huggingface.co/docs/transformers/v4.34.1/en/main_classes/callback#transformers.integrations.ClearMLCallback).
ClearML automatically logs Transformers models, parameters, scalars, and more.
All you have to do is install and set up ClearML:
1. Install the `clearml` python package:
```commandline
pip install clearml
```
1. To keep track of your experiments and/or data, ClearML needs to communicate with a server. You have two options:
* Sign up for free to the [ClearML Hosted Service](https://app.clear.ml/)
* Set up your own server; see [here](../deploying_clearml/clearml_server.md) for instructions.
1. Connect the ClearML SDK to the server by creating credentials (in the top right of the UI, go to **Settings > Workspace > Create new credentials**),
then execute the command below and follow the instructions:
```commandline
clearml-init
```
That’s it! In every training run from now on, the ClearML experiment manager will capture your models, parameters, scalars, and more.
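
For example, below is a minimal sketch of a `Trainer` script. Once the `clearml` package is installed and configured as described above, the `ClearMLCallback` is enabled and the run is logged to ClearML. The model, dataset, and hyperparameters used here (`bert-base-cased`, `yelp_review_full`, a single epoch on a small subset) are illustrative placeholders, not requirements.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Small illustrative dataset and model; swap in your own data and checkpoint
dataset = load_dataset("yelp_review_full", split="train[:1000]")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")


def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)


dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=5
)

training_args = TrainingArguments(
    output_dir="hf_clearml_example",
    num_train_epochs=1,
    # With the default `report_to="all"`, any installed integration
    # (including ClearML) is used automatically; it can also be
    # requested explicitly:
    report_to=["clearml"],
)

trainer = Trainer(model=model, args=training_args, train_dataset=dataset)
trainer.train()  # parameters, scalars, and checkpoints are logged to ClearML
```

Once training starts, the run appears as an experiment in the ClearML UI, where you can inspect the logged parameters, scalars, and model artifacts.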