Using only your command line and __zero__ additional lines of code, you can easily integrate the ClearML platform
into your experiment. With the `clearml-task` command, you can create a [Task](https://allegro.ai/clearml/docs/docs/concepts_fundamentals/concepts_fundamentals_tasks.html)
from **any Python script or repository and launch it on a remote machine**.
You will be launching this [script](https://github.com/allegroai/events/blob/master/webinar-0620/keras_mnist.py)
on a remote machine, using the following command-line options:
1. Give the experiment a name and select a project, for example: `--project keras_examples --name remote_test`. If the project
doesn't exist, a new project will be created with the selected name.
2. Select the repository that contains your code, for example: `--repo https://github.com/allegroai/events.git`. You can specify a
branch and/or commit using `--branch <branch_name> --commit <commit_id>`. If you do not specify a
branch or commit, the latest commit on the master branch is used by default.
3. Specify which script in the repository needs to be run, for example: `--script /webinar-0620/keras_mnist.py`.
By default, the execution working directory will be the root of the repository. If you need to change it,
add `--cwd <folder>`.
4. If you need to pass arguments to your script, use `--args`, followed by the arguments.
The argument names should match the script's argparse arguments, but without the `--` prefix: instead
of `--key=value`, use `--args key=value`, for example `--args batch_size=64 epochs=1`.
5. Select the queue for your Task's execution, for example: `--queue default`. If a queue isn't chosen, the Task
will not be executed; it will be left in [draft mode](https://allegro.ai/clearml/docs/docs/concepts_fundamentals/concepts_fundamentals_tasks.html?highlight=draft#task-states-and-state-transitions),
and you can enqueue and execute the Task at a later point.
6. Add required packages. If your repo has a requirements.txt file, you don't need to do anything; `clearml-task`
will automatically find the file and put it in your Task. If your repo does __not__ have a requirements file and
there are packages that are necessary for the execution of your code, use `--packages <package_name>`, for example:
`--packages "keras" "tensorflow>2.2"`.
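
Putting these options together, a full `clearml-task` command for the repository case might look like the following. This is only a sketch that combines the options listed above; substitute your own project, repository, script, and queue names as needed.

```bash
# Create a Task from the example repository and enqueue it for remote execution.
# --branch / --commit are omitted, so the latest commit on the master branch is used.
clearml-task --project keras_examples --name remote_test \
    --repo https://github.com/allegroai/events.git \
    --script /webinar-0620/keras_mnist.py \
    --args batch_size=64 epochs=1 \
    --queue default
```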
You can also launch a single local script file (no git repo needed) on a remote machine:
1. Give the experiment a name and select a project (`--project examples --name remote_test`)
2. Select the script file on your machine, `--script /path/to/my/script.py`
3. If you require specific packages to run your code, you can specify them manually with `--packages "package_name" "package_name2"`,
for example: `--packages "keras" "tensorflow>2.2"`,
or you can pass a requirements file with `--requirements /path/to/my/requirements.txt`.
4. If you need to pass arguments, as in the repository case, add `--args key=value` and make sure that the key names match
the argparse arguments (for example, `--args batch_size=64 epochs=1`).
5. If you have a Docker container with the full environment in which you want your script to run,
add it with `--docker`, for example: `--docker nvcr.io/nvidia/pytorch:20.11-py3`.
6. Select the queue for your Task's execution, for example: `--queue dual_gpu`. If a queue isn't chosen, the Task
will not be executed; it will be left in [draft mode](https://allegro.ai/clearml/docs/docs/concepts_fundamentals/concepts_fundamentals_tasks.html?highlight=draft#task-states-and-state-transitions),
and you can enqueue and execute it at a later point.
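
A full command for the single-script case might look like this. Again, this is only a sketch built from the options listed above; the script path, package names, and Docker image are the placeholder values used in the examples.

```bash
# Create a Task from a single local script (no git repo) and enqueue it on the dual_gpu queue.
clearml-task --project examples --name remote_test \
    --script /path/to/my/script.py \
    --packages "keras" "tensorflow>2.2" \
    --args batch_size=64 epochs=1 \
    --docker nvcr.io/nvidia/pytorch:20.11-py3 \
    --queue dual_gpu
```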