for different packages / package versions, or worse - manage different Docker containers for different package versions.
Not to mention that when working on remote machines, you also have to execute experiments, track what's running where, and make sure the machines are fully utilized at all times.
ClearML Agent was designed to deal with these and more! It is a module responsible for executing experiments
on remote machines, on-premises, or in the cloud!
It will set up the environment for the specific Task (inside a Docker container or on bare metal), install the required Python packages, and execute and monitor the process itself.
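For example, an agent can be launched to serve a queue like so (the queue name `default` and the Docker image are placeholders; substitute whatever you use):

```shell
# Start an agent that pulls and executes Tasks from the "default" queue
clearml-agent daemon --queue default

# Or run each Task inside a Docker container (image shown is only an example)
clearml-agent daemon --queue default --docker nvidia/cuda:11.0.3-cudnn8-runtime-ubuntu20.04
```

The agent keeps polling the queue, so a single command is enough to turn any machine into a worker.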
If you've already created credentials, you can copy-paste the default agent section from [here](https://github.com/allegroai/clearml-agent/blob/master/docs/clearml.conf#L15) (this is optional; if the section is not provided, the default values will be used).
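As an abridged sketch, the `agent` section of `clearml.conf` looks roughly like the following (only a few common options are shown here; consult the linked file for the authoritative, complete list):

```
agent {
    # Git credentials used to clone private repositories
    git_user: ""
    git_pass: ""

    # Python package manager used to set up the environment
    package_manager: {
        type: pip,
    }

    # Default Docker image used when running Tasks in Docker mode
    default_docker: {
        image: "nvidia/cuda:11.0.3-cudnn8-runtime-ubuntu20.04"
    }
}
```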
Creating a new "job" to be executed is essentially cloning a Task in the system, and then enqueuing it in one of the execution queues for an agent to execute.
When cloning a Task, we create another copy of the Task in *draft* mode, which allows us to edit the Task's environment definitions.
We can edit the git / code references, control the Python packages to be installed, specify the Docker container image to be used, or change the hyperparameters and configuration files.
Once we are done, we enqueue the Task in one of the execution queues, where an agent can pick it up and execute it.
Multiple agents can listen to the same queue (or even multiple queues), but only a single agent will pick the Task to be executed.
You can clone an experiment from our [examples](https://app.community.clear.ml/projects/764d8edf41474d77ad671db74583528d/experiments) project and enqueue it to a queue!
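The clone-and-enqueue flow can also be done from code with the ClearML SDK. A minimal sketch (the project name, task name, queue name, and parameter name below are placeholders, not values from this document):

```python
from clearml import Task

# Fetch an existing Task to use as a template
# (project/task names are placeholders -- substitute your own)
template_task = Task.get_task(project_name="examples", task_name="my experiment")

# Clone it -- the copy is created in *draft* mode and can still be edited
cloned_task = Task.clone(source_task=template_task, name="cloned experiment")

# Optionally override a hyperparameter on the draft copy
# ("General/learning_rate" is an example parameter name)
cloned_task.set_parameter("General/learning_rate", 0.001)

# Enqueue the draft; an agent listening on the "default" queue will pick it up
Task.enqueue(task=cloned_task, queue_name="default")
```

Running this requires a configured ClearML server connection, so it is shown here only as an illustration of the API shape.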
### Accessing Previously Executed Experiments
All executed Tasks in the system can be accessed by their unique Task ID, or by searching for Tasks based on their properties.
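A sketch of both access patterns with the SDK (the ID, project name, and artifact name are placeholders; a configured server connection is assumed):

```python
from clearml import Task

# Fetch a single Task by its unique ID (placeholder shown)
task = Task.get_task(task_id="<task-id>")

# Or search by properties, e.g. Tasks in a project matching a name
tasks = Task.get_tasks(project_name="examples", task_name="my experiment")

# Once fetched, the Task's logged objects are accessible, e.g.:
scalars = task.get_reported_scalars()
artifact = task.artifacts["<artifact-name>"].get()
```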
Artifacts are the basis of ClearML's [Data Management](../../clearml_data/clearml_data.md) solution and a way to communicate complex objects between different Tasks.
[ClearML Data](../../clearml_data/clearml_data.md) allows you to version your data so it's never lost, and fetch it from any machine with minimal code changes.
Logging data can be done via the command line or via code. If any preprocessing code is involved, ClearML logs it as well! Once data is logged, it can be used by other experiments.
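A command-line sketch of creating and uploading a dataset version (the project name, dataset name, and folder path are placeholders):

```shell
# Create a new dataset version (project/name are placeholders)
clearml-data create --project "datasets" --name "my dataset"

# Add files from a local folder to the current dataset version
clearml-data add --files ./data

# Upload the files and finalize the version
clearml-data close
```

Each `create` / `add` / `close` cycle produces an immutable dataset version that other experiments can then fetch by ID.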