---
title: Tracking Experiments and Visualizing Results
---
While an experiment is running, and any time after it finishes, track it and visualize the results in the ClearML Web UI,
including:
* [Execution details](#execution) - Code, the base Docker image used for [ClearML Agent](../clearml_agent.md), output destination for artifacts, and the logging level.
* [Configuration](#configuration) - Hyperparameters, user properties, and configuration objects.
* [Artifacts](#artifacts) - Input model, output model, model snapshot locations, other artifacts.
* [General information](#general-information) - Information about the experiment, for example: the experiment's start, creation, and last update times and dates, the user who created the experiment, and its description.
* [Console](#console) - stdout, stderr, output to the console from libraries, and ClearML explicit reporting.
* [Scalars](#scalars) - Metric plots.
* [Plots](#plots) - Other plots and data, for example: Matplotlib, Plotly, and ClearML explicit reporting.
* [Debug samples](#debug-samples) - Images, audio, video, and HTML.
## Viewing Modes
The ClearML Web UI provides two viewing modes for experiment details:
* The info panel
* Full screen details mode.
Both modes contain all experiment details. When either view is open, switch to the other mode by clicking the
**View in experiments table / full screen** button, or by clicking **menu** > **View in experiments
table / full screen**.
### Info Panel
The info panel keeps the experiment table in view so that [experiment actions](webapp_exp_table.md#experiment-actions)
can be performed from the table (as well as the menu in the info panel).
![Info panel](../img/webapp_tracking_40.png)
Click the compress button to hide details in the experiment table, so that only the experiment names and statuses
are displayed.
![Compressed info panel](../img/webapp_tracking_41.png)
### Full Screen Details View
The full screen details view makes it easier to view and work with experiment tracking and results. The experiments
table is not visible when the full screen details view is open. Perform experiment actions from the menu.
![Full screen view](../img/webapp_tracking_33.png)
## Execution
An experiment's **EXECUTION** tab lists the following:
* Source code
* Uncommitted changes
* Installed Python packages
* Container details
* Output details
In full-screen mode, the source code and output details are grouped in the **DETAILS** section.
### Source Code
The Source Code section of an experiment's **EXECUTION** tab includes:
* The experiment's repository
* Commit ID
* Script path
* Working directory
* Binary (Python executable)
![Source code section](../img/webapp_exp_source_code.png)
### Uncommitted Changes
ClearML displays the git diff of the experiment in the Uncommitted Changes section.
![Uncommitted changes section](../img/webapp_exp_uncommitted_changes.png)
### Installed Packages
The Installed Packages section lists the experiment's installed Python packages and their versions.
![Installed packages section](../img/webapp_exp_installed_packages.png)
When a ClearML agent executing an experiment ends up using a different set of Python packages than was originally
specified, both the original specification (`original pip` or `original conda`) and the packages the agent actually
used to set up the environment (`pip` or `conda`) are available. Select which requirements to view in the dropdown menu.
![Packages used by agent](../img/webapp_exp_installed_packages_2.png)
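If a package that auto-detection might miss needs to be recorded in the specification, it can be declared in code before the task starts. A minimal sketch, assuming a recent `clearml` SDK (the package name and version are placeholders):

```python
from clearml import Task

# Explicitly add a package to the task's requirements specification.
# Must be called before Task.init() so it is recorded with the task.
Task.add_requirements("scikit-learn", "1.3.2")  # placeholder package/version

task = Task.init(project_name="examples", task_name="requirements demo")
```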
### Container
The Container section lists the following information:
* Image - the pre-configured Docker image that ClearML Agent will use to remotely execute this experiment (see [Building Docker containers](../clearml_agent.md#exporting-a-task-into-a-standalone-docker-container))
* Arguments - Docker command-line arguments
* Setup shell script - a bash script to be executed inside the Docker container before setting up the experiment's environment
![Container section](../img/webapp_exp_container.png)
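These container settings can also be set from code with `Task.set_base_docker()`. A minimal sketch; the image, arguments, and script lines are placeholders, and the keyword names assume a recent `clearml` SDK:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="container demo")

# Configure the container the agent will use for remote execution
task.set_base_docker(
    docker_image="nvidia/cuda:11.8.0-runtime-ubuntu22.04",  # placeholder image
    docker_arguments="--ipc=host",
    docker_setup_bash_script=["apt-get update", "apt-get install -y git"],
)
```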
### Output
The Output details include:
* The output destination used for storing model checkpoints (snapshots) and artifacts (see also [default_output_uri](../configs/clearml_conf.md#config_default_output_uri)
in the configuration file, and the `output_uri` parameter of [`Task.init`](../references/sdk/task.md#taskinit)).
![Execution details section](../img/webapp_exp_output.png)
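For example, the output destination can be set per task through `output_uri`. A minimal sketch (the bucket URI is a placeholder):

```python
from clearml import Task

# Model checkpoints and artifacts will be uploaded to this destination
task = Task.init(
    project_name="examples",
    task_name="training",
    output_uri="s3://my-bucket/clearml-output",  # placeholder bucket
)
```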
## Configuration
All parameters and configuration objects appear in the **CONFIGURATION** tab.
### Hyperparameters
Hyperparameters are grouped by their type and appear in **CONFIGURATION** **>** **HYPERPARAMETERS**. Once an experiment
is run and stored in ClearML Server, any of these hyperparameters can be [modified](webapp_exp_tuning.md#modifying-experiments).
#### Command Line Arguments
The **Args** group shows automatically logged argument parser parameters (e.g. `argparse`, `click`, `hydra`).
Hover over a parameter's line, and its type, description, and default value appear, if they were provided.
![Command line arguments configuration group](../img/webapp_tracking_22.png)
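For example, `argparse` arguments are captured automatically once `Task.init()` is called. A minimal sketch:

```python
import argparse

from clearml import Task

task = Task.init(project_name="examples", task_name="args demo")

parser = argparse.ArgumentParser()
parser.add_argument("--lr", type=float, default=0.001, help="learning rate")
parser.add_argument("--epochs", type=int, default=10, help="training epochs")
args = parser.parse_args()
# --lr and --epochs now appear under CONFIGURATION > HYPERPARAMETERS > Args
```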
#### Environment Variables
If the `CLEARML_LOG_ENVIRONMENT` variable was set, the **Environment** group will show environment variables (see [this FAQ](../faq.md#track-env-vars)).
![Environment variables configuration group](../img/webapp_tracking_23.png)
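The variable accepts a comma-separated list of variable names (or `*` for all). One way to set it, shown from Python for illustration and assuming it is read when `Task.init()` runs (it can equally be set in the shell before launching the script):

```python
import os

# Must be set before Task.init(); the variable names are placeholders
os.environ["CLEARML_LOG_ENVIRONMENT"] = "DATA_ROOT,RUN_MODE"

from clearml import Task

task = Task.init(project_name="examples", task_name="env demo")
```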
#### Custom Parameter Groups
Custom parameter groups show parameter dictionaries if the parameters were connected to the Task, using
[`Task.connect()`](../references/sdk/task.md#connect) with a `name` argument provided. `General` is the default section
if a name is not provided.
![Custom parameters group](../img/webapp_tracking_25.png)
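A minimal sketch of connecting a parameter dictionary under a custom group name:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="params demo")

params = {"batch_size": 64, "dropout": 0.25}
# "Training" becomes the parameter group name in the UI; omitting
# `name` places the parameters in the General section
params = task.connect(params, name="Training")
```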
#### TensorFlow Definitions
The **TF_DEFINE** parameter group shows automatic TensorFlow logging.
![TF_DEFINE parameter group](../img/webapp_tracking_26.png)
### User Properties
User properties let you store any descriptive information in a key-value pair format. They are editable in any experiment,
except experiments whose status is *Published* (read-only).
![User properties section](../img/webapp_tracking_21.png)
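User properties can also be set from code with `Task.set_user_properties()`. A minimal sketch (the keys and values are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="properties demo")

# Each key-value pair becomes an editable entry in USER PROPERTIES
task.set_user_properties(dataset_version="v2.1", reviewed="no")
```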
### Configuration Objects
ClearML tracks experiment (Task) model configuration objects, which appear in **Configuration Objects** **>** **General**.
These objects include those that are automatically tracked, and those connected to a Task in code (see [`Task.connect_configuration`](../references/sdk/task.md#connect_configuration)).
![Configuration objects](../img/webapp_tracking_24.png)
ClearML supports providing a name for a Task model configuration object (see the `name`
parameter in [`Task.connect_configuration`](../references/sdk/task.md#connect_configuration)).
![Custom configuration objects](../img/webapp_tracking_28.png)
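A minimal sketch of connecting a named configuration object:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="config demo")

model_config = {"layers": 4, "activation": "relu"}
# `name` sets the object's section under CONFIGURATION OBJECTS;
# without it, the object appears under General
model_config = task.connect_configuration(model_config, name="model config")
```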
## Artifacts
Artifacts tracked in an experiment appear in the **ARTIFACTS** tab, and include models and other artifacts.
An artifact's location is stored in its `FILE PATH` field.
For locally stored artifacts, the UI provides a *copy to clipboard* action
to facilitate local storage access (since web applications are prohibited from accessing the local disk for security reasons).
For network-hosted artifacts (e.g. `https://`, `s3://` URIs), the UI provides a download action
to retrieve these files.
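Artifacts shown in this tab are typically logged with `Task.upload_artifact()`. A minimal sketch:

```python
import pandas as pd

from clearml import Task

task = Task.init(project_name="examples", task_name="artifacts demo")

df = pd.DataFrame({"epoch": [1, 2], "accuracy": [0.91, 0.94]})
# The uploaded object appears in the ARTIFACTS tab, with its
# FILE PATH field pointing at the upload destination
task.upload_artifact(name="results", artifact_object=df)
```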
### Models
The input and output models appear in the **ARTIFACTS** tab. Models are associated with the experiment, but further model details,
including design, label enumeration, and general information, are shown in the **MODELS** tab. To get there, click the model name, which is a hyperlink to those details.
**To retrieve a model:**
1. In the **ARTIFACTS** tab **>** **MODELS** **>** **Input Model** or **Output Model**, click the model name hyperlink.
1. In the model details **>** **GENERAL** tab **>** **MODEL URL**, either:
* Download the model, if it is stored in remote storage.
* Copy its location to the clipboard, if it is in a local file.
![Models in Artifacts tab](../img/webapp_exp_artifacts_01.png)
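Models can also be retrieved programmatically. A minimal sketch, assuming the task ID is known (the ID below is a placeholder):

```python
from clearml import Task

task = Task.get_task(task_id="<task-id>")  # placeholder task ID

models = task.get_models()  # {"input": [...], "output": [...]}
for model in models["output"]:
    print(model.name, model.url)
    local_path = model.get_local_copy()  # download (and cache) the model file
```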
### Other Artifacts
Other artifacts, which are uploaded but not dynamically tracked after the upload, appear in the **OTHER** section.
They include the file path, file size, and hash.
**To retrieve Other artifacts:**
In the **ARTIFACTS** tab **>** **OTHER** **>** Select an artifact **>** Either:
* Download the artifact, if it is stored in remote storage.
* Copy its location to the clipboard, if it is in a local file.
![Other artifacts section](../img/webapp_tracking_30.png)
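These artifacts can also be retrieved programmatically through the task's artifacts dictionary. A minimal sketch (the task ID and artifact name are placeholders):

```python
from clearml import Task

task = Task.get_task(task_id="<task-id>")  # placeholder task ID

# Look up the artifact by the name it was logged under
local_copy = task.artifacts["results"].get_local_copy()
```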
## General Information
General experiment details appear in the **INFO** tab. This includes information describing the stored experiment:
* The parent experiment
* Project name
* Creation, start, and last update dates and times
* User who created the experiment
* Experiment state (status)
* Whether the experiment is archived
* Runtime properties - Information about the machine running the experiment, including:
* Operating system
* CUDA driver version
* Number of CPU cores
* Number of GPUs
* CPU / GPU type
* Memory size
* Host name
* Processor
* Python version
* Experiment Progress
![Info tab](../img/webapp_tracking_31.png)
## Experiment Results
:::tip Embedding ClearML Visualization
You can embed experiment plots and debug samples into ClearML [Reports](webapp_reports.md). These visualizations are
updated live as the experiment(s) update. The Enterprise Plan and Hosted Service support embedding resources in external
tools (e.g. Notion). See [Plot Controls](#plot-controls).
:::
### Console
The complete experiment log containing everything printed to stdout and stderr appears in the **CONSOLE** tab. The full log
is downloadable. To view the end of the log, click **Jump to end**.
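In addition to captured stdout and stderr, text can be reported to the console log explicitly. A minimal sketch:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="console demo")

print("this line is captured automatically")           # stdout
task.get_logger().report_text("explicitly reported")   # explicit reporting
```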