Small edits (#987)

pollfly 2024-12-15 11:53:14 +02:00 committed by GitHub
parent 058e78e313
commit a1d2843a3b
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
7 changed files with 11 additions and 11 deletions

View File

@@ -65,7 +65,7 @@ See the [HyperParameterOptimizer SDK reference page](../references/sdk/hpo_optim
### Pipeline
ClearML's `automation` module includes classes that support creating pipelines:
-* [PipelineController](../pipelines/pipelines_sdk_tasks.md) - A pythonic interface for
+* [PipelineController](../pipelines/pipelines_sdk_tasks.md) - A Pythonic interface for
defining and configuring a pipeline controller and its steps. The controller and steps can be functions in your
python code, or ClearML [tasks](../fundamentals/task.md).
* [PipelineDecorator](../pipelines/pipelines_sdk_function_decorators.md) - A set
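
As a quick illustration of the `PipelineController` interface referenced in this hunk, a minimal sketch might look like the following; the project, task, and step names are placeholders, not taken from the docs page being edited:

```python
from clearml import PipelineController

# Minimal sketch of a pipeline controller with a single step that clones an
# existing ClearML task. All names below are placeholders for illustration.
pipe = PipelineController(
    name="pipeline demo",
    project="examples",
    version="1.0.0",
)
pipe.add_step(
    name="step_1",
    base_task_project="examples",
    base_task_name="step 1 base task",
)

# Run the controller and its steps locally for quick testing
pipe.start_locally(run_pipeline_steps_locally=True)
```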

View File

@@ -170,7 +170,7 @@ If the `secure.conf` file does not exist, create your own in ClearML Server's `/
an alternate folder you configured), and input the modified configuration
:::
-The default secret for the system's apiserver component can be overridden by setting the following environment variable:
+You can override the default secret for the system's `apiserver` component by setting the following environment variable:
`CLEARML__SECURE__CREDENTIALS__APISERVER__USER_SECRET="my-new-secret"`
:::note

View File

@@ -21,7 +21,7 @@ During early stages of model development, while code is still being modified hea
- **Workstation with a GPU**, usually with a limited amount of memory for small batch-sizes. Use this workstation to train
the model and ensure that you choose a model that makes sense, and the training procedure works. Can be used to provide initial models for testing.
-The abovementioned setups might be folded into each other and that's great! If you have a GPU machine for each researcher, that's awesome!
+These setups can be folded into each other and that's great! If you have a GPU machine for each researcher, that's awesome!
The goal of this phase is to get a code, dataset, and environment set up, so you can start digging to find the best model!
- [ClearML SDK](../../clearml_sdk/clearml_sdk.md) should be integrated into your code (check out [Getting Started](ds_first_steps.md)).
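
The last bullet mentions integrating the ClearML SDK into your code; in practice that integration is usually a single `Task.init()` call at the top of the training script. A minimal sketch (the project and task names are placeholders):

```python
from clearml import Task

# One call near the top of the script starts capturing the executed code,
# installed packages, console output, and reported metrics.
# The project and task names are placeholders for illustration.
task = Task.init(project_name="examples", task_name="model training")

# ... the rest of the training code remains unchanged ...
```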

View File

@@ -6,7 +6,7 @@ title: First Steps
## Install ClearML
-First, [sign up for free](https://app.clear.ml)
+First, [sign up for free](https://app.clear.ml).
Install the `clearml` python package:
```bash
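# (continuation not shown in this diff hunk; the clearml package is
# typically installed with pip)
pip install clearml
```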

View File

@@ -46,7 +46,7 @@ We can change the task's name by clicking it here, and add a description or ge
First of all, source code is captured. If you're working in a git repository we'll save your git information along with any uncommitted changes. If you're running an unversioned script, `clearml` will save the script instead.
-Together with the python packages your coded uses, this'll allow you to recreate your experiment on any machine.
+Together with the Python packages your code uses, this will allow you to recreate your experiment on any machine.
Similarly, all of the output the code produces will also be captured.
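
Since this hunk is about recreating a captured experiment on another machine, here is a hedged sketch of how that is typically done through the SDK by cloning and enqueuing the recorded task (the task ID and queue name are placeholders):

```python
from clearml import Task

# Placeholder task ID and queue name, for illustration only.
original = Task.get_task(task_id="<task-id>")

# Clone the captured experiment and send the copy to an agent queue, where it
# is rebuilt from the recorded code, uncommitted changes, and package list.
cloned = Task.clone(source_task=original, name="re-run of captured experiment")
Task.enqueue(cloned, queue_name="default")
```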

View File

@@ -58,7 +58,7 @@ to open the app's instance launch form.
* **Base Docker Image** (optional) - Available when `Use docker mode` is selected: Default Docker image in which the
ClearML Agent will run. Provide an image stored in a Docker artifactory so instances can automatically fetch it
* **Compute Resources**
-* Resource Name - Assign a name to the resource type. This name will appear in the Autoscaler dashboard
+* Resource Name - Assign a name to the resource type. This name will appear in the autoscaler dashboard
* EC2 Instance Type - See [Instance Types](https://aws.amazon.com/ec2/instance-types) for full list of types
* Run in CPU mode - Check box to run with CPU only
* Use Spot Instance - Select to use a spot instance. Otherwise, a reserved instance is used.
@@ -98,7 +98,7 @@ to open the app's instance launch form.
instance. Read more [here](https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html)
* VPC Subnet ID - The subnet ID For the created instance. If more than one ID is provided, the instance will be started in the first available subnet. For more information, see [What is Amazon VPC?](https://docs.aws.amazon.com/vpc/latest/userguide/what-is-amazon-vpc.html)
* \+ Add Item - Define another resource type
-* **IAM Instance Profile** (optional) - Set an IAM instance profile for all instances spun by the Autoscaler
+* **IAM Instance Profile** (optional) - Set an IAM instance profile for all instances spun by the autoscaler
* Arn - Amazon Resource Name specifying the instance profile
* Name - Name identifying the instance profile
* **Autoscaler Instance Name** (optional) - Name for the Autoscaler instance. This will appear in the instance list
@@ -129,7 +129,7 @@ The Configuration Vault is available under the ClearML Enterprise plan.
You can utilize the [configuration vault](../settings/webapp_settings_profile.md#configuration-vault) to set the following:
* `aws_region`
-* `aws_credentials_key_id` and `aws_secret_access_key` - AWS credentials for the Autoscaler
+* `aws_credentials_key_id` and `aws_secret_access_key` - AWS credentials for the autoscaler
* `extra_vm_bash_script` - A bash script to execute after launching the EC2 instance. This script will be appended to
the one set in the `Init script` field of the instance launch form
* `extra_clearml_conf` - ClearML configuration to use by the ClearML Agent when executing your experiments. This
@@ -202,7 +202,7 @@ auto_scaler.v1.aws {
#### Configure Instances Spawned by the Autoscaler
To configure instances spawned by the autoscaler, do any of the following:
* Add the configuration in the `auto_scaler.v1.aws.extra_clearml_conf` field of the configuration vault
-* Run the Autoscaler using a [ClearML Service Account](../settings/webapp_settings_users.md#service-accounts). Add the
+* Run the autoscaler using a [ClearML Service Account](../settings/webapp_settings_users.md#service-accounts). Add the
configuration to the service account's configuration vault, and set the autoscaler to run under that account
in the `Run with Service Account` field
* Admins can add the configuration to a [ClearML Administrator Vault](../settings/webapp_settings_admin_vaults.md)
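
Combining the vault fields listed in the previous hunk with the `auto_scaler.v1.aws` section shown in this one, a configuration-vault entry might look roughly like the sketch below; all values are placeholders and only a subset of the documented keys is shown:

```
auto_scaler.v1.aws {
    # Placeholder values for illustration only
    aws_region: "us-east-1"
    aws_credentials_key_id: "<AWS_ACCESS_KEY_ID>"
    aws_secret_access_key: "<AWS_SECRET_ACCESS_KEY>"
    # ClearML configuration appended for the agents running on spawned instances
    extra_clearml_conf: """
        # e.g. agent or sdk settings for the spawned instances
    """
}
```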

View File

@@ -58,7 +58,7 @@ to open the app's instance launch form.
* **Base Docker Image** (optional) - Available when `Use docker mode` is selected. Default Docker image in which the ClearML Agent will run. Provide an image stored in a
Docker artifactory so VM instances can automatically fetch it
* **Compute Resources**
-* Resource Name - Assign a name to the resource type. This name will appear in the Autoscaler dashboard
+* Resource Name - Assign a name to the resource type. This name will appear in the autoscaler dashboard
* GCP Machine Type - See list of [machine types](https://cloud.google.com/compute/docs/machine-types)
* Run in CPU mode - Select to have the autoscaler utilize only CPU VM instances
* GPU Type - See list of [supported GPUs by instance](https://cloud.google.com/compute/docs/gpus)
@@ -106,7 +106,7 @@ to open the app's instance launch form.
:::important Enterprise Feature
You can utilize the [configuration vault](../settings/webapp_settings_profile.md#configuration-vault) to configure GCP
-credentials for the Autoscaler in the following format:
+credentials for the autoscaler in the following format:
```
auto_scaler.v1 {