Small edits (#433)

pollfly 2023-01-10 10:29:40 +02:00 committed by GitHub
parent 6452838046
commit d0e4d14573
4 changed files with 20 additions and 21 deletions


@@ -94,7 +94,7 @@ continue running. When set to `true`, the agent crashes when encountering an exc
**`agent.disable_ssh_mount`** (*bool*)
-* Set to `true` to disables the auto `.ssh` mount into the docker. The environment variable `CLEARML_AGENT_DISABLE_SSH_MOUNT`
+* Set to `true` to disable the auto `.ssh` mount into the docker. The environment variable `CLEARML_AGENT_DISABLE_SSH_MOUNT`
overrides this configuration option.
___
@@ -340,8 +340,8 @@ ___
**`agent.worker_name`** (*string*)
-* Use to replace the hostname when creating a worker, if `agent.worker_id` is not specified. For example, if `worker_name`
-is `MyMachine` and the process_id is `12345`, then the worker is name `MyMachine.12345`.
+* Use to replace the hostname when creating a worker if `agent.worker_id` is not specified. For example, if `worker_name`
+is `MyMachine` and the `process_id` is `12345`, then the worker is named `MyMachine.12345`.
Alternatively, specify the environment variable `CLEARML_WORKER_ID` to override this worker name.
@@ -420,7 +420,7 @@ match_rules: [
**`agent.package_manager.conda_channels`** (*[string]*)
-* If conda is used, then this is list of conda channels to use when installing Python packages.
+* If conda is used, then this is the list of conda channels to use when installing Python packages.
---
@@ -875,13 +875,13 @@ and limitations on bucket naming.
**`sdk.azure.storage.containers.account_name`** (*string*)
-* For Azure Storage, this is account name.
+* For Azure Storage, this is the account name.
---
**`sdk.azure.storage.containers.container_name`** (*string*)
-* For Azure Storage, this the container name.
+* For Azure Storage, this is the container name.
<br/>
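For orientation, the options touched in the hunks above all live under the `agent` section of `clearml.conf`. A minimal sketch of that section (values are illustrative placeholders, not defaults):

```
agent {
    # overridden by the CLEARML_AGENT_DISABLE_SSH_MOUNT environment variable
    disable_ssh_mount: false

    # replaces the hostname in the worker id, e.g. MyMachine.12345;
    # overridden by the CLEARML_WORKER_ID environment variable
    worker_name: "MyMachine"

    package_manager {
        # conda channels searched, in order, when installing Python packages
        conda_channels: ["defaults", "conda-forge", "pytorch"]
    }
}
```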


@@ -19,13 +19,12 @@ class. The storage examples include:
### Downloading a File
To download a ZIP file from storage to the `global` cache context, call the [StorageManager.get_local_copy](../../references/sdk/storage.md#storagemanagerget_local_copy)
-method, and specify the destination location as the `remote_url` argument:
+class method, and specify the destination location as the `remote_url` argument:
```python
-# create a StorageManager instance
-manager = StorageManager()
+from clearml import StorageManager
-manager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.zip")
+StorageManager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.zip")
```
:::note
@@ -35,13 +34,13 @@ Zip and tar.gz files will be automatically extracted to cache. This can be contr
To download a file to a specific context in cache, specify the name of the context as the `cache_context` argument:
```python
-manager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.ext", cache_context="test")
+StorageManager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.ext", cache_context="test")
```
To download a non-compressed file, set the `extract_archive` argument to `False`.
```python
-manager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.ext", extract_archive=False)
+StorageManager.get_local_copy(remote_url="s3://MyBucket/MyFolder/file.ext", extract_archive=False)
```
By default, the `StorageManager` reports its download progress to the console every 5MB. You can change this using the
@@ -51,11 +50,11 @@ class method, and specifying the chunk size in MB (not supported for Azure and G
### Uploading a File
To upload a file to storage, call the [StorageManager.upload_file](../../references/sdk/storage.md#storagemanagerupload_file)
-method. Specify the full path of the local file as the `local_file` argument, and the remote URL as the `remote_url`
+class method. Specify the full path of the local file as the `local_file` argument, and the remote URL as the `remote_url`
argument.
```python
-manager.upload_file(
+StorageManager.upload_file(
local_file="/mnt/data/also_file.ext", remote_url="s3://MyBucket/MyFolder"
)
```
@@ -70,9 +69,9 @@ class method, and specifying the chunk size in MB (not supported for Azure and G
### Setting Cache Limits
To set a limit on the number of files cached, call the [StorageManager.set_cache_file_limit](../../references/sdk/storage.md#storagemanagerset_cache_file_limit)
-method and specify the `cache_file_limit` argument as the maximum number of files. This does not limit the cache size,
+class method and specify the `cache_file_limit` argument as the maximum number of files. This does not limit the cache size,
only the number of files.
```python
-new_cache_limit = manager.set_cache_file_limit(cache_file_limit=100)
+new_cache_limit = StorageManager.set_cache_file_limit(cache_file_limit=100)
```
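As a rough illustration of the semantics the last hunk documents (a cap on the *number* of cached files, never their total size), here is a toy count-based pruner in plain Python. This is an assumption-laden sketch of the behavior, not ClearML's implementation:

```python
import os
import tempfile


def prune_cache(cache_dir, cache_file_limit):
    """Keep at most `cache_file_limit` files in `cache_dir`,
    evicting the oldest (by mtime) first. Note: purely count-based;
    file sizes are never consulted."""
    files = sorted(
        (os.path.join(cache_dir, f) for f in os.listdir(cache_dir)),
        key=os.path.getmtime,
    )
    excess = max(0, len(files) - cache_file_limit)
    for path in files[:excess]:
        os.remove(path)
    return len(files) - excess  # number of files kept


# usage: fill a throwaway cache dir with 5 files, then cap it at 3
cache = tempfile.mkdtemp()
for i in range(5):
    with open(os.path.join(cache, f"f{i}.ext"), "w") as fh:
        fh.write("x")

kept = prune_cache(cache, cache_file_limit=3)
print(kept)  # 3
```

A large file and a tiny file count equally against the limit, which is exactly the distinction the doc change above is careful to draw.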