Small edits (#784)

This commit is contained in:
pollfly 2024-02-26 17:24:58 +02:00 committed by GitHub
parent 91fcaa2f24
commit 90f2affe91
5 changed files with 12 additions and 12 deletions

@@ -103,9 +103,9 @@ clearml-data remove [-h] [--id ID] [--files [FILES [FILES ...]]]
 ## upload
-Upload the local dataset changes to the server. By default, it's uploaded to the [ClearML Server](../deploying_clearml/clearml_server.md). You can specify a different storage
+Upload the local dataset changes to the server. By default, it's uploaded to the ClearML file server. You can specify a different storage
 medium by entering an upload destination. For example:
-* A shared folder: `:/mnt/shared/folder`
+* A shared folder: `/mnt/shared/folder`
 * S3: `s3://bucket/folder`
 * Non-AWS S3-like services (e.g. MinIO): `s3://host_addr:port/bucket`
 * Google Cloud Storage: `gs://bucket-name/folder`
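The destination prefixes in the list above determine which storage backend ClearML writes to. As an illustrative sketch (a plain helper, not part of the ClearML SDK or CLI), the scheme can be classified like this:

```python
def storage_backend(url: str) -> str:
    """Classify an upload destination by its URL scheme.

    Illustrative helper only -- not part of the ClearML SDK.
    """
    if url.startswith("s3://"):
        return "s3"          # AWS S3 or S3-compatible services such as MinIO
    if url.startswith("gs://"):
        return "gcs"         # Google Cloud Storage
    if url.startswith("azure://"):
        return "azure"       # Azure Blob Storage
    return "filesystem"      # local or shared folder, e.g. /mnt/shared/folder

print(storage_backend("s3://host_addr:port/bucket"))  # prints "s3"
```

Note that an S3-compatible endpoint such as MinIO still uses the `s3://` scheme; only the host portion differs.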

@@ -69,8 +69,8 @@ Use the `output_uri` parameter to specify a network storage target to upload the
 * Google Cloud Storage: `gs://bucket-name/folder`
 * Azure Storage: `azure://<account name>.blob.core.windows.net/path/to/file`
-By default, the dataset uploads to ClearML's file server. The `output_uri` parameter of the [`Dataset.upload`](#uploading-files)
-method overrides this parameter's value.
+By default, the dataset uploads to ClearML's file server. The `output_uri` parameter of [`Dataset.upload()`](#uploading-files)
+overrides this parameter's value.
 The created dataset inherits the content of the `parent_datasets`. When multiple dataset parents are listed,
 they are merged in order of specification. Each parent overrides any overlapping files from a previous parent dataset.
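The merge-order rule described above (later parents override overlapping files from earlier parents) can be pictured with plain dicts mapping file paths to content versions. This is only an illustration of the rule, not the ClearML implementation:

```python
# Each "parent dataset" is sketched as a mapping of file path -> content version.
parent_a = {"data/train.csv": "a", "data/labels.csv": "a"}
parent_b = {"data/train.csv": "b"}  # overlaps with parent_a on train.csv

merged = {}
for parent in (parent_a, parent_b):  # merged in order of specification
    merged.update(parent)            # later parent overrides overlapping files

print(merged)  # {'data/train.csv': 'b', 'data/labels.csv': 'a'}
```

`parent_b`'s copy of `train.csv` wins because it is listed later, while `labels.csv` survives from `parent_a`.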
@@ -98,8 +98,8 @@ squashed_dataset_2 = Dataset.squash(
 )
 ```
-In addition, the target storage location for the squashed dataset can be specified using the `output_uri` parameter of the
-[`Dataset.squash`](../references/sdk/dataset.md#datasetsquash) method.
+In addition, the target storage location for the squashed dataset can be specified using the `output_uri` parameter of
+[`Dataset.squash()`](../references/sdk/dataset.md#datasetsquash).
 ## Accessing Datasets
 Once a dataset has been created and uploaded to a server, the dataset can be accessed programmatically from anywhere.

@@ -62,7 +62,7 @@ Upload the dataset:
 dataset.upload()
 ```
-By default, the dataset is uploaded to the ClearML File Server. The dataset's destination can be changed by specifying the
+By default, the dataset is uploaded to the ClearML file server. The dataset's destination can be changed by specifying the
 target storage with the `output_url` parameter of the [`upload`](../../references/sdk/dataset.md#upload) method.
 ### Finalizing the Dataset

@@ -685,7 +685,7 @@ task = Task.init(project_name, task_name, output_uri="s3://bucket-name/folder")
 task = Task.init(project_name, task_name, output_uri="gs://bucket-name/folder")
 ```
-To use Cloud storage with ClearML, configure the storage credentials in your `~/clearml.conf`. For detailed information,
+To use cloud storage with ClearML, configure the storage credentials in your `~/clearml.conf`. For detailed information,
 see [ClearML Configuration Reference](configs/clearml_conf.md).
 <a id="pycharm-remote-debug-detect-git"></a>
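For context, a minimal sketch of the `~/clearml.conf` section that holds S3 credentials might look as follows (field names follow the ClearML configuration reference; the values are placeholders, not working credentials):

```
sdk {
    aws {
        s3 {
            # placeholder credentials -- replace with your own
            key: "AWS_ACCESS_KEY"
            secret: "AWS_SECRET_KEY"
            region: "us-east-1"
        }
    }
}
```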

@@ -18,7 +18,7 @@ class. The storage examples include:
 ### Downloading a File
-To download a ZIP file from storage to the `global` cache context, call the [StorageManager.get_local_copy](../../references/sdk/storage.md#storagemanagerget_local_copy)
+To download a ZIP file from storage to the `global` cache context, call the [`StorageManager.get_local_copy`](../../references/sdk/storage.md#storagemanagerget_local_copy)
 class method, and specify the destination location as the `remote_url` argument:
 ```python
@@ -49,7 +49,7 @@ class method, and specifying the chunk size in MB (not supported for Azure and G
 ### Uploading a File
-To upload a file to storage, call the [StorageManager.upload_file](../../references/sdk/storage.md#storagemanagerupload_file)
+To upload a file to storage, call the [`StorageManager.upload_file`](../../references/sdk/storage.md#storagemanagerupload_file)
 class method. Specify the full path of the local file as the `local_file` argument, and the remote URL as the `remote_url`
 argument.
@@ -59,7 +59,7 @@ StorageManager.upload_file(
 )
 ```
-Use the `retries parameter` to set the number of times file upload should be retried in case of failure.
+Use the `retries` parameter to set the number of times file upload should be retried in case of failure.
 By default, the `StorageManager` reports its upload progress to the console every 5MB. You can change this using the
 [`StorageManager.set_report_upload_chunk_size`](../../references/sdk/storage.md#storagemanagerset_report_upload_chunk_size)
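The retry-on-failure behavior described above can be pictured as a simple retry loop. This is a conceptual sketch, not ClearML's actual implementation; `upload_with_retries` and `flaky_upload` are made-up names:

```python
def upload_with_retries(upload, retries=3):
    """Attempt `upload()` up to `retries` times; return True on first success."""
    for _ in range(retries):
        try:
            upload()
            return True
        except OSError:
            continue  # transient failure -- try again
    return False

attempts = []
def flaky_upload():
    attempts.append(1)
    if len(attempts) < 3:  # fail twice, then succeed on the third attempt
        raise OSError("transient network failure")

ok = upload_with_retries(flaky_upload, retries=3)
print(ok, len(attempts))  # True 3
```

With `retries=3`, the two transient failures are absorbed and the third attempt succeeds; with `retries=2`, the same sequence would return `False`.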
@@ -68,7 +68,7 @@ class method, and specifying the chunk size in MB (not supported for Azure and G
 ### Setting Cache Limits
-To set a limit on the number of files cached, call the [StorageManager.set_cache_file_limit](../../references/sdk/storage.md#storagemanagerset_cache_file_limit)
+To set a limit on the number of files cached, call the [`StorageManager.set_cache_file_limit`](../../references/sdk/storage.md#storagemanagerset_cache_file_limit)
 class method and specify the `cache_file_limit` argument as the maximum number of files. This does not limit the cache size,
 only the number of files.
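A count-based limit (as opposed to a size-based one) can be illustrated with a small cache that evicts the oldest entry once the file count is exceeded. This sketch is not ClearML's internal cache, only an illustration of the eviction rule:

```python
from collections import OrderedDict

class FileCountCache:
    """Keep at most `limit` files; evict the oldest when over the limit."""

    def __init__(self, limit: int):
        self.limit = limit
        self.files = OrderedDict()

    def add(self, path: str) -> None:
        self.files[path] = True
        self.files.move_to_end(path)
        while len(self.files) > self.limit:  # count-based, not size-based
            self.files.popitem(last=False)   # drop the oldest cached file

cache = FileCountCache(limit=2)
for name in ("a.zip", "b.zip", "c.zip"):
    cache.add(name)

print(list(cache.files))  # ['b.zip', 'c.zip']
```

Note that a single very large file still counts as one entry, which is exactly why a file-count limit does not bound the total cache size.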