Commit 9e53b637 authored by Jan Frenzel

Added dtrclone command in object_storage.md.

parent df340218
2 merge requests: !839 Automated merge from preview to main, !819 Added page about object storage (S3)
## Copying Data from/to Object Storage
The following commands show how to create a bucket `mystorage` in your part of the object store:
```console
marie@login$ module load rclone
marie@login$ rclone mkdir s3store:mystorage
```
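To verify that the bucket exists, you can list the top-level buckets of the remote. This is a sketch using rclone's standard `lsd` subcommand; the remote name `s3store` is taken from the configuration above:

```console
marie@login$ rclone lsd s3store:
```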
After these commands, you can copy a file `largedata.tar.gz` to it in a separate job with the help
of the [Datamover](datamover.md). Adjust the parameters `time` and `account` as required:
```console
marie@login$ dtrclone --time=0:10:00 --account=p_number_crunch copy --s3-acl "public-read" largedata.tar.gz s3store:mystorage
```
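Once the Datamover job has finished, you can check that the archive arrived in the bucket, for example with rclone's standard `ls` subcommand (shown here with the example names from above):

```console
marie@login$ rclone ls s3store:mystorage
```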
!!! warning "Restricted access"
    If you want to restrict access to your data, replace the last command with:
    ```console
    marie@login$ dtrclone --time=0:10:00 --account=p_number_crunch copy largedata.tar.gz s3store:mystorage
    ```
    Then, it is not possible to access your data without providing your credentials.
For small files, you can also directly copy data:
```console
marie@login$ module load rclone
marie@login$ rclone copy --s3-acl "public-read" largedata.tar.gz s3store:mystorage
```
## Accessing the Object Storage
The following commands show different possibilities to access a file from object storage.
### Copying a File from Object Storage to ZIH systems
```console
marie@login$ dtrclone --time=0:10:00 --account=p_number_crunch copy s3store:mystorage/largedata.tar.gz .
```
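To check that the downloaded file is intact, rclone can print the checksum stored with the object for comparison. A sketch using rclone's standard `md5sum` subcommand with the example names from above:

```console
marie@login$ rclone md5sum s3store:mystorage/largedata.tar.gz
```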
### Copying a File from Object Storage to Your Workstation