Commit 07d19e98 authored by Martin Schroschk

Merge branch 'merge-main-preview' into 'preview'

Merge main preview

See merge request !898
parents 83f5884a f636903b
@@ -57,7 +57,7 @@ These `/data/horse` and `/data/walrus` can be accessed via workspaces. Please ref
!!! Warning
All old filesystems will be shut down by the end of 2023.
To work with your data from Taurus, you might have to move/copy it to the new storages.
For this, we have four new [datamover nodes](/data_transfer/datamover) that have mounted all storages
@@ -110,7 +110,7 @@ of the old and new system. (Do not use the datamovers from Taurus!)
??? "Migration from `/lustre/ssd` or `/beegfs`"
**You** are entirely responsible for the transfer of these data to the new location.
Start the dtrsync process as soon as possible. (And maybe repeat it at a later time.)
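A minimal sketch of such a transfer, assuming the old filesystem is reachable under `/data/old/` on the datamover nodes (as in the `/lustre/scratch2` example below); the source prefix `/data/old/lustre/ssd` and the target workspace on `/data/horse` are hypothetical and must be adapted to your own paths:

```bash
# Hypothetical paths: replace "0/my-workspace" with your actual workspace and
# point the target at a workspace you have already created on /data/horse.
dtrsync -a /data/old/lustre/ssd/ws/0/my-workspace/ /data/horse/ws/0/my-workspace/
# Repeat the same command later to pick up files that changed in the meantime.
```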
??? "Migration from `/lustre/scratch2` aka `/scratch`"
@@ -120,7 +120,7 @@ of the old and new system. (Do not use the datamovers from Taurus!)
to `/data/walrus/warm_archive/ws`.
In case you need to update this data (gigabytes, not terabytes!), please run `dtrsync` as in
`dtrsync -a /data/old/lustre/scratch2/ws/0/my-workspace/newest/ /data/horse/lustre/scratch2/ws/0/my-workspace/newest/`
??? "Migration from `/warm_archive`"
@@ -28,7 +28,7 @@ or setting the option as argument, in case you invoke `mpirun` directly
mpirun --mca io ^ompio ...
```
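As an alternative to passing the flag on every invocation, Open MPI also reads MCA parameters from the environment via its standard `OMPI_MCA_<name>` convention; a minimal sketch, not taken from the original page:

```bash
# Equivalent to "--mca io ^ompio": Open MPI picks up MCA parameters
# from OMPI_MCA_<name> environment variables.
export OMPI_MCA_io=^ompio
mpirun ...
```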
## Mpirun on partition `alpha` and `ml`
Using `mpirun` on the partitions `alpha` and `ml` leads to a wrong resource distribution when more than
one node is involved. This yields a strange task distribution such as `SLURM_NTASKS_PER_NODE=15,1`
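To see whether a job is affected, you can inspect the task layout from inside the job; a small check, not part of the original documentation, assuming a standard Slurm environment:

```bash
# Print the per-node task counts Slurm planned for this job.
# An even layout (e.g. "8(x2)") is expected; an uneven one like "15,1"
# matches the mis-distribution described above.
echo "$SLURM_NTASKS_PER_NODE"
# Count how many tasks actually land on each node.
srun hostname | sort | uniq -c
```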