
Add RAM speeds for each partition

Merged Maximilian Sander requested to merge `Add_RAM_Speed_Information` into `preview`
All threads resolved. 1 file changed: +5 −5
@@ -110,7 +110,7 @@ CPUs.
 - 630 nodes, each with
 - 2 x Intel Xeon Platinum 8470 (52 cores) @ 2.00 GHz, Multithreading enabled
-- 512 GB RAM
+- 512 GB RAM (8 x 32 GB DDR5-4800 MT/s per socket)
 - 12 nodes provide 1.8 TB local storage on NVMe device at `/tmp`
 - All other nodes are diskless and have no or very limited local storage (i.e. `/tmp`)
 - Login nodes: `login[1-4].barnard.hpc.tu-dresden.de`
@@ -126,7 +126,7 @@ and is designed for AI and ML tasks.
 - 34 nodes, each with
 - 8 x NVIDIA A100-SXM4 Tensor Core-GPUs
 - 2 x AMD EPYC CPU 7352 (24 cores) @ 2.3 GHz, Multithreading available
-- 1 TB RAM
+- 1 TB RAM (16 x 32 GB DDR4-2933 MT/s per socket)
 - 3.5 TB local memory on NVMe device at `/tmp`
 - Login nodes: `login[1-2].alpha.hpc.tu-dresden.de`
 - Hostnames: `i[8001-8037].alpha.hpc.tu-dresden.de`
@@ -139,7 +139,7 @@ The cluster `Romeo` is a general purpose cluster by NEC based on AMD Rome CPUs.
 - 192 nodes, each with
 - 2 x AMD EPYC CPU 7702 (64 cores) @ 2.0 GHz, Multithreading available
-- 512 GB RAM
+- 512 GB RAM (8 x 32 GB DDR4-3200 MT/s per socket)
 - 200 GB local memory on SSD at `/tmp`
 - Login nodes: `login[1-2].romeo.hpc.tu-dresden.de`
 - Hostnames: `i[7001-7190].romeo.hpc.tu-dresden.de`
@@ -153,7 +153,7 @@ architecture.
 - 1 node, with
 - 32 x Intel(R) Xeon(R) Platinum 8276M CPU @ 2.20 GHz (28 cores)
-- 47 TB RAM
+- 47 TB RAM (12 x 128 GB DDR4-2933 MT/s per socket)
 - Configured as one single node
 - 48 TB RAM (usable: 47 TB - one TB is used for cache coherence protocols)
 - 370 TB of fast NVME storage available at `/nvme/<projectname>`
@@ -168,7 +168,7 @@ The cluster `Power9` by IBM is based on Power9 CPUs and provides NVIDIA V100 GPU
 - 32 nodes, each with
 - 2 x IBM Power9 CPU (2.80 GHz, 3.10 GHz boost, 22 cores)
-- 256 GB RAM DDR4 2666 MHz
+- 256 GB RAM (8 x 16 GB DDR4-2666 MT/s per socket)
 - 6 x NVIDIA VOLTA V100 with 32 GB HBM2
 - NVLINK bandwidth 150 GB/s between GPUs and host
 - Login nodes: `login[1-2].power9.hpc.tu-dresden.de`
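As a quick sanity check of the new lines (an illustrative sketch, not part of the MR), the per-socket DIMM populations should multiply out to the totals already stated in the docs. Socket counts below are read off the CPU lines of each cluster description; for Julia the raw 48 TB figure is used, since 47 TB is the usable amount after cache-coherence overhead:

```python
# Verify: sockets x DIMMs/socket x GB/DIMM == stated node RAM in GB.
def total_gb(sockets: int, dimms_per_socket: int, dimm_gb: int) -> int:
    """Total node RAM in GB implied by the DIMM population."""
    return sockets * dimms_per_socket * dimm_gb

clusters = {
    #           (sockets, DIMMs/socket, GB/DIMM, stated total in GB)
    "Barnard": (2, 8, 32, 512),           # DDR5-4800
    "Alpha":   (2, 16, 32, 1024),         # DDR4-2933, 1 TB
    "Romeo":   (2, 8, 32, 512),           # DDR4-3200
    "Julia":   (32, 12, 128, 48 * 1024),  # DDR4-2933, 48 TB raw (47 TB usable)
    "Power9":  (2, 8, 16, 256),           # DDR4-2666
}

for name, (sockets, dimms, gb, stated) in clusters.items():
    assert total_gb(sockets, dimms, gb) == stated, name
    print(f"{name}: {total_gb(sockets, dimms, gb)} GB matches docs")
```

Every configuration in the diff is consistent with the RAM totals the documentation already lists.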