From 35fce918ba147d611fe578807b41a8117e467493 Mon Sep 17 00:00:00 2001 From: Martin Schroschk <martin.schroschk@tu-dresden.de> Date: Fri, 25 Jun 2021 11:42:35 +0200 Subject: [PATCH] SCS5Software: Fix checks --- .../docs/software/SCS5Software.md | 122 ++++++++---------- 1 file changed, 57 insertions(+), 65 deletions(-) diff --git a/doc.zih.tu-dresden.de/docs/software/SCS5Software.md b/doc.zih.tu-dresden.de/docs/software/SCS5Software.md index 6d6032084..c813a642e 100644 --- a/doc.zih.tu-dresden.de/docs/software/SCS5Software.md +++ b/doc.zih.tu-dresden.de/docs/software/SCS5Software.md @@ -13,23 +13,24 @@ Here are the major changes from the user's perspective: ## Host Keys -Due to the new operating system, the host keys of the login nodes have -also changed. If you have logged into tauruslogin6 before and still have -the old one saved in your `known_hosts` file, just remove it and accept -the new one after comparing its fingerprint with those listed under -[Login](Login#tableLogin2). +Due to the new operating system, the host keys of the login nodes have also changed. If you have +logged into tauruslogin6 before and still have the old one saved in your `known_hosts` file, just +remove it and accept the new one after comparing its fingerprint with those listed under +[Login](../access/Login.md#ssh-access). -## Using software modules +## Using Software Modules Starting with SCS5, we only provide -[Lmod](RuntimeEnvironment#Lmod:_An_Alternative_Module_Implementation) as -the environment module tool of choice. +[Lmod](../data_management/RuntimeEnvironment.md#lmod-an-alternative-module-implementation) as the +environment module tool of choice. As usual, you can get a list of the available software modules via: - module available - # or short: - ml av +```Bash +module available +# or short: +ml av +``` There is a special module that is always loaded (sticky) called **modenv**. It determines the module environment you can see. @@ -47,48 +48,45 @@ still work under SCS5. 
That's why those modenv versions are hidden. Example:

-    $ ml modenv/classic ansys/19.0
+```Bash
+$ ml modenv/classic ansys/19.0

-    The following have been reloaded with a version change:
-      1) modenv/scs5 => modenv/classic
+The following have been reloaded with a version change:
+  1) modenv/scs5 => modenv/classic

-    Module ansys/19.0 loaded.
+Module ansys/19.0 loaded.
+```

 **modenv/scs5** will be loaded by default and contains all the
 software that was built especially for SCS5.

 ### Which modules should I use?

-If possible, please use the modules from **modenv/scs5**. In case there
-is a certain software missing, you can write an email to
-<hpcsupport@zih.tu-dresden.de> and we will try to install the latest
-version of this particular software for you.
+If possible, please use the modules from **modenv/scs5**. If a certain software package is
+missing, you can write an [email to hpcsupport](mailto:hpcsupport@zih.tu-dresden.de) and we will
+try to install the latest version of this particular software for you.

-However, if you still need *older* versions of some software, you have
-to resort to using the modules in the old module environment
-(**modenv/classic** most probably). We won't keep those around forever
-though, so in the long-term, it is advisable to migrate your workflow to
-up-to-date versions of the software used.
+However, if you still need *older* versions of some software, you have to resort to using the
+modules in the old module environment (most probably **modenv/classic**). We won't keep those
+around forever, though, so in the long term it is advisable to migrate your workflow to
+up-to-date versions of the software used.
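The workflow described above can be sketched as follows. This is an illustrative example: the `ansys/19.0` module is taken from the text above, and other names or versions may differ on the system.

```Bash
# Show which module environments exist (the sticky modenv/* modules)
ml av modenv

# Check whether a package is available across the module environments
ml spider ansys

# If it is only present in the old environment, switch over and load it
ml modenv/classic ansys/19.0

# Switch back to the default SCS5 environment when done
ml modenv/scs5
```

`ml spider` searches all module trees, so it is a convenient way to find out which modenv a package lives in before switching.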
### Compilers, MPI-Libraries and Toolchains

-Since we are mainly using EasyBuild to install software now, we are
-following their toolchain schemes:
-<http://easybuild.readthedocs.io/en/latest/Common-toolchains.html>
+Since we are mainly using EasyBuild to install software now, we are following their
+[toolchain schemes](http://easybuild.readthedocs.io/en/latest/Common-toolchains.html).

-We mostly install software using the "intel" toolchain, because in most
-cases, the resulting code performs best on our Intel-based
-architectures. There are alternatives like GCC (foss), PGI or Clang/LLVM
-though.
+We mostly install software using the "intel" toolchain, because in most cases, the resulting code
+performs best on our Intel-based architectures. There are alternatives like GCC (foss), PGI or
+Clang/LLVM though.

-Generally speaking, the toolchains in this new environment are separated
-into more parts (modules) than you will be used to, coming from
-modenv/classic. A full toolchain, like "intel", "foss" or "iomkl"
-consists of several sub-modules making up the layers of
+Generally speaking, the toolchains in this new environment are separated into more parts (modules)
+than you will be used to when coming from modenv/classic. A full toolchain, like "intel", "foss" or
+"iomkl", consists of several sub-modules making up the layers of

-- compilers
-- MPI library
-- math library (providing BLAS/LAPACK/FFT routines etc.)
+- compilers
+- MPI library
+- math library (providing BLAS/LAPACK/FFT routines etc.)
For instance, the "intel" toolchain has the following structure: @@ -108,8 +106,8 @@ On the other hand, the "foss" toolchain looks like this: | mpi library | OpenMPI | | math libraries | OpenBLAS, FFTW | -If you want to combine the Intel compilers and MKL with OpenMPI, you'd -have to use the "iomkl" toolchain: +If you want to combine the Intel compilers and MKL with OpenMPI, you'd have to use the "iomkl" +toolchain: | | | |--------------|------------| @@ -118,40 +116,34 @@ have to use the "iomkl" toolchain: | mpi library | OpenMPI | | math library | imkl | -There are also subtoolchains that skip a layer or two, e.g. "iccifort" -only consists of the respective compilers, same as "GCC". Then there is -"iompi" that includes Intel compilers+OpenMPI but no math library, etc. +There are also subtoolchains that skip a layer or two, e.g. "iccifort" only consists of the +respective compilers, same as "GCC". Then there is "iompi" that includes Intel compilers+OpenMPI but +no math library, etc. #### What is this "GCCcore" I keep seeing and how does it relate to "GCC"? -GCCcore includes only the compilers/standard libraries of the GNU -compiler collection but without "binutils". It is used as a dependency -for many modules without getting in the way, e.g. the Intel compilers -also rely on libstdc++ from GCC, but you don't want to load two compiler -modules at the same time, so "intel" also depends on "GCCcore". You can -think of it as more of a runtime dependency rather than a full-fledged -compiler toolchain. If you want to compile your own code with the GNU -compilers, you have to load the module: "**GCC"** instead, "GCCcore" -won't be enough. - -There are [ongoing -discussions](https://github.com/easybuilders/easybuild-easyconfigs/issues/6366) -in the EasyBuild community to maybe change this in the future in order -to avoid the potential confusion this GCCcore module brings with it. 
+GCCcore includes only the compilers/standard libraries of the GNU compiler collection but without
+"binutils". It is used as a dependency for many modules without getting in the way, e.g. the Intel
+compilers also rely on libstdc++ from GCC, but you don't want to load two compiler modules at the
+same time, so "intel" also depends on "GCCcore". You can think of it as more of a runtime dependency
+than a full-fledged compiler toolchain. If you want to compile your own code with the GNU
+compilers, you have to load the module **GCC** instead; "GCCcore" won't be enough.
+
+There are [ongoing discussions](https://github.com/easybuilders/easybuild-easyconfigs/issues/6366)
+in the EasyBuild community to change this in the future in order to avoid the potential
+confusion this GCCcore module brings with it.

 #### I have been using "bullxmpi" so far, where can I find it?

-bullxmpi was more or less a rebranded OpenMPI 1.6 with some additions
-from Bull. It is not supported anymore and Bull has abandoned it in
-favor of a standard OpenMPI 2.0.2 build as their default in SCS5. You
-should migrate your code to our OpenMPI module or maybe even try Intel
-MPI instead.
+bullxmpi was more or less a rebranded OpenMPI 1.6 with some additions from Bull. It is not supported
+anymore and Bull has abandoned it in favor of a standard OpenMPI 2.0.2 build as their default in
+SCS5. You should migrate your code to our OpenMPI module or maybe even try Intel MPI instead.

 #### Where have the analysis tools from Intel Parallel Studio XE gone?

-Since "intel" is only a toolchain module now, it does not include the
-entire Parallel Studio anymore. Tools like the Intel Advisor, Inspector,
-Trace Analyzer or VTune Amplifier are available as separate modules now:
+Since "intel" is only a toolchain module now, it does not include the entire Parallel Studio
+anymore. 
Tools like the Intel Advisor, Inspector, Trace Analyzer or VTune Amplifier are available as +separate modules now: | product | module | |:----------------------|:----------| -- GitLab
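As a sketch of working with the split toolchain and tool modules described above, the following session loads a toolchain and one separate analysis tool. The module names and versions are illustrative assumptions and should be checked with `ml av` on the system first.

```Bash
# Load the full Intel toolchain; Lmod pulls in the compiler, MPI and
# math-library sub-modules (e.g. iccifort, impi, imkl) automatically
ml intel

# Compile an MPI program with the toolchain's compiler wrapper
mpicc -O2 -o hello hello.c

# The analysis tools now live in their own modules, loaded separately
# (module name is an assumption, verify with "ml av" or "ml spider")
ml VTune
```

Loading a sub-toolchain such as `iccifort` instead of `intel` gives you just the compilers without MPI or MKL, mirroring the layering shown in the tables above.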