diff --git a/doc.zih.tu-dresden.de/docs/access/jupyterhub.md b/doc.zih.tu-dresden.de/docs/access/jupyterhub.md index f9a916195ecbf814cf426beb4d26885500b3b3de..9f2e215b18bee65af4f832176e64bc9f6d4ea8e1 100644 --- a/doc.zih.tu-dresden.de/docs/access/jupyterhub.md +++ b/doc.zih.tu-dresden.de/docs/access/jupyterhub.md @@ -1,8 +1,8 @@ # JupyterHub -With our JupyterHub service we offer you a quick and easy way to work with Jupyter notebooks on ZIH -systems. This page covers starting and stopping JupyterHub sessions, error handling and customizing -the environment. +With our JupyterHub service we offer you a quick and easy way to work with +Jupyter notebooks on ZIH systems. This page covers starting and stopping +JupyterHub sessions, error handling and customizing the environment. We also provide a comprehensive documentation on how to use [JupyterHub for Teaching (git-pull feature, quickstart links, direct links to notebook files)](jupyterhub_for_teaching.md). @@ -13,16 +13,17 @@ We also provide a comprehensive documentation on how to use The JupyterHub service is provided *as-is*, use at your own discretion. -Please understand that JupyterHub is a complex software system of which we are not the developers -and don't have any downstream support contracts for, so we merely offer an installation of it but -cannot give extensive support in every case. +Please understand that JupyterHub is a complex software system of which we are +not the developers and don't have any downstream support contracts for, so we +merely offer an installation of it but cannot give extensive support in every +case. ## Access !!! note This service is only available for users with an active HPC project. - See [Application for Login and Resources](../application/overview.md), if you need to apply for - an HPC project. + See [Application for Login and Resources](../application/overview.md), if + you need to apply for an HPC project. JupyterHub is available at [https://taurus.hrsk.tu-dresden.de/jupyter](https://taurus.hrsk.tu-dresden.de/jupyter). @@ -60,7 +61,8 @@ presets in text files. ## Applications -You can choose between JupyterLab or classic Jupyter notebooks as outlined in the following. +You can choose between JupyterLab or classic Jupyter notebooks as outlined in +the following. ### JupyterLab @@ -80,30 +82,50 @@ several views: ### Classic Jupyter Notebook -Initially your `home` directory is listed. You can open existing notebooks or files by navigating to -the corresponding path and clicking on them. +Initially your `home` directory is listed. You can open existing notebooks or +files by navigating to the corresponding path and clicking on them.  {: align="center"} -Above the table on the right side is the button `New` which lets you create new notebooks, files, -directories or terminals. +Above the table on the right side is the button `New` which lets you create new +notebooks, files, directories or terminals. - + {: align="center"} ## Jupyter Notebooks in General -In JupyterHub you can create scripts in notebooks. Notebooks are programs which are split into -multiple logical code blocks. In between those code blocks you can insert text blocks for -documentation and each block can be executed individually. Each notebook is paired with a kernel -running the code. We currently offer one for Python, C++, MATLAB and R. +In JupyterHub you can create scripts in notebooks. Notebooks are programs which +are split into multiple logical code blocks. 
In between those code blocks you +can insert text blocks for documentation and each block can be executed +individually. Each notebook is paired with a kernel running the code. We +currently offer one for Python, C++, MATLAB and R. + +### Version Control of Jupyter Notebooks with Git + +Since Jupyter notebooks are files containing multiple blocks for input code, +documentation, output and further information, it is difficult to use them with +Git. Version tracking of the `.ipynb` notebook files can be improved with the +[Jupytext plugin](https://jupytext.readthedocs.io/en/latest/). Jupytext will +provide Markdown (`.md`) and Python (`.py`) conversions of notebooks on the fly, +next to `.ipynb`. Tracking these files will then provide a cleaner git history. +A further advantage is that Python notebook versions can be imported, allowing +to split larger notebooks into smaller ones, based on chained imports. + +!!! note + The Jupytext plugin is not installed on the ZIH system at the moment. + Currently it can be [installed](https://jupytext.readthedocs.io/en/latest/install.html) + by the users with `--user` parameter. + Therefore `ipynb` files need to be made available in a repository for shared + usage within the ZIH system. ## Stop a Session -It is good practice to stop your session once your work is done. This releases resources for other -users and your quota is less charged. If you just log out or close the window, your server continues -running and **will not stop** until the Slurm job runtime hits the limit (usually 8 hours). +It is good practice to stop your session once your work is done. This releases +resources for other users and your quota is less charged. If you just log out or +close the window, your server continues running and **will not stop** until the +Slurm job runtime hits the limit (usually 8 hours). At first you have to open the JupyterHub control panel. @@ -120,7 +142,8 @@ of your screen.  {: align="center"} -Now you are back on the JupyterHub page and you can stop your server by clicking on +Now you are back on the JupyterHub page and you can stop your server by clicking +on  {: align="center"} @@ -146,10 +169,11 @@ Useful pages for valid batch system parameters:  {: align="center"} -If the connection to your notebook server unexpectedly breaks, you will get this error message. -Sometimes your notebook server might hit a batch system or hardware limit and gets killed. Then -usually the log file of the corresponding batch job might contain useful information. These log -files are located in your `home` directory and have the name `jupyter-session-<jobid>.log`. +If the connection to your notebook server unexpectedly breaks, you will get this +error message. Sometimes your notebook server might hit a batch system or +hardware limit and gets killed. Then usually the log file of the corresponding +batch job might contain useful information. These log files are located in your +`home` directory and have the name `jupyter-session-<jobid>.log`. ## Advanced Tips @@ -163,8 +187,9 @@ exact standard environment through the spawner form:  {: align="center"} -This list shows all packages of the currently selected conda environment. This depends on your -settings for partition (CPU architecture) and standard environment. +This list shows all packages of the currently selected conda environment. This +depends on your settings for partition (CPU architecture) and standard +environment. 
There are three standard environments: @@ -172,8 +197,9 @@ There are three standard environments: - test - python-env-python3.8.6 -**Python-env-python3.8.6** virtual environment can be used for all x86 partitions(`gpu2`, `alpha`, -etc). It gives the opportunity to create a user kernel with the help of a Python environment. +**Python-env-python3.8.6** virtual environment can be used for all x86 +partitions(`gpu2`, `alpha`, etc). It gives the opportunity to create a user +kernel with the help of a Python environment. Here is a short list of some included software: @@ -185,8 +211,8 @@ Here is a short list of some included software: | PyTorch | 1.3.1 | 1.3.1 | | TensorFlow | 2.1.1 | 2.1.1 | | Keras | 2.3.1 | 2.3.1 | -| numpy | 1.17.5 | 1.17.4 | -| matplotlib | 3.3.1 | 3.0.3 | +| NumPy | 1.17.5 | 1.17.4 | +| Matplotlib | 3.3.1 | 3.0.3 | \* generic = all partitions except ml @@ -196,16 +222,17 @@ Here is a short list of some included software: !!! info - Interactive code interpreters which are used by Jupyter notebooks are called *kernels*. Creating - and using your own kernel has the benefit that you can install your own preferred Python - packages and use them in your notebooks. + Interactive code interpreters which are used by Jupyter notebooks are called + *kernels*. Creating and using your own kernel has the benefit that you can + install your own preferred Python packages and use them in your notebooks. We currently have two different architectures at ZIH systems. Build your kernel environment on the **same architecture** that you want to use later on with the kernel. In the examples below we use the name "my-kernel" for our user kernel. We recommend to prefix your kernels with keywords like `haswell`, `ml`, `romeo`, `venv`, `conda`. This way you -can later recognize easier how you built the kernel and on which hardware it will work. +can later recognize easier how you built the kernel and on which hardware it +will work. **Intel nodes** (e.g. partition `haswell`, `gpu2`): @@ -219,11 +246,12 @@ maria@login$ srun --pty --ntasks=1 --cpus-per-task=2 --mem-per-cpu=2541 --time=0 maria@login$ srun --pty --partition=ml --ntasks=1 --cpus-per-task=2 --mem-per-cpu=1443 --time=08:00:00 bash -l ``` -Create a virtual environment in your `home` directory. You can decide between Python virtualenvs or -conda environments. +Create a virtual environment in your `home` directory. You can decide between +Python virtualenv or conda environment. !!! note - Please take in mind that Python venv is the preferred way to create a Python virtual environment. + Please take in mind that Python venv is the preferred way to create a Python + virtual environment. #### Python Virtualenv @@ -261,7 +289,8 @@ marie@compute$ module load Anaconda3 marie@ml$ module load PythonAnaconda ``` -Continue with environment creation, package installation and kernel registration: +Continue with environment creation, package installation and kernel +registration: ```console marie@compute$ mkdir user-kernel # please use workspaces! @@ -303,11 +332,13 @@ You can switch kernels of existing notebooks in the kernel menu: {: align="center"} !!! note - Both python venv and conda virtual environments will be mention in the same list. + Both python venv and conda virtual environments will be mention in the same + list. ### Loading Modules You have now the option to preload modules from the [module system](../software/modules.md). -Select multiple modules that will be preloaded before your notebook server starts. 
The list of -available modules depends on the module environment you want to start the session in (`scs5` or -`ml`). The right module environment will be chosen by your selected partition. +Select multiple modules that will be preloaded before your notebook server +starts. The list of available modules depends on the module environment you want +to start the session in (`scs5` or `ml`). The right module environment will be +chosen by your selected partition. diff --git a/doc.zih.tu-dresden.de/docs/access/jupyterhub_for_teaching.md b/doc.zih.tu-dresden.de/docs/access/jupyterhub_for_teaching.md index 797d9fc8e455b14e40a5ec7f3737874b2ac500ae..1454db0082acceb8933b6fc4964535e345a1a83d 100644 --- a/doc.zih.tu-dresden.de/docs/access/jupyterhub_for_teaching.md +++ b/doc.zih.tu-dresden.de/docs/access/jupyterhub_for_teaching.md @@ -1,7 +1,7 @@ # JupyterHub for Teaching -On this page, we want to introduce to you some useful features if you want to use JupyterHub for -teaching. +On this page, we want to introduce to you some useful features if you want to +use JupyterHub for teaching. !!! note @@ -9,24 +9,28 @@ teaching. Please be aware of the following notes: -- ZIH systems operate at a lower availability level than your usual Enterprise Cloud VM. There can - always be downtimes, e.g. of the filesystems or the batch system. -- Scheduled downtimes are announced by email. Please plan your courses accordingly. -- Access to HPC resources is handled through projects. See your course as a project. Projects need - to be registered beforehand (more info on the page [Access](../application/overview.md)). +- ZIH systems operate at a lower availability level than your usual Enterprise +Cloud VM. There can always be downtimes, e.g. of the filesystems or the batch +system. +- Scheduled downtimes are announced by email. Please plan your courses +accordingly. +- Access to HPC resources is handled through projects. See your course as a +project. Projects need to be registered beforehand (more info on the page +[Access](../application/overview.md)). - Don't forget to [add your users](../application/project_management.md#manage-project-members-dis-enable) (e.g. students or tutors) to your project. - It might be a good idea to [request a reservation](../jobs_and_resources/overview.md#exclusive-reservation-of-hardware) - of part of the compute resources for your project/course to avoid unnecessary waiting times in - the batch system queue. + of part of the compute resources for your project/course to avoid unnecessary + waiting times in the batch system queue. ## Clone a Repository With a Link -This feature bases on [nbgitpuller](https://github.com/jupyterhub/nbgitpuller). Further information -can be found in the [external documentation about nbgitpuller](https://jupyterhub.github.io/nbgitpuller/). +This feature bases on [nbgitpuller](https://github.com/jupyterhub/nbgitpuller). +Further information can be found in the [external documentation about nbgitpuller](https://jupyterhub.github.io/nbgitpuller/). -This extension for Jupyter notebooks can clone every public git repository into the users work -directory. It's offering a quick way to distribute notebooks and other material to your students. +This extension for Jupyter notebooks can clone every public git repository into +the users work directory. It's offering a quick way to distribute notebooks and +other material to your students.  {: align="center"} @@ -61,8 +65,8 @@ The spawn form now offers a quick start mode by passing URL parameters. !!! 
example - The following link would create a jupyter notebook session on the `interactive` partition with the `test` - environment being loaded: + The following link would create a jupyter notebook session on the + `interactive` partition with the `test` environment being loaded: ``` https://taurus.hrsk.tu-dresden.de/jupyter/hub/spawn#/~(partition~'interactive~environment~'test) @@ -71,8 +75,8 @@ The spawn form now offers a quick start mode by passing URL parameters.  {: align="center"} -Every parameter of the advanced form can be set with this parameter. If the parameter is not -mentioned, the default value will be loaded. +Every parameter of the advanced form can be set with this parameter. If the +parameter is not mentioned, the default value will be loaded. | Parameter | Default Value | |:----------------|:-----------------------------------------| @@ -90,16 +94,16 @@ mentioned, the default value will be loaded. | `launch` | JupyterLab | | `workspace_scope` | *empty* (home directory) | -You can use the advanced form to generate a URL for the settings you want. The address bar contains -the encoded parameters starting with `#/`. +You can use the advanced form to generate a URL for the settings you want. The +address bar contains the encoded parameters starting with `#/`. ### Combination of Quickstart and Git-Pull Feature You can combine both features in a single link: -``` -https://taurus.hrsk.tu-dresden.de/jupyter/hub/user-redirect/git-pull?repo=https://github.com/jdwittenauer/ipython-notebooks&urlpath=/tree/ipython-notebooks/notebooks/language/Intro.ipynb#/~(partition~'interactive~environment~'test) -``` + ``` + https://taurus.hrsk.tu-dresden.de/jupyter/hub/user-redirect/git-pull?repo=https://github.com/jdwittenauer/ipython-notebooks&urlpath=/tree/ipython-notebooks/notebooks/language/Intro.ipynb#/~(partition~'interactive~environment~'test) + ```  {: align="center"} @@ -110,7 +114,7 @@ With the following link you will be redirected to a certain file in your home directory. [https://taurus.hrsk.tu-dresden.de/jupyter/user-redirect/notebooks/demo.ipynb] -(https://taurus.hrsk.tu-dresden.de/jupyter/user-redirect/notebooks/demo.ipynb) +(<https://taurus.hrsk.tu-dresden.de/jupyter/user-redirect/notebooks/demo.ipynb>) The file needs to exist, otherwise a 404 error will be thrown. @@ -119,3 +123,103 @@ The file needs to exist, otherwise a 404 error will be thrown. This link would redirect to `https://taurus.hrsk.tu-dresden.de/jupyter/user/{login}/notebooks/demo.ipynb`. + +## Create a Shared Python Environment + +To provide a consistent Python environment, you can create a shared [workspace](../data_lifecycle/workspaces.md) +and prepare a [Python virtual environment](../software/python_virtual_environments.md) +in it. Then use a custom Jupyter Kernel to use this environment in JupyterHub. +Please note the following: + +- Set the correct permissions to the workspace and all relevant subdirectories +and files via `chmod`. + +- Install all relevant Python packages in the shared Python virtual environment +(either pip or conda). Note that standard environments (as *production* or +*test*) are not available in that case. + +- Modules can also be loaded in the Jupyter spawner via preload modules +(considering the Python version of your virtual environment). + +Set up your shared Python virtual environment for JupyterHub: + +=== "virtualenv" + + ```console + marie@compute$ module load Python #Load default Python + [...] 
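+    # The workspace allocated below will hold the shared environment;
+    # the trailing "1" is the requested workspace duration in days (see the workspaces page)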
+ marie@compute$ ws_allocate -F scratch python_virtual_environment_teaching 1 + Info: creating workspace. + /scratch/ws/1/python_virtual_environment_teaching + [...] + marie@compute$ virtualenv --system-site-packages /scratch/ws/1/python_virtual_environment_teaching/env #Create virtual environment + [...] + marie@compute$ source /scratch/ws/1/python_virtual_environment_teaching/env/bin/activate #Activate virtual environment. Example output: (envtest) bash-4.2$ + marie@compute$ pip install ipykernel + Collecting ipykernel + [...] + Successfully installed ... ipykernel-5.1.0 ipython-7.5.0 ... + marie@compute$ pip install --upgrade pip + marie@compute$ python -m ipykernel install --user --name my-teaching-kernel --display-name="my teaching kernel" + Installed kernelspec my-teaching-kernel in .../.local/share/jupyter/kernels/my-teaching-kernel + marie@compute$ pip install [...] #Now install additional packages for your notebooks + marie@compute$ deactivate + marie@compute$ chmod g+rx /scratch/ws/1/python_virtual_environment_teaching -R #Make the environment accesible for others + + ``` + +=== "conda" + + ```console + marie@compute$ module load Anaconda3 #Load Anaconda + [...] + marie@compute$ ws_allocate -F scratch conda_virtual_environment_teaching 1 + Info: creating workspace. + /scratch/ws/1/conda_virtual_environment_teaching + [...] + marie@compute$ conda create --prefix /scratch/ws/1/conda_virtual_environment_teaching/conda-env python=3.8 #create virtual environment with Python version 3.8 + [...] + marie@compute$ conda activate /scratch/ws/1/conda_virtual_environment_teaching/conda-env #activate conda-env virtual environment + marie@compute$ conda install ipykernel + [...] + marie@compute$ python -m ipykernel install --user --name my-teaching-kernel --display-name="my teaching kernel" + Installed kernelspec my-teaching-kernel in .../.local/share/jupyter/kernels/my-teaching-kernel + marie@compute$ conda install [...] # now install additional packages for your notebooks + marie@compute$ conda deactivate + marie@compute$ chmod g+rx /scratch/ws/1/conda_virtual_environment_teaching -R #Make the environment accesible for others + + ``` + +Now, users have to install the kernel in order to use the shared Python virtual +environment in JupyterHub: +=== "virtualenv" + + ```console + marie@compute$ module load Python #Load default Python + [...] + marie@compute$ source /scratch/ws/1/python_virtual_environment_teaching/env/bin/activate #Activate virtual environment. Example output: (envtest) bash-4.2$ + marie@compute$ python -m ipykernel install --user --name my-teaching-kernel --display-name="my teaching kernel" + Installed kernelspec my-teaching-kernel in .../.local/share/jupyter/kernels/my-teaching-kernel + marie@compute$ deactivate + + ``` + +=== "conda" + + ```console + marie@compute$ module load Anaconda3 #Load Anaconda + [...] + marie@compute$ conda activate /scratch/ws/1/conda_virtual_environment_teaching + marie@compute$ python -m ipykernel install --user --name my-teaching-kernel --display-name="my teaching kernel" + Installed kernelspec my-teaching-kernel in .../.local/share/jupyter/kernels/my-teaching-kernel + marie@compute$ conda deactivate + + ``` + +After spawning the Notebook, you can select the kernel with the created Python +virtual environment. + +!!! hint + You can also execute the commands for installing the kernel from the Jupyter + as described in [JupyterHub Teaching Example](jupyterhub_teaching_example.md). 
Then users do not + have to use the command line interface after the preparation. diff --git a/doc.zih.tu-dresden.de/docs/access/jupyterhub_teaching_example.md b/doc.zih.tu-dresden.de/docs/access/jupyterhub_teaching_example.md new file mode 100644 index 0000000000000000000000000000000000000000..639bc25ebac1b1082a2ec4692526589610341801 --- /dev/null +++ b/doc.zih.tu-dresden.de/docs/access/jupyterhub_teaching_example.md @@ -0,0 +1,176 @@ +# JupyterHub Teaching Example + +Setting up a Jupyter Lab Course involves additional steps, beyond JupyterHub, such as creating +course specific environments and allowing participants to link and activate these environments during +the course. This page includes a work through of these additional steps, with best practice examples +for each part. + +## Context + +- The common situation described here is that one or several Jupyter Lab Notebooks +(`ipynb` files) are available and prepared. Students are supposed to open these notebooks +through the [ZIH JupyterHub](../access/jupyterhub.md) and work through them during a course. + +- These notebooks are typically prepared for specific dependencies (Python packages) +that need to be activated by participants in the course, when opening the notebooks. + +- These environments can either be chosen based on the pre-configured +ZIH virtualenv/conda environments, +or built in advance. We will focus on the custom environment approach here. + +## Prerequisites + +- A public git repository with the notebook files (`ipynb`) and all other starting files required + by participants. One option to host the repository is the [GitLab of TU Chemnitz](https://gitlab.hrz.tu-chemnitz.de/). +- A [HPC project](../application/project_management.md) for teaching, + with students as registered participants +- For the tutor, a shell access to the HPC resources and project folder. + +## Preparation on the Lecturer's Side + +The following part describes several steps for the preparation of a course with the JupyterHub at +ZIH. + +### 1. Creating a custom Python environment + +Prepare a Python virtual environment (`virtualenv`) or conda virtual environment as described in +[Python virtual environments](../software/python_virtual_environments.md). Note, for preparing a +custom environment for a Jupyter Lab course, all participants will need to have read-access to this +environment. This is best done by storing the environment in either a [workspace](../data_lifecycle/workspaces.md) +with a limited lifetime or in a projects folder (e.g. `/projects/p_lv_jupyter_course/`) without a +limited lifetime. + +### 2. Clone the repository and store environment setup + +First prepare the `requirements.txt` or the `environment.yml` to persist the environment as +described in [Python virtual environments](../software/python_virtual_environments.md). + +Then clone the repository of your course to your home directory or into a directory in the projects +folder and add the file to the repository. + +=== "virtualenv" + ```console + marie@compute$ git clone git@gitlab.hrz.tu-chemnitz.de:zih/projects/p_lv_jupyter_course/clone_marie/jupyterlab_course.git + [...] 
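+    # requirements.txt is assumed to have been exported beforehand from the prepared
+    # environment (e.g. with `pip freeze > requirements.txt`, see Python virtual environments)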
+ marie@compute$ cp requirements.txt /projects/p_lv_jupyter_course/clone_marie/jupyterlab_course + marie@compute$ cd /projects/p_lv_jupyter_course/clone_marie/jupyterlab_course + marie@compute$ git add requirements.txt + marie@compute$ git commit + marie@compute$ git push + + ``` +=== "conda" + ```console + marie@compute$ git clone git@gitlab.hrz.tu-chemnitz.de:zih/projects/p_lv_jupyter_course/clone_marie/jupyterlab_course.git + [...] + marie@compute$ cp requirements.txt /projects/p_lv_jupyter_course/clone_marie/jupyterlab_course + marie@compute$ cd /projects/p_lv_jupyter_course/clone_marie/jupyterlab_course + marie@compute$ git add environment.yml + marie@compute$ git commit + marie@compute$ git push + + ``` + +Now, you can re-create the environment and the whole course from the git repository in the future. + +To test the activation of the environment use: + +=== "virtualenv" + + ```console + marie@compute$ source /scratch/ws/1/python_virtual_environment_teaching/env/bin/activate #Activate virtual environment. Example output: (envtest) bash-4.2$ + + ``` +=== "conda" + + ```console + marie@compute$ conda activate /scratch/ws/1/conda_virtual_environment_teaching + + ``` + +### 3. Prepare an activation file + +Create a file to install the `ipykernel` to the user-folder, linking the central `workshop_env` to +the ZIH JupyterLab. An `activate_workshop_env.sh` should have the following content: + +```console +/projects/jupyterlab_course/workshop_env/bin/python -m ipykernel install --user --name workshop_env --display-name="workshop_env" +``` + +!!! note + The file for installing the kernel should also be added to the git repository. + +### 4. Prepare the spawn link + +Have a look at the instructions to prepare +[a custom spawn link in combination with the git-pull feature](jupyterhub_for_teaching.md#combination-of-quickstart-and-git-pull-feature). + +## Usage on the Student's Side + +### Preparing activation of the custom environment in notebooks + +When students open the notebooks (e.g. through a Spawn Link that pulls the Git files +and notebooks from our repository), the Python environment must be activated first by installing a +Jupyter kernel. This can be done inside the first notebook using a shell command (`.sh`). + +Therefore the students will need to run the `activation_workshop_env.sh` file, which can be done +in the first cell of the first notebook (e.g. inside `01_intro.ipynb`). + +In a code cell in `01_intro.ipynb`, add: + +```console +!cd .. && sh activate_workshop_env.sh +``` + +When students run this file, the following output signals a successful setup. + + +{: align="center"} + +Afterwards, the `workshop_env` Jupyter kernel can be selected in the top-right corner of Jupyter +Lab. + +!!! note + A few seconds may be needed until the environment becomes available in the list. + +## Test spawn link and environment activation + +During testing, it may be necessary to reset the workspace to the initial state. There are two steps +involved: + +First, remove the cloned git repository in user home folder. + +!!! warning + Check carefully the syntax below, to avoid removing the wrong files. + +```console +cd ~ +rm -rf ./jupyterlab_course.git +``` + +Second, the IPython Kernel must be un-linked from the user workshop_env. + +```console +jupyter kernelspec uninstall workshop_env +``` + +## Summary + +The following video shows an example of the process of opening the +spawn link and activating the environment, from the students perspective. 
+Note that this video shows the case for a conda virtual environment. + +<div align="center"> +<video width="446" height="240" controls muted> + <source src="../misc/startup_hub.webm" type="video/webm"> +Your browser does not support the video tag. +</video> +</div> + +!!! note + - The spawn link may not work the first time a user logs in. + + - Students must be advised to _not_ click "Start My Server" or edit the form, + if the server does not start automatically. + + - If the server does not start automatically, click (or copy & paste) the spawn link again. diff --git a/doc.zih.tu-dresden.de/docs/access/misc/kernelspec.png b/doc.zih.tu-dresden.de/docs/access/misc/kernelspec.png new file mode 100644 index 0000000000000000000000000000000000000000..ff58282d7c06029162907e8ca8df950d949358a3 Binary files /dev/null and b/doc.zih.tu-dresden.de/docs/access/misc/kernelspec.png differ diff --git a/doc.zih.tu-dresden.de/docs/access/misc/startup_hub.webm b/doc.zih.tu-dresden.de/docs/access/misc/startup_hub.webm new file mode 100644 index 0000000000000000000000000000000000000000..49f8a420ed372b7b7b4aec7e7bf0d8b4dc2861a1 Binary files /dev/null and b/doc.zih.tu-dresden.de/docs/access/misc/startup_hub.webm differ diff --git a/doc.zih.tu-dresden.de/docs/software/python_virtual_environments.md b/doc.zih.tu-dresden.de/docs/software/python_virtual_environments.md index 67b10817c738b414a3302388b5cca3392ff96bb1..da39056ec5bc6097b202ae858a8af5dd71bb7631 100644 --- a/doc.zih.tu-dresden.de/docs/software/python_virtual_environments.md +++ b/doc.zih.tu-dresden.de/docs/software/python_virtual_environments.md @@ -1,16 +1,17 @@ # Python Virtual Environments -Virtual environments allow users to install additional Python packages and create an isolated -run-time environment. We recommend using `virtualenv` for this purpose. In your virtual environment, -you can use packages from the [modules list](modules.md) or if you didn't find what you need you can -install required packages with the command: `pip install`. With the command `pip freeze`, you can -see a list of all installed packages and their versions. +Virtual environments allow users to install additional Python packages and +create an isolated run-time environment. We recommend using `virtualenv` for +this purpose. In your virtual environment, you can use packages from the +[modules list](modules.md) or if you didn't find what you need you can install +required packages with the command: `pip install`. With the command +`pip freeze`, you can see a list of all installed packages and their versions. There are two methods of how to work with virtual environments on ZIH systems: -1. **virtualenv** is a standard Python tool to create isolated Python environments. - It is the preferred interface for - managing installations and virtual environments on ZIH system and part of the Python modules. +1. **virtualenv** is a standard Python tool to create isolated Python +environments. It is the preferred interface for managing installations and +virtual environments on ZIH system and part of the Python modules. 2. **conda** is an alternative method for managing installations and virtual environments on ZIH system. conda is an open-source package @@ -25,13 +26,13 @@ conda manager is included in all versions of Anaconda and Miniconda. ## Python Virtual Environment -This example shows how to start working with **virtualenv** and Python virtual environment (using -the module system). 
+This example shows how to start working with **virtualenv** and Python virtual
+environment (using the module system).
 
 !!! hint
 
-    We recommend to use [workspaces](../data_lifecycle/workspaces.md) for your virtual
-    environments.
+    We recommend using [workspaces](../data_lifecycle/workspaces.md) for your
+    virtual environments.
 
 At first, we check available Python modules and load the preferred version:
 
@@ -53,22 +54,45 @@ Info: creating workspace.
 [...]
 marie@compute$ virtualenv --system-site-packages /scratch/ws/1/python_virtual_environment/env #Create virtual environment
 [...]
-marie@compute$ source /scratch/ws/1/python_virtual_environment/env/bin/activate #Activate virtual environment. Example output: (envtest) bash-4.2$
+marie@compute$ source /scratch/ws/1/python_virtual_environment/env/bin/activate #Activate virtual environment. Example output: (env) bash-4.2$
 ```
 
-Now you can work in this isolated environment, without interfering with other tasks running on the
-system. Note that the inscription (env) at the beginning of each line represents that you are in
-the virtual environment. You can deactivate the environment as follows:
+Now you can work in this isolated environment, without interfering with other
+tasks running on the system. Note that the inscription (env) at the beginning of
+each line represents that you are in the virtual environment. You can deactivate
+the environment as follows:
 
 ```console
 (env) marie@compute$ deactivate #Leave the virtual environment
 ```
 
+### Persistence of Python Virtual Environment
+
+To persist a virtualenv, you can store the names and versions of installed
+packages in a file. Then you can restore this virtualenv by installing the
+packages from this file. Use the `pip freeze` command for storing:
+
+```console
+(env) marie@compute$ pip freeze > requirements.txt #Store the currently installed packages
+```
+
+In order to recreate the Python virtual environment, use the `pip install`
+command to install the packages from the file:
+
+```console
+marie@compute$ module load Python #Load default Python
+[...]
+marie@compute$ virtualenv --system-site-packages /scratch/ws/1/python_virtual_environment/env_post #Create virtual environment
+[...]
+marie@compute$ source /scratch/ws/1/python_virtual_environment/env_post/bin/activate #Activate virtual environment. Example output: (env_post) bash-4.2$
+(env_post) marie@compute$ pip install -r requirements.txt #Install packages from the created requirements.txt file
+```
+
 ## Conda Virtual Environment
 
-This example shows how to start working with **conda** and virtual environment (with using module
-system). At first, we use an interactive job and create a directory for the conda virtual
-environment:
+This example shows how to start working with **conda** and a virtual environment
+(using the module system). At first, we use an interactive job and create a
+directory for the conda virtual environment:
 
 ```console
 marie@compute$ ws_allocate -F scratch conda_virtual_environment 1
@@ -77,7 +101,8 @@ Info: creating workspace.
 [...]
 ```
 
-Then, we load Anaconda, create an environment in our directory and activate the environment:
+Then, we load Anaconda, create an environment in our directory and activate the
+environment:
 
 ```console
 marie@compute$ module load Anaconda3 #load Anaconda module
@@ -85,9 +110,10 @@ marie@compute$ conda create --prefix /scratch/ws/1/conda_virtual_environment/con
 marie@compute$ conda activate /scratch/ws/1/conda_virtual_environment/conda-env #activate conda-env virtual environment
 ```
 
-Now you can work in this isolated environment, without interfering with other tasks running on the
-system. Note that the inscription (conda-env) at the beginning of each line represents that you
-are in the virtual environment. You can deactivate the conda environment as follows:
+Now you can work in this isolated environment, without interfering with other
+tasks running on the system. Note that the inscription (conda-env) at the
+beginning of each line represents that you are in the virtual environment. You
+can deactivate the conda environment as follows:
 
 ```console
 (conda-env) marie@compute$ conda deactivate #Leave the virtual environment
@@ -122,3 +148,54 @@ are in the virtual environment. You can deactivate the conda environment as foll
 0.10.0+cu102
 (my-torch-env) marie@alpha$ deactivate
 ```
+
+### Persistence of Conda Virtual Environment
+
+To persist a conda virtual environment, you can define an `environment.yml`
+file. Have a look at the [conda docs](https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html?highlight=environment.yml#create-env-file-manually)
+for a description of the syntax. See an example for the `environment.yml` file
+below.
+
+??? example
+    ```yml
+    name: workshop_env
+    channels:
+      - conda-forge
+      - defaults
+    dependencies:
+      - python>=3.7
+      - pip
+      - colorcet
+      - 'geoviews-core=1.8.1'
+      - 'ipywidgets=7.6.*'
+      - geopandas
+      - hvplot
+      - pyepsg
+      - python-dotenv
+      - 'shapely=1.7.1'
+      - pip:
+        - python-hll
+
+    ```
+
+After specifying the `name`, the conda [channel priority](https://docs.conda.io/projects/conda/en/latest/user-guide/concepts/channels.html)
+is defined. In the example above, packages will be first installed from the
+`conda-forge` channel, and if not found, from the `defaults` Anaconda channel.
+
+Below, dependencies can be specified. Optionally, <abbr title="Pinning is a
+process that allows you to remain on a stable release while grabbing packages
+from a more recent version.">pinning</abbr> can be used to restrict the
+installed packages to compatible versions.
+ +Finally, packages not available on conda can be specified (indented) below +`- pip:` + +Recreate the conda virtual environment with the packages from the created +`environment.yml` file: + +```console +marie@compute$ mkdir workshop_env #Create directory for environment +marie@compute$ module load Anaconda3 #Load Anaconda +marie@compute$ conda config --set channel_priority strict +marie@compute$ conda env create --prefix workshop_env --file environment.yml #Create conda env in directory with packages from environment.yml file +``` diff --git a/doc.zih.tu-dresden.de/mkdocs.yml b/doc.zih.tu-dresden.de/mkdocs.yml index 7cb8a72a85d94553c0bc2e46a23216885c74cfa1..bb9cf9322f34d721547e34ecd0608efcf4a4f584 100644 --- a/doc.zih.tu-dresden.de/mkdocs.yml +++ b/doc.zih.tu-dresden.de/mkdocs.yml @@ -15,6 +15,7 @@ nav: - JupyterHub: - JupyterHub: access/jupyterhub.md - JupyterHub for Teaching: access/jupyterhub_for_teaching.md + - JupyterHub Teaching Example: access/jupyterhub_teaching_example.md - Key Fingerprints: access/key_fingerprints.md - Security Restrictions: access/security_restrictions.md - Data Transfer: diff --git a/doc.zih.tu-dresden.de/wordlist.aspell b/doc.zih.tu-dresden.de/wordlist.aspell index 8d81f8032d8f01c58cc410c962e14e369b50c617..57057252475b77cf0998ac1dce28a74c5711adce 100644 --- a/doc.zih.tu-dresden.de/wordlist.aspell +++ b/doc.zih.tu-dresden.de/wordlist.aspell @@ -1,284 +1,138 @@ personal_ws-1.1 en 203 -ALLREDUCE -APIs -AVX Abaqus Addon Addons +ALLREDUCE Altix Amber Amdahl's -Analytics -Ansys -BLAS -BMC -BeeGFS -CCM -CFX -CLI -CMake -COMSOL -CONFIG -CPU -CPUID -CPUs -CSV -CUDA -CXFS -CentOS -Chemnitz -DCV -DDP -DDR -DFG -DMTCP -DNS -Dask -DataFrames -DataParallel -Dataheap -DistributedDataParallel -DockerHub -Dockerfile -Dockerfiles -EPYC -ESSL -EasyBlocks -EasyBuild -EasyConfig -Espresso -FFT -FFTW -FMA -Flink -FlinkExample -Fortran -GBit -GDB -GDDR -GFLOPS -GPU -GPUs -GROMACS -GUIs -Galilei -Gauss -Gaussian -GiB -GitHub -GitLab -GitLab's -Gloo -HBM -HDEEM -HDF -HDFS -HDFView -HPC -HPE -HPL -Horovod -Hostnames -IOPS -IPs -ISA -ImageNet -Infiniband -Instrumenter -Itanium -Jupyter -JupyterHub -JupyterLab -KNL -Keras -Kunststofftechnik -LAMMPS -LAPACK -LINPACK -Leichtbau -Linter -LoadLeveler -MEGWARE -MIMD -MKL -MNIST -MathKernel -MathWorks -Mathematica -Memcheck -MiB -Microarchitecture -Miniconda -MobaXTerm -Montecito -Mpi -Multiphysics -Multithreading -NAMD -NCCL -NFS -NGC -NODELIST -NRINGS -NUM -NUMA -NUMAlink -NVLINK -NVMe -NWChem -Neptun -NumPy -Nutzungsbedingungen -Nvidia -OME -OPARI -OTF -OmniOpt -OpenACC -OpenBLAS -OpenCL -OpenGL -OpenMP -OpenMPI -OpenSSH -Opteron -PAPI -PESSL -PGI -PMI -PSOCK -Pandarallel -Perf -PiB -Pika -PowerAI -Pre -Preload -Pthread -Pthreads -PuTTY -PyTorch -PythonAnaconda -Quantum -Quickstart -README -RHEL -RSA -RSS -RStudio -ResNet -Rmpi -Rsync -Runtime -SFTP -SGEMM -SGI -SHA -SHMEM -SLES -SLURMCluster -SMP -SMT -SSHFS -STAR -SUSE -SXM -Sandybridge -Saxonid -ScaDS -ScaLAPACK -Scalasca -SciPy -Scikit -Slurm -SparkExample -SubMathKernel -Superdome -TBB -TCP -TFLOPS -TensorBoard -TensorFlow -Theano -ToDo -Torchvision -Trition -VASP -VMSize -VMs -VPN -Valgrind -Vampir -VampirServer -VampirTrace -VampirTrace's -VirtualGL -WebVNC -WinSCP -Workdir -XArray -XGBoost -XLC -XLF -Xeon -Xming -ZIH -ZIH's -ZSH analytics +Analytics anonymized +Ansys +APIs +AVX awk +BeeGFS benchmarking +BLAS +BMC broadwell bsub bullx +CCM ccNUMA centauri +CentOS +CFX cgroups checkpointing +Chemnitz citable +CLI +CMake +COMSOL conda config +CONFIG cpu +CPU +CPUID cpus +CPUs 
crossentropy css +CSV +CUDA cuDNN +CXFS dask +Dask dataframes +DataFrames +Dataheap datamover +DataParallel dataset +DCV ddl +DDP +DDR +DFG distr +DistributedDataParallel +DMTCP +DNS +Dockerfile +Dockerfiles +DockerHub dockerized dotfile dotfiles downtime downtimes +EasyBlocks +EasyBuild +EasyConfig ecryptfs engl english env +EPYC +Espresso +ESSL facto fastfs +FFT +FFTW filesystem filesystems flink +Flink +FlinkExample +FMA foreach +Fortran +Galilei +Gauss +Gaussian +GBit +GDB +GDDR +GFLOPS gfortran +GiB gifferent +GitHub +GitLab +GitLab's glibc +Gloo gnuplot gpu +GPU +GPUs gres +GROMACS +GUIs hadoop haswell +HBM +HDEEM +HDF +HDFS +HDFView hiera horovod +Horovod horovodrun hostname +Hostnames hpc +HPC hpcsupport +HPE +HPL html hvd hyperparameter @@ -287,110 +141,265 @@ hyperthreading icc icpc ifort +ImageNet img +Infiniband init inode +Instrumenter +IOPS +IPs ipynb +ipython +IPython +ISA +Itanium jobqueue jpg jss jupyter +Jupyter +JupyterHub +JupyterLab +Jupytext +Keras +kernelspec +KNL +Kunststofftechnik +LAMMPS +LAPACK lapply +Leichtbau +LINPACK linter +Linter lmod +LoadLeveler localhost lsf lustre markdownlint +Mathematica +MathKernel +MathWorks matlab +Matplotlib +MEGWARE mem +Memcheck +MiB +Microarchitecture +MIMD +Miniconda mkdocs +MKL +MNIST +MobaXTerm modenv modenvs modulefile +Montecito mountpoint mpi -mpiCC +Mpi mpicc +mpiCC mpicxx mpif mpifort mpirun multicore multiphysics +Multiphysics multithreaded +Multithreading +NAMD natively nbgitpuller nbsp +NCCL +Neptun +NFS +NGC nodelist +NODELIST +NRINGS ntasks +NUM +NUMA +NUMAlink +NumPy +Nutzungsbedingungen +Nvidia +NVLINK +NVMe +NWChem +OME +OmniOpt +OPARI +OpenACC +OpenBLAS +OpenCL +OpenGL +OpenMP openmpi +OpenMPI +OpenSSH +Opteron +OTF overfitting pandarallel +Pandarallel +PAPI parallelization parallelize parallelized parfor pdf perf +Perf performant +PESSL +PGI +PiB +Pika pipelining +PMI png +PowerAI ppc pre +Pre +preload +Preload preloaded preloading prepend preprocessing +PSOCK +Pthread +Pthreads pty +PuTTY pymdownx +PythonAnaconda pytorch +PyTorch +Quantum queue quickstart +Quickstart randint reachability +README reproducibility requeueing resnet +ResNet +RHEL +Rmpi rome romeo +RSA +RSS +RStudio +Rsync runnable runtime +Runtime sacct salloc +Sandybridge +Saxonid sbatch +ScaDS scalability scalable +ScaLAPACK +Scalasca scancel +Scikit +SciPy scontrol scp scs +SFTP +SGEMM +SGI +SHA +SHMEM +SLES +Slurm +SLURMCluster +SMP +SMT +SparkExample +spawner spython squeue srun ssd +SSHFS +STAR stderr stdout subdirectories subdirectory +SubMathKernel +Superdome +SUSE +SXM +TBB +TCP +TensorBoard tensorflow +TensorFlow +TFLOPS +Theano tmp todo +ToDo toolchain toolchains torchvision +Torchvision tracefile tracefiles tracepoints transferability +Trition undistinguishable unencrypted uplink +urlpath userspace +Valgrind +Vampir +VampirServer +VampirTrace +VampirTrace's +VASP vectorization venv virtualenv +VirtualGL +VMs +VMSize +VPN +WebVNC +WinSCP +WML +Workdir workspace workspaces +XArray +Xeon +XGBoost +XLC +XLF +Xming yaml zih +ZIH +ZIH's +ZSH