Commit b32d4ce5, authored 2 years ago by Lars Jitschin
separated custom environments, I still need to work on tabs for that page
Parent: ca89b56f
No related branches found. No related tags found.
Part of 2 merge requests: !687 "Automated merge from preview to main" and !632 "Improving JupyterHub Documentation"
Showing 2 changed files with 160 additions and 145 deletions:

- doc.zih.tu-dresden.de/docs/access/jupyterhub.md: 13 additions, 145 deletions
- doc.zih.tu-dresden.de/docs/access/jupyterhub_custom_environments.md: 147 additions, 0 deletions
doc.zih.tu-dresden.de/docs/access/jupyterhub.md (+13 −145)
...
...
@@ -261,156 +261,24 @@ With these **standard environments** we have tried to integrate a set of compati
** R is loaded from the [module system](../software/modules.md)
### Creating and Using a Custom Environment
!!! info

    Interactive code interpreters which are used by Jupyter notebooks are called
    *kernels*. Creating and using your own kernel has the benefit that you can
    install your own preferred Python packages and use them in your notebooks.

We currently have two different architectures at ZIH systems.
Build your kernel environment on the **same architecture** that you want to use
later on with the kernel. In the examples below, we use the name
"my-kernel" for our user kernel. We recommend prefixing your kernels
with keywords like `haswell`, `ml`, `romeo`, `venv`, `conda`. This way, you
can later more easily recognize how you built the kernel and on which hardware it
will work. Depending on that hardware, allocate resources:
=== "Nodes with x86_64 (Intel) CPU"

    Use **one srun command** of these:

    ```console
    maria@login$ srun --partition=haswell64 --pty --ntasks=1 --cpus-per-task=2 \
     --mem-per-cpu=2541 --time=08:00:00 bash -l
    maria@login$ srun --partition=gpu2 --pty --ntasks=1 --cpus-per-task=2 \
     --mem-per-cpu=2541 --time=08:00:00 bash -l
    ```

=== "Nodes with x86_64 (AMD) CPU"

    Use **one srun command** of these:

    ```console
    maria@login$ srun --partition=romeo --pty --ntasks=1 --cpus-per-task=3 \
     --mem-per-cpu=1972 --time=08:00:00 bash -l
    maria@login$ srun --partition=alpha --gres=gpu:1 --pty --ntasks=1 \
     --cpus-per-task=6 --mem-per-cpu=10312 --time=08:00:00 bash -l
    ```

=== "Nodes with ppc64le CPU"

    ```console
    maria@login$ srun --pty --partition=ml --ntasks=1 --cpus-per-task=2 --mem-per-cpu=1443 \
     --time=08:00:00 bash -l
    ```
When creating a virtual environment in your home directory, you have to decide
whether to use "Python virtualenv" or "conda environment".

!!! note

    Please keep in mind that Python virtualenv is the preferred way to create a Python
    virtual environment.

    For working with conda virtual environments, it may be necessary to configure your shell
    as described in
    [Python virtual environments](../software/python_virtual_environments.md#conda-virtual-environment).
#### Python Virtualenv

```console
marie@compute$ module load Python/3.8.6-GCCcore-10.2.0
Module Python/3.8.6-GCCcore-10.2.0 and 11 dependencies loaded.
marie@compute$ mkdir user-kernel # please use workspaces!
marie@compute$ cd user-kernel
marie@compute$ virtualenv --system-site-packages my-kernel
created virtual environment CPython3.8.6.final.0-64 in 5985ms
  creator CPython3Posix(dest=[...]/my-kernel, clear=False, global=True)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=[...])
    added seed packages: pip==20.2.3, setuptools==50.3.0, wheel==0.35.1
  activators BashActivator,CShellActivator,FishActivator,PowerShellActivator,PythonActivator,XonshActivator
marie@compute$ source my-kernel/bin/activate
(my-kernel) marie@compute$ pip install ipykernel
Collecting ipykernel
[...]
Successfully installed [...] ipykernel-6.9.1 ipython-8.0.1 [...]
(my-kernel) marie@compute$ pip install --upgrade pip
(my-kernel) marie@compute$ python -m ipykernel install --user --name my-kernel --display-name="my kernel"
Installed kernelspec my-kernel in .../.local/share/jupyter/kernels/my-kernel
(my-kernel) marie@compute$ pip install [...] # now install additional packages for your notebooks
(my-kernel) marie@compute$ deactivate
```
!!! warning

    Depending on the Python module you have loaded for creating your virtual environment, you should
    select the appropriate [Standard environment](#standard-environments). For example, you could
    select `scs5_gcccore-10.2.0_python-3.8.6` when you want to use `my-kernel`. Furthermore,
    ensure that you preload the same modules via [Spawner Options](#start-a-session) that you used
    for creating your kernel.
#### Conda Environment

Load the needed module depending on partition architecture:

=== "x86 nodes (e.g. partition `haswell`, `gpu2`)"

    ```console
    marie@compute$ module load Anaconda3
    ```

=== "PowerPC nodes (partition `ml`)"

    ```console
    marie@ml$ module load PythonAnaconda
    ```
!!! hint

    For working with conda virtual environments, it may be necessary to configure your shell as
    described in
    [Python virtual environments](../software/python_virtual_environments.md#conda-virtual-environment).

Continue with environment creation, package installation and kernel
registration:
```console
marie@compute$ mkdir user-kernel # please use workspaces!
marie@compute$ conda create --prefix $HOME/user-kernel/my-kernel python=3.8.6
Collecting package metadata: done
Solving environment: done
[...]
marie@compute$ conda activate $HOME/user-kernel/my-kernel
marie@compute$ conda install ipykernel
Collecting package metadata: done
Solving environment: done
[...]
marie@compute$ python -m ipykernel install --user --name my-kernel --display-name="my kernel"
Installed kernelspec my-kernel in [...]
marie@compute$ conda install [...] # now install additional packages for your notebooks
marie@compute$ conda deactivate
```
Now you can start a new session and your kernel should be available.
**JupyterLab**: Your kernels are listed on the launcher page:

{: align="center"}
You can switch kernels of existing notebooks in the menu:

{: align="center"}
**Classic Jupyter notebook**: Your kernel is listed in the New menu:

{: align="center"}
You can switch kernels of existing notebooks in the kernel menu:

{: align="center"}
!!! note

    Both Python venv and conda virtual environments will be mentioned in the same
    list.
### Loading Modules

You now have the option to preload modules from the
[module system](../software/modules.md).
Select multiple modules that will be preloaded before your notebook server
starts. The list of available modules depends on the module environment you want
to start the session in (`scs5`, `hiera` or `ml`). The right module environment will be
chosen by your selected partition.
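If you are unsure which modules are available in a particular module environment, you can inspect
it from a shell on a node of the corresponding partition. This is only a sketch; `module avail`
and `module spider` are standard commands of the module system, and the exact output depends on
the partition you are on:

```console
marie@compute$ module avail Python
marie@compute$ module spider Python/3.8.6-GCCcore-10.2.0
```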
### Custom Kernels

As you might have noticed, after launching Jupyter**Lab**, several boxes with icons are
visible in the `Launcher`. Each box represents a so-called *kernel* (not to be confused
with the operating system kernel; a Jupyter kernel similarly provides the basic
functionality for running your use cases, e.g. Python or R).
You can find further documentation on creating your own kernels
[here](/access/jupyterhub_custom_environments).
doc.zih.tu-dresden.de/docs/access/jupyterhub_custom_environments.md (new file, 0 → 100644) (+147 −0)
# Creating and Using a Custom Environment for JupyterHub
!!! info

    Interactive code interpreters which are used by Jupyter notebooks are called
    *kernels*. Creating and using your own kernel has the benefit that you can
    install your own preferred Python packages and use them in your notebooks.

We currently have two different architectures at ZIH systems.
Build your kernel environment on the **same architecture** that you want to use
later on with the kernel. In the examples below, we use the name
"my-kernel" for our user kernel. We recommend prefixing your kernels
with keywords like `haswell`, `ml`, `romeo`, `venv`, `conda`. This way, you
can later more easily recognize how you built the kernel and on which hardware it
will work. Depending on that hardware, allocate resources:
## Preliminary Steps
=== "Nodes with x86_64 (Intel) CPU"

    Use **one srun command** of these:

    ```console
    maria@login$ srun --partition=haswell64 --pty --ntasks=1 --cpus-per-task=2 \
     --mem-per-cpu=2541 --time=08:00:00 bash -l
    maria@login$ srun --partition=gpu2 --pty --ntasks=1 --cpus-per-task=2 \
     --mem-per-cpu=2541 --time=08:00:00 bash -l
    ```

=== "Nodes with x86_64 (AMD) CPU"

    Use **one srun command** of these:

    ```console
    maria@login$ srun --partition=romeo --pty --ntasks=1 --cpus-per-task=3 \
     --mem-per-cpu=1972 --time=08:00:00 bash -l
    maria@login$ srun --partition=alpha --gres=gpu:1 --pty --ntasks=1 \
     --cpus-per-task=6 --mem-per-cpu=10312 --time=08:00:00 bash -l
    ```

=== "Nodes with ppc64le CPU"

    ```console
    maria@login$ srun --pty --partition=ml --ntasks=1 --cpus-per-task=2 --mem-per-cpu=1443 \
     --time=08:00:00 bash -l
    ```
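Because the kernel has to be built on the **same architecture** it will later run on, it can be
worth verifying the allocated node before installing anything. This is only a quick sanity check;
`uname -m` prints the machine architecture, e.g. `x86_64` or `ppc64le`:

```console
marie@compute$ uname -m
x86_64
```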
When creating a virtual environment in your home directory, you have to decide
whether to use "Python virtualenv" or "conda environment".

!!! note

    Please keep in mind that Python virtualenv is the preferred way to create a Python
    virtual environment.

    For working with conda virtual environments, it may be necessary to configure your shell
    as described in
    [Python virtual environments](../software/python_virtual_environments.md#conda-virtual-environment).
## Python Virtualenv

```console
marie@compute$ module load Python/3.8.6-GCCcore-10.2.0
Module Python/3.8.6-GCCcore-10.2.0 and 11 dependencies loaded.
marie@compute$ mkdir user-kernel # please use workspaces!
marie@compute$ cd user-kernel
marie@compute$ virtualenv --system-site-packages my-kernel
created virtual environment CPython3.8.6.final.0-64 in 5985ms
  creator CPython3Posix(dest=[...]/my-kernel, clear=False, global=True)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=[...])
    added seed packages: pip==20.2.3, setuptools==50.3.0, wheel==0.35.1
  activators BashActivator,CShellActivator,FishActivator,PowerShellActivator,PythonActivator,XonshActivator
marie@compute$ source my-kernel/bin/activate
(my-kernel) marie@compute$ pip install ipykernel
Collecting ipykernel
[...]
Successfully installed [...] ipykernel-6.9.1 ipython-8.0.1 [...]
(my-kernel) marie@compute$ pip install --upgrade pip
(my-kernel) marie@compute$ python -m ipykernel install --user --name my-kernel --display-name="my kernel"
Installed kernelspec my-kernel in .../.local/share/jupyter/kernels/my-kernel
(my-kernel) marie@compute$ pip install [...] # now install additional packages for your notebooks
(my-kernel) marie@compute$ deactivate
```
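To check that the registration worked, you can list the kernels that Jupyter knows about for your
user account. This is a quick sketch; the `jupyter` command is provided by the packages installed
above, and the exact paths will differ:

```console
(my-kernel) marie@compute$ jupyter kernelspec list
Available kernels:
  my-kernel    [...]/.local/share/jupyter/kernels/my-kernel
  python3      [...]
```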
!!! warning

    Depending on the Python module you have loaded for creating your virtual environment, you should
    select the appropriate [Standard environment](#standard-environments). For example, you could
    select `scs5_gcccore-10.2.0_python-3.8.6` when you want to use `my-kernel`. Furthermore,
    ensure that you preload the same modules via [Spawner Options](#start-a-session) that you used
    for creating your kernel.
## Conda Environment

Load the needed module depending on partition architecture:

=== "x86 nodes (e.g. partition `haswell`, `gpu2`)"

    ```console
    marie@compute$ module load Anaconda3
    ```

=== "PowerPC nodes (partition `ml`)"

    ```console
    marie@ml$ module load PythonAnaconda
    ```
!!! hint

    For working with conda virtual environments, it may be necessary to configure your shell as
    described in
    [Python virtual environments](../software/python_virtual_environments.md#conda-virtual-environment).
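One common way to do this is to let conda set up your shell once and reload the configuration
afterwards. This is only a sketch and may differ from the procedure recommended on the linked page:

```console
marie@compute$ conda init bash    # one-time setup, writes the conda hooks into ~/.bashrc
marie@compute$ source ~/.bashrc   # reload the configuration in the current shell
```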
Continue with environment creation, package installation and kernel
registration:
```console
marie@compute$ mkdir user-kernel # please use workspaces!
marie@compute$ conda create --prefix $HOME/user-kernel/my-kernel python=3.8.6
Collecting package metadata: done
Solving environment: done
[...]
marie@compute$ conda activate $HOME/user-kernel/my-kernel
marie@compute$ conda install ipykernel
Collecting package metadata: done
Solving environment: done
[...]
marie@compute$ python -m ipykernel install --user --name my-kernel --display-name="my kernel"
Installed kernelspec my-kernel in [...]
marie@compute$ conda install [...] # now install additional packages for your notebooks
marie@compute$ conda deactivate
```
Now you can start a new session and your kernel should be available.
**JupyterLab**: Your kernels are listed on the launcher page:

{: align="center"}
You can switch kernels of existing notebooks in the menu:

{: align="center"}
**Classic Jupyter notebook**: Your kernel is listed in the New menu:

{: align="center"}
You can switch kernels of existing notebooks in the kernel menu:

{: align="center"}
!!! note

    Both Python venv and conda virtual environments will be mentioned in the same
    list.
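If you do not need a kernel anymore, you can unregister it again; the kernelspec lives in your home
directory, as shown in the installation output above. A short sketch (run with the corresponding
module or environment loaded; `jupyter kernelspec remove` asks for confirmation before deleting):

```console
marie@compute$ jupyter kernelspec list
marie@compute$ jupyter kernelspec remove my-kernel
```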