# Hyperparameter Optimization (OmniOpt)

Classical simulation methods as well as machine learning methods (e.g. neural networks) have
a large number of hyperparameters that significantly determine the accuracy, efficiency, and
transferability of the method. In classical simulations, the hyperparameters are usually
determined by adaptation to measured values. Especially in neural networks, the hyperparameters
determine the network architecture: number and type of layers, number of neurons, activation
functions, measures against overfitting, etc. The most common methods to determine
hyperparameters are intuitive testing, grid search, or random search.

The tool OmniOpt performs hyperparameter optimization for a broad range of applications such as
classical simulations or machine learning algorithms. OmniOpt is robust: it checks and installs
all dependencies automatically and fixes many problems in the background. While OmniOpt
optimizes, no further intervention is required. You can follow the ongoing stdout (standard
output) live in the console. OmniOpt's overhead is minimal and virtually imperceptible.
## Quickstart with OmniOpt

The following instructions demonstrate the basic usage of OmniOpt on the ZIH system, based
on the hyperparameter optimization for a neural network.

The typical OmniOpt workflow comprises at least the following steps:
...
### Prepare Application Script and Software Environment
The following example application script was created from
[https://pytorch.org/tutorials/beginner/basics/quickstart_tutorial.html](https://pytorch.org/tutorials/beginner/basics/quickstart_tutorial.html){:target="_blank"}
as a starting point. Therein, a neural network is trained on the MNIST Fashion dataset.

There are three script preparation steps for OmniOpt:
+ Changing hard-coded hyperparameters (chosen here: batch size, epochs, size of layer 1 and 2)
  into command line parameters; a minimal sketch of this step is shown below.
  For this example, the Python module `argparse` (see the docs at
  [https://docs.python.org/3/library/argparse.html](https://docs.python.org/3/library/argparse.html){:target="_blank"})
  is used.

  There are many ways to parse arguments in Python scripts.
  The easiest approach is the `sys` module (see
  [https://www.geeksforgeeks.org/how-to-use-sys-argv-in-python/](https://www.geeksforgeeks.org/how-to-use-sys-argv-in-python/){:target="_blank"}),
  which would be fully sufficient for usage with OmniOpt.
  Nevertheless, this basic approach offers no consistency checks, error handling, etc.
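The following minimal sketch illustrates this first preparation step. It is not the tutorial
script itself; the argument names (`--batch-size`, `--epochs`, `--layer1`, `--layer2`) and their
default values are illustrative assumptions for how the hard-coded hyperparameters could be
exposed on the command line with `argparse`.

```python
# Illustrative sketch (not the official tutorial script): expose the formerly
# hard-coded hyperparameters as command line parameters using argparse.
# Argument names and defaults are assumptions chosen for this example.
import argparse

parser = argparse.ArgumentParser(
    description="Train a neural network on the MNIST Fashion dataset"
)
parser.add_argument("--batch-size", type=int, default=64, help="training batch size")
parser.add_argument("--epochs", type=int, default=5, help="number of training epochs")
parser.add_argument("--layer1", type=int, default=512, help="neurons in hidden layer 1")
parser.add_argument("--layer2", type=int, default=512, help="neurons in hidden layer 2")
args = parser.parse_args()

# The parsed values then replace the constants inside the training code, e.g.
#   DataLoader(training_data, batch_size=args.batch_size)
# and the two hidden layers are created with args.layer1 and args.layer2 neurons.
print(f"batch_size={args.batch_size}, epochs={args.epochs}, "
      f"layer1={args.layer1}, layer2={args.layer2}")
```

Such a script could then be invoked as, e.g.,
`python train.py --batch-size 128 --epochs 10 --layer1 256 --layer2 128`, which is exactly the
kind of call OmniOpt repeats with varying parameter values during the optimization.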