Commit 4d47209e authored by Jan Frenzel's avatar Jan Frenzel Committed by Martin Schroschk
Apply 1 suggestion(s) to 1 file(s)

parent 8ca343c4
3 merge requests: !392 Merge preview into contrib guide for browser users, !378 Merge p in m, !367 Update the distributed_training.md Pytorch section
@@ -145,7 +145,8 @@

PyTorch provides multiple ways to achieve data parallelism to train deep learning models
efficiently. These methods are part of the `torch.distributed` sub-package that ships with the main
deep learning package.

The easiest method to quickly prototype whether the model is trainable in a multi-GPU setting is to
wrap the existing model with the `torch.nn.DataParallel` class as shown below,
```python
model = torch.nn.DataParallel(model)
```
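As a slightly fuller sketch of this wrapping step, the snippet below builds a small stand-in model (`nn.Linear`, chosen here purely for illustration and not taken from the original document), wraps it with `torch.nn.DataParallel` only when more than one GPU is visible, and runs a forward pass. `DataParallel` splits the input batch across the available GPUs and gathers the outputs back on the first device; on a single-GPU or CPU-only node the unwrapped model behaves identically.

```python
import torch
import torch.nn as nn

# A small stand-in model; any nn.Module works the same way.
model = nn.Linear(10, 2)

# Wrap only when multiple GPUs are visible; DataParallel then
# scatters each input batch across the GPUs during forward().
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Forward pass: the batch dimension (8) is split across devices
# and the outputs are gathered back automatically.
inputs = torch.randn(8, 10).to(next(model.parameters()).device)
outputs = model(inputs)
print(outputs.shape)  # torch.Size([8, 2])
```

Note that `DataParallel` is single-process and single-node; for multi-node training the `torch.distributed` sub-package mentioned above is the intended route.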