Using multiple GPUs

I have a machine with 2 GPUs.
I have set

When I run
python -data /home/jerom/INT/Resources/Bijbel/preprocess_data -save_model /home/jerom/INT/Resources/Bijbel/rembrandt -encoder_type brnn -world_size 1 -gpu_ranks 0
this works fine and uses GPU 0, as confirmed by nvidia-smi.

But when I run
python -data /home/jerom/INT/Resources/Bijbel/preprocess_data -save_model /home/jerom/INT/Resources/Bijbel/rembrandt -encoder_type brnn -world_size 2 -gpu_ranks 0 1

I get:
Traceback (most recent call last):
File "", line 118, in
File "", line 35, in main
mp = torch.multiprocessing.get_context('spawn')
AttributeError: 'module' object has no attribute 'get_context'

Any help? Thanks.

You should have:


using ',' instead of '.' to separate device ids.

This may solve your problem.

Thanks, but that was a typo in my report of the error, not in the actual commands.
So the problem is still there.

Are you using Python 2.7 or Python 3?
Multi-GPU training is only supported on Python 3.
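A quick way to confirm this, without torch at all: `torch.multiprocessing` wraps the standard-library `multiprocessing` module, and `get_context()` was only added to the stdlib in Python 3.4, so the same call fails on Python 2.7 with exactly the `AttributeError` in the traceback above. A minimal sketch:

```python
import multiprocessing

# get_context() was added to the stdlib in Python 3.4.
# On Python 2.7 this line raises:
#   AttributeError: 'module' object has no attribute 'get_context'
# which is the same failure torch.multiprocessing hits, since it
# builds on the stdlib module.
ctx = multiprocessing.get_context('spawn')
print(ctx.get_start_method())  # prints: spawn
```

Running this under your interpreter tells you immediately whether you are on a Python that supports the 'spawn' context multi-GPU training needs.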

Thanks, that was it :wink: now it works.
