How to use multiple GPUs while training

pytorch

(Wen Tsai) #1

Hi, I noticed that the multi-GPU issue has been marked as a finished feature:

Multi-gpu PyTorch version?

However, I couldn’t get it to work when setting the option -gpuid 0 1 (a sketch of my command is below the snippet). In fact, I found this code in train.py:

if len(opt.gpuid) > 1:
    sys.stderr.write("Sorry, multigpu isn't supported yet, coming soon!\n")
    sys.exit(1)
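
For reference, a sketch of the kind of command I’m running (the -data and -save_model paths here are just placeholders):

python train.py -data data/demo -save_model demo-model -gpuid 0 1

With two ids, len(opt.gpuid) > 1, so the check above triggers and train.py exits immediately.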

Is that a bug, or is multi-GPU support still not available in OpenNMT-py?

Thanks.