OpenNMT

RuntimeError: Subtraction, the `-` operator, with a bool tensor is not supported. If you are trying to invert a mask, use the `~` or `bitwise_not()` operator instead

Hi,

I am currently building a German-English translation model with OpenNMT. I received the following error while training the model:

RuntimeError: Subtraction, the `-` operator, with a bool tensor is not supported. If you are trying to invert a mask, use the `~` or `bitwise_not()` operator instead.

These are the steps I followed up to the point where I received the above error:
1)
```
(OpenNMT) admin123@admin123-MS-7A15:~/OpenNMT-Trans/OpenNMT-py$ python preprocess.py -train_src data/europarl-v7.de-en1.de -train_tgt data/europarl-v7.de-en1.en -valid_src data/validsrc.txt -valid_tgt data/validtgt.txt -save_data data/demo
[2019-08-20 20:55:49,848 INFO] Extracting features...
[2019-08-20 20:55:49,849 INFO]  * number of source features: 0.
[2019-08-20 20:55:49,849 INFO]  * number of target features: 0.
[2019-08-20 20:55:49,849 INFO] Building `Fields` object...
[2019-08-20 20:55:49,849 INFO] Building & saving training data...
[2019-08-20 20:55:49,849 INFO] Reading source and target files: data/europarl-v7.de-en1.de data/europarl-v7.de-en1.en.
[2019-08-20 20:55:50,138 INFO] Building shard 0.
[2019-08-20 20:56:16,465 INFO]  * saving 0th train data shard to data/demo.train.0.pt.
[2019-08-20 20:56:45,150 INFO] Building shard 1.
[2019-08-20 20:57:11,380 INFO]  * saving 1th train data shard to data/demo.train.1.pt.
[2019-08-20 20:57:36,484 INFO]  * tgt vocab size: 50004.
[2019-08-20 20:57:37,359 INFO]  * src vocab size: 50002.
[2019-08-20 20:57:38,627 INFO] Building & saving validation data...
[2019-08-20 20:57:38,627 INFO] Reading source and target files: data/validsrc.txt data/validtgt.txt.
[2019-08-20 20:57:38,628 INFO] Building shard 0.
[2019-08-20 20:57:38,700 INFO]  * saving 0th valid data shard to data/demo.valid.0.pt.
```

2)
```
(OpenNMT) admin123@admin123-MS-7A15:~/OpenNMT-Trans/OpenNMT-py$ python train.py -data data/demo -save_model demo-model
[2019-08-20 21:03:17,069 INFO]  * src vocab size = 50002
[2019-08-20 21:03:17,069 INFO]  * tgt vocab size = 50004
[2019-08-20 21:03:17,069 INFO] Building model...
[2019-08-20 21:03:17,906 INFO] NMTModel(
  (encoder): RNNEncoder(
    (embeddings): Embeddings(
      (make_embedding): Sequential(
        (emb_luts): Elementwise(
          (0): Embedding(50002, 500, padding_idx=1)
        )
      )
    )
    (rnn): LSTM(500, 500, num_layers=2, dropout=0.3)
  )
  (decoder): InputFeedRNNDecoder(
    (embeddings): Embeddings(
      (make_embedding): Sequential(
        (emb_luts): Elementwise(
          (0): Embedding(50004, 500, padding_idx=1)
        )
      )
    )
    (dropout): Dropout(p=0.3, inplace=False)
    (rnn): StackedLSTM(
      (dropout): Dropout(p=0.3, inplace=False)
      (layers): ModuleList(
        (0): LSTMCell(1000, 500)
        (1): LSTMCell(500, 500)
      )
    )
    (attn): GlobalAttention(
      (linear_in): Linear(in_features=500, out_features=500, bias=False)
      (linear_out): Linear(in_features=1000, out_features=500, bias=False)
    )
  )
  (generator): Sequential(
    (0): Linear(in_features=500, out_features=50004, bias=True)
    (1): Cast()
    (2): LogSoftmax()
  )
)
[2019-08-20 21:03:17,906 INFO] encoder: 29009000
[2019-08-20 21:03:17,906 INFO] decoder: 55812004
[2019-08-20 21:03:17,906 INFO]  * number of parameters: 84821004
[2019-08-20 21:03:17,907 INFO] Starting training on CPU, could be very slow
[2019-08-20 21:03:17,907 INFO] Start training loop and validate every 10000 steps...
[2019-08-20 21:03:17,907 INFO] Loading dataset from data/demo.train.0.pt
[2019-08-20 21:03:28,754 INFO] number of examples: 924406
```
```
Traceback (most recent call last):
  File "train.py", line 200, in <module>
    main(opt)
  File "train.py", line 88, in main
    single_main(opt, -1)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/train_single.py", line 143, in main
    valid_steps=opt.valid_steps)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/trainer.py", line 243, in train
    report_stats)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/trainer.py", line 357, in _gradient_accumulation
    outputs, attns = self.model(src, tgt, src_lengths, bptt=bptt)
  File "/home/admin123/anaconda3/envs/OpenNMT/lib/python3.7/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/models/model.py", line 47, in forward
    memory_lengths=lengths)
  File "/home/admin123/anaconda3/envs/OpenNMT/lib/python3.7/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/decoders/decoder.py", line 213, in forward
    tgt, memory_bank, memory_lengths=memory_lengths)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/decoders/decoder.py", line 395, in _run_forward_pass
    memory_lengths=memory_lengths)
  File "/home/admin123/anaconda3/envs/OpenNMT/lib/python3.7/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/admin123/OpenNMT-Trans/OpenNMT-py/onmt/modules/global_attention.py", line 183, in forward
    align.masked_fill_(1 - mask, -float('inf'))
  File "/home/admin123/anaconda3/envs/OpenNMT/lib/python3.7/site-packages/torch/tensor.py", line 325, in __rsub__
    return _C._VariableFunctions.rsub(self, other)
RuntimeError: Subtraction, the - operator, with a bool tensor is not supported. If you are trying to invert a mask, use the ~ or bitwise_not() operator instead.
```

Please assist me in resolving this error.

Thank You,
Kishor.

Hi,

I found the solution to the above issue: it was caused by the torch version. I had installed torch 1.2.0; downgrading to torch 1.1 resolved the problem.
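As an alternative to downgrading, the error message itself points at the fix: invert a boolean mask with `~` instead of `1 - mask`. Here is a minimal sketch of that idiom, using NumPy as a stand-in for torch (NumPy bool arrays support the same `~` inversion as `torch.bool` tensors; the array values are made up for illustration):

```python
import numpy as np

# Toy analogue of the attention masking step from the traceback:
# set the scores at padding positions to -inf before the softmax.
scores = np.array([0.5, 1.2, -0.3, 2.0], dtype=np.float32)
mask = np.array([True, True, False, True])  # True = real token, False = padding

# Bool-safe inversion: `~mask` flips True/False, where `1 - mask` raises
# in torch >= 1.2. This mirrors align.masked_fill_(~mask, -inf).
scores[~mask] = -np.inf
print(scores)  # the position where mask is False becomes -inf
```

In the actual OpenNMT-py code, the failing line in `onmt/modules/global_attention.py` uses `masked_fill_` on a torch tensor, but the inversion logic is the same.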

Regards,
Kishor.

Hi @KishorKP
We just merged a first batch of fixes to support PyTorch 1.2.0, so this particular issue should be fixed.
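For anyone who cannot pull the fixed master, a version-agnostic inversion can be written by dispatching on the mask's dtype. This is a hypothetical helper, not OpenNMT's actual patch, sketched with NumPy arrays (`torch.bool` tensors behave the same way under `~`):

```python
import numpy as np

def invert_mask(mask):
    """Invert a 0/1 padding mask regardless of dtype.

    Hypothetical helper (not OpenNMT's actual code): bool masks, the
    default in torch >= 1.2, must be inverted with `~`; legacy uint8
    masks still tolerate `1 - mask`.
    """
    if mask.dtype == np.bool_:  # torch.bool in the real PyTorch code
        return ~mask
    return 1 - mask             # legacy behaviour for uint8 masks

print(invert_mask(np.array([True, False, True])))        # [False  True False]
print(invert_mask(np.array([1, 0, 1], dtype=np.uint8)))  # [0 1 0]
```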

Dear Francois,

No, the issue still persists, at least as of now: I did a fresh installation of the packages yesterday.

Thank You,
Kishor.

This PR was merged 1 hour ago. Please try pulling the current master.

Dear Francois,

Let me check the setup again to confirm.

Regards,
Kishor.

Can someone please help solve this issue?
I'm facing the same problem with PyTorch version 1.5.1.