Hi everyone,
I'm working on a Japanese-Vietnamese pair and I'm getting the runtime error below when training the model. My src is Japanese and my tgt is Vietnamese.
Any help would be greatly appreciated.
Command used to train the model:
python train.py -data data/demo-train.pt -save_model demo-model
Encountered error:
Namespace(batch_size=64, brnn=False, brnn_merge='concat', curriculum=False, data='data1/demo.train.pt', dropout=0.3, epochs=13, extra_shuffle=False, gpus=[], input_feed=1, layers=2, learning_rate=1.0, learning_rate_decay=0.5, log_interval=50, max_generator_batches=32, max_grad_norm=5, optim='sgd', param_init=0.1, pre_word_vecs_dec=None, pre_word_vecs_enc=None, rnn_size=500, save_model='demo-model', start_decay_at=8, start_epoch=1, train_from='', train_from_state_dict='', word_vec_size=500)
Loading data from 'data1/demo.train.pt'
* vocabulary size. source = 50004; target = 15290
* number of training sentences. 14
* maximum batch size. 64
Building model...
* number of parameters: 50073290
NMTModel (
  (encoder): Encoder (
    (word_lut): Embedding(50004, 500, padding_idx=0)
    (rnn): LSTM(500, 500, num_layers=2, dropout=0.3)
  )
  (decoder): Decoder (
    (word_lut): Embedding(15290, 500, padding_idx=0)
    (rnn): StackedLSTM (
      (dropout): Dropout (p = 0.3)
      (layers): ModuleList (
        (0): LSTMCell(1000, 500)
        (1): LSTMCell(500, 500)
      )
    )
    (attn): GlobalAttention (
      (linear_in): Linear (500 -> 500)
      (sm): Softmax ()
      (linear_out): Linear (1000 -> 500)
      (tanh): Tanh ()
    )
    (dropout): Dropout (p = 0.3)
  )
  (generator): Sequential (
    (0): Linear (500 -> 15290)
    (1): LogSoftmax ()
  )
)
Traceback (most recent call last):
  File "train.py", line 351, in <module>
    main()
  File "train.py", line 347, in main
    trainModel(model, trainData, validData, dataset, optim)
  File "train.py", line 232, in trainModel
    train_loss, train_acc = trainEpoch(epoch)
  File "train.py", line 185, in trainEpoch
    batchOrder = torch.randperm(len(trainData))
RuntimeError: must be strictly positive at /data/users/soumith/builder/wheel/pytorch-src/torch/lib/TH/generic/THTensorMath.c:1930
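
In case it helps with diagnosis: on my PyTorch version, torch.randperm raises this exact "must be strictly positive" error when its argument is 0, so my guess is that len(trainData) is evaluating to 0 here, i.e., no training batches were actually built. Below is the minimal check I ran; the n = 0 is just my assumption about len(trainData), not something I have confirmed inside train.py:

import torch

# On my (older) PyTorch build, randperm over a non-positive size
# raises "must be strictly positive", the same error as in the
# traceback above. n = 0 stands in for my guess at len(trainData).
n = 0
try:
    batchOrder = torch.randperm(n)
except RuntimeError as e:
    print('randperm(%d) failed: %s' % (n, e))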