Pretrained Model English<->German

Hi,

I want to use your pretrained English2German model, but it only works with a GPU (that I don’t have).
Can you upload a CPU-compatible model instead of a GPU-only checkpoint?

Thanks
Marcel


Good point! Sure I will. Give me two days, I have a larger model almost finished training.

Hi all,
is it possible to create a new English2Indonesian model?

sarwo

Yes. There is some training data available through WAT2016: http://lotus.kuee.kyoto-u.ac.jp/WAT/#evaluation.html
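If you have parallel English-Indonesian data, the general recipe is the usual preprocess/train pipeline. A rough sketch is below; the file names and the enid prefix are placeholders, and the flags follow the quickstart of that era, so double-check them against your checkout's options:

# build the vocabularies and the serialized training data
th preprocess.lua -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/enid

# train the English->Indonesian model on the preprocessed data
th train.lua -data data/enid-train.t7 -save_model enid-model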

@srush thanks, terima kasih banyak (thank you very much)

Hi rush, how do I run this program using a CPU? Is a tutorial available?

Hello, I forgot to mention: the models are now CPU-compatible. New links: https://s3.amazonaws.com/opennmt-models/onmt_baseline_wmt15-all.en-de_epoch13_7.19_release.t7 and https://s3.amazonaws.com/opennmt-models/onmt_baseline_wmt15-all.de-en_epoch13_8.98_release.t7.
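To answer the earlier question about CPU usage: with one of these release files, a plain CPU translation run looks roughly like the following (a sketch; input.tok and pred.txt are placeholder paths, and -gpuid -1 tells translate.lua not to use a GPU):

# fetch the released English->German model
wget https://s3.amazonaws.com/opennmt-models/onmt_baseline_wmt15-all.en-de_epoch13_7.19_release.t7

# translate a tokenized source file on the CPU
th translate.lua -model onmt_baseline_wmt15-all.en-de_epoch13_7.19_release.t7 -src input.tok -output pred.txt -gpuid -1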


Thanks, but…

Loading './gereng/onmt_baseline_wmt15-all.de-en_epoch13_8.98_release.t7'...	
/home/marcel/torch/install/bin/luajit: ./onmt/data/Batch.lua:10: bad argument #1 to 'size' (dimension 1 out of range of 0D tensor at /tmp/luarocks_torch-scm-1-8096/torch7/generic/Tensor.c:19)
stack traceback:
	[C]: in function 'size'
	./onmt/data/Batch.lua:10: in function 'getLength'
	./onmt/data/Batch.lua:71: in function '__init'
	/home/marcel/torch/install/share/lua/5.1/torch/init.lua:91: in function 'getBatch'
	./onmt/translate/Translator.lua:323: in function 'translate'
	translate.lua:116: in function 'main'
	translate.lua:193: in main chunk
	[C]: in function 'dofile'
	...rcel/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
	[C]: at 0x00405d50

Can you give the exact command line you are using?
Thanks

th translate.lua -model ./gereng/onmt_baseline_wmt15-all.de-en_epoch13_8.98_release.t7 -src ../test.tok -output ./out.out -gpuid -1

OK, I can reproduce this when there is an empty sentence in the input file; I opened an issue here. Can you double-check and filter your input file as a temporary workaround (see the sketch below)? Thanks for the report.
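For example, something along these lines should do as a stopgap (a sketch based on your command above; grep is just one way to drop the empty lines):

# strip empty lines from the source file
grep -v '^$' ../test.tok > ../test.noblank.tok

# translate the filtered file on the CPU as before
th translate.lua -model ./gereng/onmt_baseline_wmt15-all.de-en_epoch13_8.98_release.t7 -src ../test.noblank.tok -output ./out.out -gpuid -1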

OK, thank you very much.
You are right, there was a line break at the end of the input file that caused the error. I should have noticed that myself!
Without the line break everything is OK!

I’m a bit puzzled by this. I’m getting good translations then a failure before the end of the document with exactly the same message:
luajit: ./onmt/data/Batch.lua:10: bad argument #1 to 'size' (dimension 1 out of range of 0D tensor at /tmp/luarocks_torch-scm-1-8096/torch7/generic/Tensor.c:19)
stack traceback:
[C]: in function 'size'
./onmt/data/Batch.lua:10: in function 'getLength'
./onmt/data/Batch.lua:71: in function '__init'
There are no “empty sentences”.
I’m puzzled how the requirement of one sentence per line is reconciled with the need to avoid line breaks.
Terence

What version/revision of OpenNMT are you using? This issue should be fixed in recent versions of the project, which handle empty lines in the file to translate.
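If you installed from the git repository, updating your checkout should pick up the fix. A minimal sketch, assuming the clone directory is named OpenNMT:

cd OpenNMT
git log -1   # check which revision you are currently on
git pull     # update to the latest revision, which handles empty lines in the file to translate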