ModuleNotFoundError: No module named 'onmt.IO'

When running train.py, ModuleNotFoundError: No module named 'onmt.IO' occurs. How do I deal with it?

The traceback is as follows:
Traceback (most recent call last):
File "D:/PyCharm/WorkSpace/OpenNMT-py-master/train.py", line 502, in <module>
main()
File "D:/PyCharm/WorkSpace/OpenNMT-py-master/train.py", line 476, in main
first_dataset = next(lazily_load_dataset("train"))
File "D:/PyCharm/WorkSpace/OpenNMT-py-master/train.py", line 332, in lazily_load_dataset
yield lazy_dataset_loader(pt, corpus_type)
File "D:/PyCharm/WorkSpace/OpenNMT-py-master/train.py", line 319, in lazy_dataset_loader
dataset = torch.load(pt_file)
File "C:\Users\LWK\Anaconda3\lib\site-packages\torch\serialization.py", line 303, in load
return _load(f, map_location, pickle_module)
File "C:\Users\LWK\Anaconda3\lib\site-packages\torch\serialization.py", line 469, in _load
result = unpickler.load()
ModuleNotFoundError: No module named 'onmt.IO'


I also get this error when running on Google Colaboratory:

Traceback (most recent call last):
File "/content/OpenNMT-py/train.py", line 143, in <module>
main(opt)
File "/content/OpenNMT-py/train.py", line 97, in main
first_dataset = next(lazily_load_dataset("train", opt))
File "/content/OpenNMT-py/onmt/inputters/inputter.py", line 534, in lazily_load_dataset
yield _lazy_dataset_loader(pt, corpus_type)
File "/content/OpenNMT-py/onmt/inputters/inputter.py", line 525, in _lazy_dataset_loader
dataset = torch.load(pt_file)
File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 303, in load
return _load(f, map_location, pickle_module)
File "/usr/local/lib/python3.6/dist-packages/torch/serialization.py", line 469, in _load
result = unpickler.load()
ModuleNotFoundError: No module named 'onmt.io'

Using PyTorch 0.4.0 and torchtext 0.3.0.

According to the GitHub project:

onmt.IO was refactored into an onmt.io package recently.
This indicates that your preprocessed dataset is too old.
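For background: torch.load unpickles each object by its fully qualified class path, so a dataset saved when the classes lived in onmt.IO cannot be loaded once that module no longer exists. A minimal workaround sketch, assuming the classes were only moved under a new package name and kept their names (which may not hold across larger refactors; the dataset path below is purely illustrative), is to alias the old module name before loading:

import sys
import torch
import onmt.inputters

# Point the removed module names at their successor so the unpickler
# can resolve classes that were pickled under the old paths.
sys.modules['onmt.IO'] = onmt.inputters
sys.modules['onmt.io'] = onmt.inputters

dataset = torch.load('data/demo.train.1.pt')  # illustrative dataset file

If the aliased classes have also changed internally, loading will still fail, which is why regenerating the data is more reliable.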

I solved this problem by re-preprocessing my dataset, i.e. running preprocess.py on the raw data again.
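For example, with illustrative file paths (these are the standard preprocess.py flags):

python preprocess.py -train_src data/src-train.txt -train_tgt data/tgt-train.txt -valid_src data/src-val.txt -valid_tgt data/tgt-val.txt -save_data data/demo

Then point train.py at the freshly generated .pt files, e.g. with -data data/demo.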

Hope this helps.


Yes, re-preprocessing solved the issue, thanks.