Running OpenNMT-tf within Docker with nvidia-docker

tensorflow

(Terence Lewis) #1

Hi, having done some experiments with TensorFlow on CPU, I've installed tensorflow-gpu within Docker using nvidia-docker and the tests run nicely. Could somebody please point me to documentation or a HowTo for using OpenNMT-tf with this set-up?

Thanks
Terence


(Guillaume Klein) #2

Hi,

It depends on how you want to use it, really.

You could first build a custom image that includes OpenNMT-tf, then invoke docker run as if you were invoking the actual script:

FROM tensorflow/tensorflow:1.11.0-gpu
WORKDIR /root
RUN pip install OpenNMT-tf
ENTRYPOINT []

Then build the image and run onmt-main inside a container:

docker build -t opennmt/opennmt-tf -f Dockerfile .
nvidia-docker run -it --rm --entrypoint onmt-main opennmt/opennmt-tf -h

Of course you need to mount the directories you want to access within the Docker container.
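For example, a minimal sketch (the host path, config file, and model type are illustrative assumptions about your setup, not part of the original commands), mounting a local data directory with -v and starting a training run with the OpenNMT-tf 1.x command line:

nvidia-docker run -it --rm \
  -v $HOME/wmt_ende:/data \
  --entrypoint onmt-main \
  opennmt/opennmt-tf \
  train_and_eval --model_type Transformer --config /data/data.yml

Here -v maps the host directory $HOME/wmt_ende to /data inside the container, so paths in the configuration file should refer to /data.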


(Terence Lewis) #3

Thanks - that’s a good starting point!


(Terence Lewis) #4

Hi @guillaumekln, it just kept on building until it filled the disk. Can OpenNMT-tf really occupy 824.5 GB? And I don't understand why it's writing into an OpenNMT subdirectory that isn't referenced in the Dockerfile. Any idea what could be happening?
Regards,
Terence


(Guillaume Klein) #5

824.5 GB is the size of the build context passed to docker build, not the size of OpenNMT-tf. When you run docker build ., Docker sends the entire directory given as the PATH argument (see the command documentation) to the daemon, so if you run the build from a directory that contains your data, models, or an existing OpenNMT checkout, all of it gets copied into the context.

You should adapt the commands I shared above to your setup, for example by running the build from a directory that contains only the Dockerfile.
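A minimal sketch (the directory name is illustrative): keep the Dockerfile in an otherwise empty directory so that docker build only sends the Dockerfile itself as the context:

mkdir -p ~/opennmt-tf-docker
cp Dockerfile ~/opennmt-tf-docker/
cd ~/opennmt-tf-docker
docker build -t opennmt/opennmt-tf -f Dockerfile .

Alternatively, a .dockerignore file in the build directory can exclude large files from the context.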