ONNX - a standard for neural network representation

There has been discussion for a long time about a standard representation for neural networks, and ONNX seems to be gaining ground quite fast:

It is actively promoted by Facebook (as a key feature of PyTorch 0.4), AWS (through MXNet), and Microsoft. It will, for instance, allow importing a neural model into Caffe2 or converting it to Core ML, the machine learning framework on iOS devices. There is a huge ecosystem around it, with many large and smaller players, including compilers and backends that take an ONNX model and provide a simple, optimized inference environment. TensorFlow does not openly support the project, but there are conversion tools between TensorFlow models and ONNX (https://github.com/onnx/onnx-tensorflow).
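As an illustration, here is a minimal sketch of what the onnx-tensorflow bridge linked above looks like, assuming the onnx and onnx_tf packages are installed and with "model.onnx" and the input shape as placeholders:

```python
import numpy as np
import onnx
from onnx_tf.backend import prepare  # backend from the onnx-tensorflow repo

# Load and validate a serialized ONNX graph ("model.onnx" is a placeholder path).
model = onnx.load("model.onnx")
onnx.checker.check_model(model)

# Build a TensorFlow representation of the graph and run it on a dummy input.
tf_rep = prepare(model)
dummy_input = np.random.randn(1, 3, 224, 224).astype(np.float32)  # shape depends on the model
outputs = tf_rep.run(dummy_input)
print(outputs)

# The graph can also be exported as a regular TensorFlow protobuf.
tf_rep.export_graph("model.pb")
```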

There is a catch, though: ONNX is (for the moment) used to represent the architecture of the neural network with a simplified set of “operators”, but it does not cover all the logic necessary for a full translation pipeline: preprocessing, recurrent connections between the different components of a neural network, beam search, etc. So it is not completely magic, and we still need a bit of engineering and code around it.
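To make that split concrete, here is a hedged sketch (a hypothetical toy model, not OpenNMT code) of what the export side covers in PyTorch: only the tensor-to-tensor graph of operators is serialized, while tokenization, beam search, and the rest of the pipeline stay in the surrounding code.

```python
import torch
import torch.nn as nn

# Hypothetical toy encoder standing in for a real translation model.
class ToyEncoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden)

    def forward(self, tokens):
        outputs, _ = self.rnn(self.embed(tokens))
        return outputs

encoder = ToyEncoder()
dummy_tokens = torch.randint(0, 1000, (10, 1))  # (seq_len, batch) of token ids

# Only this graph of operators ends up in the .onnx file; tokenization and
# beam search have to be reimplemented around it at inference time.
torch.onnx.export(encoder, dummy_tokens, "toy_encoder.onnx")
```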

In the current framework competition, ONNX seems to be heading in a good direction. Ideally, we can dream about using the format to transfer a pre-trained model from OpenNMT (Lua) to OpenNMT-py, or to export a model and compile it to run on a totally different infrastructure.

I am interested in digging more into that, especially for OpenNMT, and I have created a converter from Lua to ONNX if you are interested:

Jean
