Simple Web Interface

Dear Samuel,

So, using Streamlit caching by adding @st.cache(allow_output_mutation=True) before the load_models() function helps avoid reloading the models with every new request. This makes sense. Many thanks for sharing!

Kind regards,
Yasmin

Here is an example of what I was able to do with some additional customization:

This is really useful for translators to validate the model.

3 Likes

Great work, Samuel. I love the colour coding 🙂

If anyone needs it: it is quite hard to find the right colours so that the result is readable and pleasant to look at.

I used the colour library to generate the gradient:

from colour import Color
green = Color("#56ff33")
colors = list(green.range_to(Color("#ff6e6e"),10))
print(colors)

which gave me this list of colours:
colorList = ["#56ff33", "#84ff3a", "#aeff40", "#d7ff47", "#fcff4d", "#ffdf54", "#ffbf5a", "#ffa161", "#ff8667", "#ff6e6e"]

The code for the colour legend is:

# Colour legend
st.write('Colours Legend')
legend = '<div style="display: table;"><div style="display: table-row">'
for color in colorList:
    if color == colorList[0]:
        legendText = 'Machine is sure'
    elif color == colorList[-1]:
        legendText = 'Machine is not so sure'
    else:
        legendText = ' '
    legend += '<div style="background-color: ' + color + '; padding: 4px 3px; display: table-cell; width: min-content;">' + legendText + '</div>'
legend += '</div></div>'
st.markdown(legend, unsafe_allow_html=True)

For the implementation with the prediction score, you need to build a custom formula that generates a score from 0 to 9 and then wraps the sentence in a span with the corresponding index in the colour list.

Something like this (I am using a DataFrame, so just adjust it to your structure):

'<span style="background-color: ' + colorList[int(min(round(abs(x['PredictScore']),0), len(colorList)-1))] + '">' + x['Target'] + '</span>'

In this case I am using the prediction score directly, but I am going to change that to use the normalized prediction score option from CTranslate2. Otherwise, long sentences always come out red.
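To make the mapping concrete, here is a minimal sketch in plain Python (no DataFrame; colour_span is a hypothetical helper name, not part of the original app):

```python
# Gradient from green (confident) to red (not confident),
# as generated by the colour library snippet above.
colorList = ["#56ff33", "#84ff3a", "#aeff40", "#d7ff47", "#fcff4d",
             "#ffdf54", "#ffbf5a", "#ffa161", "#ff8667", "#ff6e6e"]

def colour_span(target, predict_score):
    # Clamp the rounded absolute score to a valid index (0..9):
    # low scores map to green (sure), high scores to red (not sure).
    index = int(min(round(abs(predict_score)), len(colorList) - 1))
    return ('<span style="background-color: ' + colorList[index] + '">'
            + target + '</span>')
```

A score of -0.2 would land on the first (green) colour, while anything beyond -9 is clamped to the last (red) one.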

1 Like

@ymoslem I was wondering if CTranslate2 currently supports LSTM models, and if not, how do you think I should go about integrating the LSTM weights into this Streamlit tool?

I was able to integrate the Transformer model into this Streamlit server by converting my Transformer "model.pt" file into "model.bin" using the CTranslate2 tool. Could you suggest how I could use the LSTM model to translate a single input line? I don't want to write to a file every time and translate using onmt_translate; is there an alternative to this? @guillaumekln ?

Dear Ahan,

I do not think so. If you try to use the CTranslate2 converter on an LSTM model, you will get this error:

- Options --encoder_type and --decoder_type must be 'transformer'

I am not sure, but I think the OpenNMT-py REST server supports all types of models. I used to use it with this web interface, or you can go even simpler with something like this example.

Kind regards,
Yasmin

2 Likes

Thank you, this worked! I hosted a server for getting results from LSTM models and am using CTranslate2 for the Transformer model. This was very helpful!

1 Like

Hi Yasmin, great work!

1 Like

Is there a way to make the server hot-reload when the config file is changed (i.e. when a new model is added)?

Hi James!

What do you mean by "config" file? If you mean the Python file, Streamlit supports hot reloading when you change and save the Python file. Advanced questions about Streamlit are better directed to their forum.

As I mentioned before, this tutorial is meant for building quick demos for research purposes. For production purposes, usually a REST API (with Flask or FastAPI) is created, and the task of loading models is (fully or partially) handled there.

Kind regards,
Yasmin

Hello James,

Which config file are you referring to?

Personally, I use Streamlit as the front end, and I have a Flask app in a Docker container with my models in a Docker volume.

My Python code relies on the folder and file naming conventions to determine which models are available.

example:

folder structure:

models/languageName/model.bin

Code:

  • loop over every folder in /models
  • if there is a model.bin file within the folder, consider that language available

When I add a new model, I don't need to rebuild anything. I just upload the model to the models folder, and Streamlit has access to it right away.

I’m referring to the json file which specifies the models and their settings usually found in the folder, available_models.
Currently, any new addition of a model requires me to edit the config file, kill the server and restart it. I’m wondering if there’s a way to hot load models when the new model is called from automatic reading the newly edited conf file.
(I might be in the wrong thread since I’m looking from an API perspective)

I’ll look into Streamlit (GitHub - ymoslem/CTranslate-NMT-Web-Interface: Machine Translation (MT) Web Interface for OpenNMT and FairSeq models using CTranslate and Streamlit)

Dear James,

I assume you are talking about the Simple OpenNMT-py REST server. This REST API uses Flask. In my experience, auto-reloading in Flask is not as straightforward as it is in FastAPI. Still, you can have a look at the answers in this discussion.

All the best,
Yasmin

Hello,

James seems to be trying to do exactly what I have already done.

I don’t need to reload my API when I upload new models.

Here is some information that could be helpful:

Best regards,
Samuel

Hi Samuel!

I assume you are using FastAPI, right? In FastAPI, one can just use the --reload flag.
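For reference, with a FastAPI app served by uvicorn, the development command would look something like this (the module name main is a placeholder):

```shell
# Restart the server automatically whenever a source file changes
# (intended for development, not production):
uvicorn main:app --reload
```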

Kind regards,
Yasmin

Hello Yasmin,

No, I made a pure Flask API in the end. I have a Flask API that serves my models, and another app with Streamlit that serves as the UI (user interface). The UI calls the translation API to get the translation, providing the source language, the target language, and the text to be translated. The translation API can also be called to provide the list of supported language pairs.
Best regards,
Samuel

1 Like

Hi @ymoslem, thanks for this tutorial; it's excellent. I do have one question. I was able to get the app working using my own trained model. Following the tutorial, I took the model .pt file and converted it to a CTranslate2 model using ct2-opennmt-py-converter, and it works fine.

My question: should one first run onmt_release_model on the .pt file before running ct2-opennmt-py-converter to remove the training-only parameters, or does the CT2 converter do that already?

Even better, you can convert directly to CT2 format with the onmt_release_model (check the -format and -quantization args).

Thanks @francoishernandez, for the reply. So when I use the following command:

onmt_release_model --model ms_35.pt -o test.pt --quantization int8 --format pytorch

It works with no errors, but when I change the output format to ctranslate2, it generates an error. I am wondering if I need to compile OpenNMT-py with an option flag?

Traceback (most recent call last):
  File "/Users/cryptik/.virtualenvs/opennmt-pv1/bin/onmt_release_model", line 8, in <module>
    sys.exit(main())
  File "/Users/cryptik/.virtualenvs/opennmt-pv1/lib/python3.8/site-packages/onmt/bin/release_model.py", line 59, in main
    converter.convert(opt.output, model_spec, force=True,
  File "/Users/cryptik/.virtualenvs/opennmt-pv1/lib/python3.8/site-packages/ctranslate2/converters/converter.py", line 53, in convert
    model_spec.validate()
  File "/Users/cryptik/.virtualenvs/opennmt-pv1/lib/python3.8/site-packages/ctranslate2/specs/model_spec.py", line 265, in validate
    if self._vmap is not None and not os.path.exists(self._vmap):
  File "/usr/local/opt/python@3.8/bin/../Frameworks/Python.framework/Versions/3.8/lib/python3.8/genericpath.py", line 19, in exists
    os.stat(path)
TypeError: stat: path should be string, bytes, os.PathLike or integer, not TransformerSpec

There may be a mismatch between your OpenNMT-py and CTranslate2 versions. Are you up to date on both?

Some significant changes were introduced here to allow CT2>=2.0.0 support.