---
license: apache-2.0
language:
  - et
  - ru
  - en
base_model:
  - meta-llama/Llama-3.2-3B-Instruct
---

# Simultaneous Machine Translation between English and Estonian using Llama-3.2

## Introduction

This is a Llama-3.2 3B Instruct model fine-tuned on 500k sentence pairs sampled from NLLB and WikiMatrix, covering all directions between Estonian, Russian, and English, together with a few other randomly sampled languages.

The model translates well in a simultaneous manner, i.e. it can handle incomplete and streaming inputs. Try typing your sentence piecemeal: every few words, hit Enter and watch the model translate as you go.

## How to run it

First, install llama.cpp by following the instructions at https://github.com/ggerganov/llama.cpp?tab=readme-ov-file

Then pull the quantized GGUF file (`Llama-3.2-3B-Instruct-Q4_K_M.gguf`) and run one of the commands below.
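If you prefer to fetch the file programmatically rather than through the Hub's download button, here is a minimal sketch using the `huggingface_hub` library; the repo id is a placeholder (use this model repository's actual id), and the file name is the one used in the commands that follow.

```python
# Sketch only: assumes the quantized GGUF is hosted in this model repository.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="<user>/<this-model-repo>",            # placeholder, replace with the actual repo id
    filename="Llama-3.2-3B-Instruct-Q4_K_M.gguf",  # file referenced by the llama-cli commands below
)
print(model_path)  # pass this path to llama-cli via -m
```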

### Estonian to English direction

```bash
llama-cli -m Llama-3.2-3B-Instruct-Q4_K_M.gguf \
          -p "You are a professional Estonian-to-English simultaneous interpreter. Translate the following conversations into English." \
          -cnv \
          --chat-template llama3 \
          -c 4096 --temp 0.0
```
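Here `-m` points at the GGUF file, `-p` supplies the system prompt, `-cnv` starts an interactive conversation, `--chat-template llama3` applies the Llama 3 chat template, `-c 4096` sets the context window, and `--temp 0.0` makes decoding greedy so translations are deterministic.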

### English to Estonian direction

Change the system prompt above to "You are a professional English-to-Estonian simultaneous interpreter. Translate the following conversations into Estonian.", i.e.:

```bash
llama-cli -m Llama-3.2-3B-Instruct-Q4_K_M.gguf \
          -p "You are a professional English-to-Estonian simultaneous interpreter. Translate the following conversations into Estonian." \
          -cnv \
          --chat-template llama3 \
          -c 4096 --temp 0.0
```

Now you can type in the sentences that you want to translate.

- You may type in a few words, hit Enter, and repeat; the model should translate simultaneously (a programmatic sketch of this workflow follows below).
- You can also input a full sentence at once.
- It does not yet handle whole paragraphs well.
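If you want to drive the same piecemeal workflow from code instead of the interactive CLI, here is a minimal sketch using the `llama-cpp-python` bindings. It assumes the GGUF file downloaded above sits in the working directory, and the Estonian chunks are only an illustrative example sentence.

```python
# Minimal sketch of simultaneous (piecemeal) translation with llama-cpp-python.
# Assumes: pip install llama-cpp-python, and the GGUF file is in the current directory.
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-3B-Instruct-Q4_K_M.gguf",
    n_ctx=4096,               # same context size as the llama-cli examples
    chat_format="llama-3",    # mirrors --chat-template llama3
    verbose=False,
)

system = (
    "You are a professional Estonian-to-English simultaneous interpreter. "
    "Translate the following conversations into English."
)
messages = [{"role": "system", "content": system}]

# Feed the sentence a few words at a time, as if hitting Enter in llama-cli.
for chunk in ["Tere, minu nimi on", "Mari ja ma", "õpin Tartu Ülikoolis."]:
    messages.append({"role": "user", "content": chunk})
    reply = llm.create_chat_completion(messages=messages, temperature=0.0)
    text = reply["choices"][0]["message"]["content"]
    print(f"{chunk} -> {text}")
    # Keep the model's partial translation in the history so later chunks
    # are translated in context.
    messages.append({"role": "assistant", "content": text})
```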