
Offline Text Translate (OTT) - Local Offline Translation

Local offline text translation supporting multiple languages, providing an API interface.

This project is a repackaging of the open-source LibreTranslate project. Its goal is to provide an easy-to-deploy translation API service that runs directly on a local machine, with no need for Docker. Pre-compiled Windows EXE packages are also provided, so no deployment is required at all: just double-click to run, which is convenient for beginners.

The first launch requires downloading models; subsequent runs can be done offline.

If you want to use the native LibreTranslate project or deploy it in Docker, please visit https://github.com/LibreTranslate/LibreTranslate.

Using the Pre-compiled Windows Package

  1. If you cannot access the address https://raw.githubusercontent.com, you must set the proxy address in set.ini using PROXY=.

    Alternatively, you can download the pre-packaged models from Baidu Netdisk. After extracting, copy the ".local" folder inside to overwrite the folder of the same name in the root directory of this software. Click to download from Baidu Netdisk.

  2. Click to download the Windows pre-compiled package. Extract it to an English directory without spaces, and double-click start.exe.

  3. On first start, it will automatically download the models. When the download completes, it will display the address and port of the running API service, which you can then use. (Alternatively, use the pre-packaged models from Baidu Netdisk as described in Step 1.)

  4. You can write your own program that calls this API service as a replacement for services such as Baidu Translate, or enter it into software that needs a translation function. For example, to use it in video translation and dubbing software, enter the server address and port (default is http://127.0.0.1:9911) in the software menu - Settings - OTT.
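As a sketch of calling the service from your own program, the snippet below assumes OTT exposes LibreTranslate's standard /translate endpoint (JSON body with q, source, target, and format fields, returning translatedText) at the default address above; the language codes and example text are illustrative:

```python
import json
import urllib.request

OTT_URL = "http://127.0.0.1:9911"  # default address and port shown by start.exe

def build_payload(text, source="en", target="zh"):
    """Build the JSON body for LibreTranslate's /translate endpoint."""
    return {"q": text, "source": source, "target": target, "format": "text"}

def translate(text, source="en", target="zh"):
    """Send a translation request to the local OTT service and return the translated text."""
    req = urllib.request.Request(
        f"{OTT_URL}/translate",
        data=json.dumps(build_payload(text, source, target)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))["translatedText"]

# Example (with the service running):
#   print(translate("Hello, world!", source="en", target="zh"))
```

Only the standard library is used here, so the same pattern works from any environment without extra dependencies.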

Source Code Deployment on Windows

  1. First, download and install Python 3.9+ from python.org (3.10 is recommended). During installation, be sure to check the "Add ... Path" checkbox for convenient use later.

  2. Install a Git client on Windows. Click to download, select "64-bit Git for Windows Setup", download, and double-click to install, clicking "Next" until finished.

  3. Create an empty directory, for example a directory named "ott" on the D drive: D:/ott. Enter this directory, type cmd in the folder address bar, and press Enter. In the cmd window that opens, enter git clone https://github.com/jianchang512/ott . (note the trailing dot, which clones into the current directory) and press Enter to execute.

  4. Create a virtual environment. In the just-opened cmd window, continue to enter the command python -m venv venv and press Enter.

Note: If you get the message "'python' is not recognized as an internal or external command, operable program or batch file", it means you did not check the checkbox in Step 1 during installation. Double-click the downloaded Python installation package again, select "Modify", and be sure to check "Add ... Path".

After reinstalling Python, **you must close the opened cmd window**, otherwise it may still prompt that the command is not found. Then enter `D:/ott`, type `cmd` in the address bar, press Enter, and re-execute `python -m venv venv`.
  5. After the above command succeeds, continue by entering .\venv\scripts\activate and pressing Enter, then execute pip install -r requirements.txt --no-deps. If you get a message like "not found version xxx", switch the pip mirror to the official PyPI source or the Alibaba Cloud mirror.

  6. If you need to enable CUDA acceleration for translation, execute the following commands in order:

    pip uninstall -y torch

    pip install torch==2.1.2 --index-url https://download.pytorch.org/whl/cu121

  7. Set the proxy in set.ini with PROXY=proxy address. For example, if your proxy address is http://127.0.0.1:10189, fill it in as PROXY=http://127.0.0.1:10189.

  8. Execute the command to start the service: python start.py.
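For reference, a minimal set.ini with the PROXY entry described above might look as follows (the exact file layout is assumed; the proxy address is the illustrative value from the step above):

```ini
; set.ini — proxy used when downloading models from raw.githubusercontent.com
PROXY=http://127.0.0.1:10189
```

Leave PROXY empty or remove the line if you can reach the model download address directly.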