> **Alpha**
> The Lingo.dev Compiler is in alpha. It is unstable, not recommended for production use, and APIs may change between releases.
The Lingo.dev Compiler supports multiple translation providers, from the managed Lingo.dev localization engine to direct LLM provider connections and local models. You configure providers through the models option, which accepts either a single provider string or an object mapping locale pairs to specific providers.
## Lingo.dev Engine (recommended)
The Lingo.dev localization engine is the default provider. It routes translations through a managed pipeline with dynamic model selection, automatic fallbacks, glossary enforcement, and brand voice profiles.
```js
{
  models: "lingo.dev",
}
```

Authenticate via CLI:

```sh
npx lingo.dev@latest login
```

Or set the API key in `.env`:

```
LINGODOTDEV_API_KEY=your_key_here
```

**Why use the Lingo.dev engine**
The localization engine selects the optimal model per locale pair, applies your glossary and brand voice rules, and falls back to alternative models if a provider is unavailable. Direct LLM providers do not include these features.
## Direct LLM providers
Connect directly to any supported LLM provider by specifying a `provider:model` string:
| Provider | Model format | Environment variable | Example |
|---|---|---|---|
| OpenAI | openai:<model> | OPENAI_API_KEY | openai:gpt-4o |
| Anthropic | anthropic:<model> | ANTHROPIC_API_KEY | anthropic:claude-3-5-sonnet |
| Google | google:<model> | GOOGLE_API_KEY | google:gemini-2.0-flash |
| Groq | groq:<model> | GROQ_API_KEY | groq:llama-3.3-70b-versatile |
| Mistral | mistral:<model> | MISTRAL_API_KEY | mistral:mistral-large |
| OpenRouter | openrouter:<model> | OPENROUTER_API_KEY | openrouter:anthropic/claude-3.5-sonnet |
| Ollama | ollama:<model> | None (local) | ollama:llama3.2 |
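The strings in the table split into a provider part and a model part on the first colon only, since model names may themselves contain slashes (as in the OpenRouter example). A minimal sketch of that parsing, with a hypothetical helper name not taken from the Lingo.dev API:

```typescript
// Hypothetical helper illustrating the provider:model string format.
function parseModelString(spec: string): { provider: string; model: string } {
  // Split on the FIRST colon only: model identifiers such as
  // "anthropic/claude-3.5-sonnet" must stay intact.
  const idx = spec.indexOf(":");
  if (idx === -1) throw new Error(`Invalid model spec: ${spec}`);
  return { provider: spec.slice(0, idx), model: spec.slice(idx + 1) };
}

parseModelString("openai:gpt-4o");
// { provider: "openai", model: "gpt-4o" }
parseModelString("openrouter:anthropic/claude-3.5-sonnet");
// { provider: "openrouter", model: "anthropic/claude-3.5-sonnet" }
```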
## Single provider for all locales
Set a string to use one provider for every locale pair:
```js
{
  models: "openai:gpt-4o",
}
```

## Ollama (local models)
Ollama runs models locally with no API key required. Install Ollama, pull a model, and configure:
```js
{
  models: "ollama:llama3.2",
}
```

Local models are useful for offline development and for teams that cannot send content to external APIs. Translation quality varies by model size: larger models produce more accurate results.
## Locale-pair mapping
The `models` option accepts an object to route specific locale pairs to different providers. Keys use the format `source:target` with wildcard (`*`) support:
```js
{
  models: {
    "*:*": "lingo.dev", // Default for all pairs
    "*:ja": "anthropic:claude-3-5-sonnet", // Japanese via Anthropic
    "*:zh-Hans": "anthropic:claude-3-5-sonnet", // Simplified Chinese via Anthropic
    "en:de": "openai:gpt-4o", // English-to-German via OpenAI
  },
}
```

The compiler matches locale pairs from most specific to least specific:
1. **Exact match**: `en:de` matches only English-to-German translations.
2. **Target wildcard**: `*:ja` matches any source language translating to Japanese.
3. **Full wildcard**: `*:*` is the fallback for any pair without a more specific match.
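The precedence above can be sketched in a few lines. This is an illustrative resolver, not the compiler's actual implementation; `resolveProvider` and `ModelsMap` are hypothetical names, and only the three documented key forms are checked:

```typescript
// Illustrative sketch of most-specific-first locale-pair matching.
type ModelsMap = Record<string, string>;

function resolveProvider(
  models: ModelsMap,
  source: string,
  target: string,
): string | undefined {
  // Candidate keys, ordered from most specific to least specific.
  const candidates = [
    `${source}:${target}`, // exact match, e.g. "en:de"
    `*:${target}`,         // target wildcard, e.g. "*:ja"
    "*:*",                 // full fallback
  ];
  for (const key of candidates) {
    if (key in models) return models[key];
  }
  return undefined;
}

const models: ModelsMap = {
  "*:*": "lingo.dev",
  "*:ja": "anthropic:claude-3-5-sonnet",
  "en:de": "openai:gpt-4o",
};

resolveProvider(models, "en", "de"); // "openai:gpt-4o" (exact match wins)
resolveProvider(models, "fr", "ja"); // "anthropic:claude-3-5-sonnet" (target wildcard)
resolveProvider(models, "en", "fr"); // "lingo.dev" (full fallback)
```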
This mapping lets you optimize for cost and quality. For example, use a fast model for European languages and a model with stronger CJK support for East Asian locales.
## Custom prompts
The `prompt` option sets a system prompt for the translation LLM. Use `{SOURCE_LOCALE}` and `{TARGET_LOCALE}` as placeholders; the compiler replaces them with the actual locale codes at translation time:
```js
{
  prompt:
    "You are translating a SaaS application UI from {SOURCE_LOCALE} to {TARGET_LOCALE}. Keep translations concise. Preserve technical terms in English. Use formal register.",
}
```

Custom prompts apply to direct LLM providers only. When using the Lingo.dev localization engine, configure instructions and brand voice through the Lingo.dev dashboard instead.
