Translation Providers

Max Prilutskiy · Updated 5 days ago · 3 min read

Alpha

The Lingo.dev Compiler is in alpha. It is unstable, not recommended for production use, and APIs may change between releases.

The Lingo.dev Compiler supports multiple translation providers, from the managed Lingo.dev localization engine to direct LLM provider connections and local models. You configure providers through the models option, which accepts either a single provider string or an object mapping locale pairs to specific providers.

Lingo.dev Engine (recommended)

The Lingo.dev localization engine is the default provider. It routes translations through a managed pipeline with dynamic model selection, automatic fallbacks, glossary enforcement, and brand voice profiles.

ts
{
  models: "lingo.dev",
}

Authenticate via CLI:

bash
npx lingo.dev@latest login

Or set the API key in .env:

bash
LINGODOTDEV_API_KEY=your_key_here

Why use the Lingo.dev engine

The localization engine selects the optimal model per locale pair, applies your glossary and brand voice rules, and falls back to alternative models if a provider is unavailable. Direct LLM providers do not include these features.

Direct LLM providers

Connect directly to any supported LLM provider by specifying a provider:model string:

| Provider | Model format | Environment variable | Example |
| --- | --- | --- | --- |
| OpenAI | openai:&lt;model&gt; | OPENAI_API_KEY | openai:gpt-4o |
| Anthropic | anthropic:&lt;model&gt; | ANTHROPIC_API_KEY | anthropic:claude-3-5-sonnet |
| Google | google:&lt;model&gt; | GOOGLE_API_KEY | google:gemini-2.0-flash |
| Groq | groq:&lt;model&gt; | GROQ_API_KEY | groq:llama-3.3-70b-versatile |
| Mistral | mistral:&lt;model&gt; | MISTRAL_API_KEY | mistral:mistral-large |
| OpenRouter | openrouter:&lt;model&gt; | OPENROUTER_API_KEY | openrouter:anthropic/claude-3.5-sonnet |
| Ollama | ollama:&lt;model&gt; | None (local) | ollama:llama3.2 |
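Each direct provider reads its API key from the environment variable listed above. For example, to connect OpenAI, add the key to .env (the value shown is a placeholder):

```bash
OPENAI_API_KEY=your_key_here
```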

Single provider for all locales

Set models to a string to use one provider for every locale pair:

ts
{
  models: "openai:gpt-4o",
}

Ollama (local models)

Ollama runs models locally with no API key required. Install Ollama, pull a model, and configure:

ts
{
  models: "ollama:llama3.2",
}

Local models are useful for offline development and for teams that cannot send content to external APIs. Translation quality varies with model size: larger models generally produce more accurate results.

Locale-pair mapping

The models option accepts an object to route specific locale pairs to different providers. Keys use the format source:target with wildcard (*) support:

ts
{
  models: {
    "*:*": "lingo.dev",                          // Default for all pairs
    "*:ja": "anthropic:claude-3-5-sonnet",       // Japanese via Anthropic
    "*:zh-Hans": "anthropic:claude-3-5-sonnet",  // Simplified Chinese via Anthropic
    "en:de": "openai:gpt-4o",                    // English-to-German via OpenAI
  },
}

The compiler matches locale pairs from most specific to least specific:

1. Exact match: en:de matches only English-to-German translations.
2. Target wildcard: *:ja matches any source language translating to Japanese.
3. Full wildcard: *:* is the fallback for any pair without a more specific match.

This mapping lets you optimize for cost and quality. For example, use a fast model for European languages and a model with stronger CJK support for East Asian locales.
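The specificity rules above can be sketched in a few lines. This is an illustrative sketch, not the compiler's actual implementation; resolveProvider is a hypothetical helper that shows how a models map could be resolved for a given locale pair:

```typescript
// Illustrative only: resolve a "source:target" pair against a models map,
// trying the most specific key first, then the target wildcard, then the
// full-wildcard fallback.
type ModelMap = Record<string, string>;

function resolveProvider(
  map: ModelMap,
  source: string,
  target: string,
): string | undefined {
  const candidates = [`${source}:${target}`, `*:${target}`, "*:*"];
  for (const key of candidates) {
    if (key in map) return map[key];
  }
  return undefined;
}

const models: ModelMap = {
  "*:*": "lingo.dev",
  "*:ja": "anthropic:claude-3-5-sonnet",
  "en:de": "openai:gpt-4o",
};

resolveProvider(models, "en", "de"); // exact match → "openai:gpt-4o"
resolveProvider(models, "fr", "ja"); // target wildcard → "anthropic:claude-3-5-sonnet"
resolveProvider(models, "en", "es"); // fallback → "lingo.dev"
```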

Custom prompts

The prompt option sets a system prompt for the translation LLM. Use {SOURCE_LOCALE} and {TARGET_LOCALE} as placeholders; the compiler replaces them with the actual locale codes at translation time:

ts
{
  prompt: "You are translating a SaaS application UI from {SOURCE_LOCALE} to {TARGET_LOCALE}. Keep translations concise. Preserve technical terms in English. Use formal register.",
}

Custom prompts apply to direct LLM providers only. When using the Lingo.dev localization engine, configure instructions and brand voice through the Lingo.dev dashboard instead.

Next Steps

  • Configuration Reference: all configuration options in one place
  • Build Modes: dev, CI, and production workflows
  • Best Practices: cost optimization and model selection tips
  • Lingo.dev Engines: configure a localization engine on Lingo.dev
