Quickstart

Get up and running with Lingo.dev CLI

Introduction

Lingo.dev CLI is a free, open-source command-line tool for translating apps and content with AI. This quickstart guide explains how to get up and running with Lingo.dev CLI in a matter of minutes.

Step 1. Initialize a project

  1. Navigate into a project directory:

    cd <your-project-directory>
    
  2. Run the following command:

    npx lingo.dev@latest init
    
  3. Follow the prompts.

The CLI will create an i18n.json file for configuring the translation pipeline.

Step 2. Configure a bucket

In the i18n.json file, configure at least one bucket:

{
  "$schema": "https://lingo.dev/schema/i18n.json",
  "version": 1.8,
  "locale": {
    "source": "en",
    "targets": ["es"]
  },
  "buckets": {
    "json": {
      "include": ["locales/[locale].json"]
    }
  }
}

Buckets determine the parser that Lingo.dev CLI uses for extracting translatable content. For example, the "json" bucket is specifically designed for translating JSON files.

A bucket's configuration must define one or more include patterns, which specify what files should be translated. (You can optionally specify exclude patterns for additional control.)
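
For example, a bucket that translates every JSON file under locales/ but skips a machine-generated file might be configured roughly as follows. This is a sketch based on the description above (only the buckets section is shown, and the exclude field name and file path are illustrative), so check the bucket documentation for the exact syntax:

"buckets": {
  "json": {
    "include": ["locales/[locale].json"],
    "exclude": ["locales/generated.[locale].json"]
  }
}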

Be aware that:

  • Some buckets require a special [locale] placeholder in their include patterns, while others don't allow the placeholder at all.
  • Some buckets support additional features, such as key locking.

To learn about the exact requirements, refer to the documentation for each bucket, such as JSON or CSV.
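
With the example configuration above, the [locale] placeholder is replaced with each locale code: the CLI reads source content from locales/en.json and writes Spanish translations to locales/es.json. A hypothetical locales/en.json source file could be as simple as:

{
  "greeting": "Hello!",
  "farewell": "See you soon!"
}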

Step 3. Configure an LLM provider

After Lingo.dev CLI extracts content from a file, it sends that content to a large language model (LLM) for translation, then writes the translated content back to the file system.

We recommend using Lingo.dev Engine, our own hosted platform, as the LLM provider, but this isn't strictly required. The CLI supports a number of third-party providers, such as Anthropic and OpenAI.

Lingo.dev Engine

  1. Log in to Lingo.dev Engine.
  2. Navigate to the Projects page.
  3. Click API key > Copy.
  4. Set a LINGODOTDEV_API_KEY environment variable with the API key as the value.
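
For example, on macOS or Linux you can export the variable in the shell session where you'll run the CLI (the value below is a placeholder). The same approach works for the third-party keys listed in the next section:

export LINGODOTDEV_API_KEY="<your-api-key>"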

Third-party provider

  1. Get an API key from one of the supported providers.

  2. Set an environment variable with the API key as the value:

    • If you're using Anthropic, set ANTHROPIC_API_KEY
    • If you're using Google, set GOOGLE_API_KEY
    • If you're using Mistral, set MISTRAL_API_KEY
    • If you're using OpenAI, set OPENAI_API_KEY
    • If you're using OpenRouter, set OPENROUTER_API_KEY
  3. In the i18n.json file, add a provider object with the following properties:

    • id - The ID of the LLM provider (e.g., openai).
    • model - The ID of a specific model from that LLM provider (e.g., gpt-4o-mini).
    • prompt - The prompt to be sent with all LLM requests. Use {source} and {target} as placeholder values for the source and target locales. These will be replaced at runtime.

    For example:

    {
      "$schema": "https://lingo.dev/schema/i18n.json",
      "version": 1.8,
      "locale": {
        "source": "en",
        "targets": ["es"]
      },
      "buckets": {
        "json": {
          "include": ["locales/[locale].json"]
        }
      },
      "provider": {
        "id": "openai",
        "model": "gpt-4o-mini",
        "prompt": "Translate the provided text from {source} to {target}."
      }
    }
    

To learn more about the available options, see i18n.json.

Step 4. Generate translations

In the project directory, run the following command:

npx lingo.dev@latest run

The CLI will:

  1. Determine what files need to be translated, based on the i18n.json file.
  2. Extract the translatable content from the files.
  3. Send the content to the configured LLM provider for translation.
  4. Write the translated content back to the file system.
  5. Create an i18n.lock file for keeping track of translated content.
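
Continuing the hypothetical locales/en.json example from Step 2, the run would produce a locales/es.json file along these lines (the exact wording depends on the configured model):

{
  "greeting": "¡Hola!",
  "farewell": "¡Hasta pronto!"
}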

To learn more about the complete lifecycle, see How it works.

Next steps

See CLI commands for the complete list of commands, options, and flags available via the CLI.