OpenRouter Integration

Use Noirdoc with OpenRouter for multi-model access.

Overview

OpenRouter provides a unified API to access models from multiple providers — OpenAI, Anthropic, Meta, Google, Mistral, and others — through a single OpenAI-compatible endpoint. Noirdoc integrates with OpenRouter the same way it integrates with OpenAI: change the base URL and API key, and PII is pseudonymized automatically.

Setup

Configure your OpenRouter provider in the Noirdoc portal by providing your OpenRouter API key. Then use the Noirdoc proxy URL and your px- proxy key in your application.

Since OpenRouter follows the OpenAI API format, the base URL includes the /v1 suffix:

base_url = "https://api.noirdoc.de/v1"

Python

from openai import OpenAI

client = OpenAI(
    base_url="https://api.noirdoc.de/v1",
    api_key="px-your-noirdoc-key",
)

response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-6",
    messages=[
        {
            "role": "user",
            "content": "Summarize the contract for Max Mustermann, born 15.03.1985.",
        }
    ],
)

print(response.choices[0].message.content)

Use the full OpenRouter model identifier in the model parameter, for example anthropic/claude-sonnet-4-6, openai/gpt-5.4-mini, google/gemini-2.5-pro, or meta-llama/llama-4-maverick.
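If you route requests to several models, it can help to validate identifiers before sending them. The helper below is purely illustrative (it is not part of the Noirdoc or OpenRouter SDKs); it just sanity-checks the provider/model format:

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an OpenRouter model identifier into (provider, model).

    OpenRouter identifiers use a "provider/model" format, e.g.
    "anthropic/claude-sonnet-4-6". Raises ValueError for anything else.
    """
    provider, sep, model = model_id.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model', got {model_id!r}")
    return provider, model

print(split_model_id("anthropic/claude-sonnet-4-6"))
# ('anthropic', 'claude-sonnet-4-6')
```

Passing the validated identifier straight through to the model parameter keeps the rest of the request identical to a plain OpenAI call.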

Node.js

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.noirdoc.de/v1",
  apiKey: "px-your-noirdoc-key",
});

const response = await client.chat.completions.create({
  model: "openai/gpt-5.4-mini",
  messages: [
    {
      role: "user",
      content: "Draft an email to julia.weber@example.com about the quarterly report.",
    },
  ],
});

console.log(response.choices[0].message.content);

Auto-routing

OpenRouter supports automatic model selection with openrouter/auto, which picks the best model for your request:

response = client.chat.completions.create(
    model="openrouter/auto",
    messages=[
        {
            "role": "user",
            "content": "Explain the GDPR implications for processing health records of patient Anna Schmidt.",
        }
    ],
)

PII pseudonymization works regardless of which model OpenRouter selects.

Streaming

Streaming works exactly like the standard OpenAI integration. See the Streaming page for SDK examples and SSE details.
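As a minimal Python sketch, assuming the same Noirdoc base URL and px- key as above (the collect_deltas and stream_summary helpers are illustrative, not part of any SDK):

```python
def collect_deltas(deltas):
    """Join streamed content deltas into the full message text.

    Streamed chunks carry content in choices[0].delta.content,
    which may be None for role-only or final chunks.
    """
    return "".join(d for d in deltas if d is not None)


def stream_summary(model: str, prompt: str,
                   base_url: str = "https://api.noirdoc.de/v1",
                   api_key: str = "px-your-noirdoc-key") -> str:
    """Stream a chat completion through the Noirdoc proxy and return the text."""
    # Deferred import so collect_deltas stays dependency-free.
    from openai import OpenAI

    client = OpenAI(base_url=base_url, api_key=api_key)
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    return collect_deltas(chunk.choices[0].delta.content for chunk in stream)


# Usage (requires a valid px- proxy key):
# print(stream_summary("openai/gpt-5.4-mini", "Summarize the quarterly report."))
```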

Available models

Any model available on OpenRouter works through Noirdoc. Popular options include:

  • openai/gpt-5.4-mini and openai/gpt-5.4
  • anthropic/claude-sonnet-4-6 and anthropic/claude-opus-4-6
  • google/gemini-2.5-pro and google/gemini-2.5-flash
  • meta-llama/llama-4-maverick
  • mistralai/mistral-large-2512
  • openrouter/auto for automatic model selection

Check the OpenRouter documentation for the full list of supported models and pricing.

How it works

Noirdoc treats OpenRouter requests the same as OpenAI requests because OpenRouter uses the OpenAI-compatible API format. The Authorization: Bearer px-... header identifies the request; Noirdoc resolves your OpenRouter provider key from the portal, pseudonymizes PII in the prompt, and forwards the request to OpenRouter. The response is reidentified and returned to your application.
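The pseudonymize/reidentify round trip can be sketched conceptually. The toy pair below is purely illustrative of the idea, not Noirdoc's actual implementation; real PII detection covers names, dates of birth, and more, while this sketch handles only email addresses:

```python
import re


def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    """Replace email addresses with placeholder tokens (toy example)."""
    mapping: dict[str, str] = {}

    def repl(match: re.Match) -> str:
        token = f"<EMAIL_{len(mapping) + 1}>"
        mapping[token] = match.group(0)
        return token

    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", repl, text)
    return redacted, mapping


def reidentify(text: str, mapping: dict[str, str]) -> str:
    """Restore the original values in the model's response."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text


redacted, mapping = pseudonymize("Email julia.weber@example.com about Q3.")
# The upstream provider only ever sees the redacted form;
# the mapping stays with the proxy and is applied to the response.
restored = reidentify(redacted, mapping)
```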