OpenAI Integration

Use Noirdoc with the OpenAI Python and Node.js SDKs.

Overview

Integrating Noirdoc with OpenAI takes two lines of configuration. Point the SDK at the Noirdoc proxy and swap your API key for a Noirdoc proxy key. Every request is automatically scanned for personal data, pseudonymized before it reaches OpenAI, and restored in the response.

Python — Chat Completions

Install the OpenAI SDK if you have not already:

pip install openai

Replace base_url and api_key in your existing client:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.noirdoc.de/v1",
    api_key="px-your-noirdoc-key",
)

response = client.chat.completions.create(
    model="gpt-5.4-mini",
    messages=[
        {
            "role": "user",
            "content": "Draft a summary for Max Mustermann, born 15.03.1985, email max@example.com.",
        }
    ],
)

print(response.choices[0].message.content)

The proxy replaces Max Mustermann with a pseudonym like <<PERSON_1>> and max@example.com with <<EMAIL_1>> before the prompt reaches OpenAI. The response is returned with the original values seamlessly restored.
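Conceptually, the substitution works like this. The sketch below is a simplified local illustration of the placeholder scheme (e-mail addresses only), not Noirdoc's actual detection pipeline:

```python
import re

# Simplified illustration only — not Noirdoc's real implementation.
def pseudonymize(text: str) -> tuple[str, dict[str, str]]:
    """Replace e-mail addresses with numbered placeholders and keep
    a mapping so the model's answer can be restored afterwards."""
    mapping: dict[str, str] = {}

    def repl(match: re.Match) -> str:
        placeholder = f"<<EMAIL_{len(mapping) + 1}>>"
        mapping[placeholder] = match.group(0)
        return placeholder

    masked = re.sub(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+", repl, text)
    return masked, mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Swap the placeholders back to the original values."""
    for placeholder, original in mapping.items():
        text = text.replace(placeholder, original)
    return text

masked, mapping = pseudonymize("Write to max@example.com today.")
print(masked)                    # Write to <<EMAIL_1>> today.
print(restore(masked, mapping))  # Write to max@example.com today.
```

The real proxy applies the same idea to every entity class it detects (names, dates of birth, IBANs, and so on), which is why the placeholders you may see in logs carry a type and an index.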

Python — Responses API

The newer Responses API works exactly the same way:

response = client.responses.create(
    model="gpt-5.4-mini",
    input="Summarize the contract for Anna Schmidt, IBAN DE89370400440532013000.",
)

print(response.output_text)

Noirdoc automatically detects the endpoint and applies the correct pseudonymization pipeline. Session continuity is preserved across chained responses via previous_response_id.
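To chain a follow-up request, pass the id of the previous response. A sketch — the helper name and follow-up prompt are illustrative, but previous_response_id is the standard Responses API parameter:

```python
def continue_session(client, previous_id: str) -> str:
    """Send a follow-up request that chains onto an earlier response.
    Noirdoc reuses the same pseudonym mapping for the whole chain."""
    follow_up = client.responses.create(
        model="gpt-5.4-mini",
        previous_response_id=previous_id,
        input="List the key deadlines from that contract.",
    )
    return follow_up.output_text
```

Because the mapping is tied to the chain, a placeholder such as <<PERSON_1>> refers to the same person in every request of the session.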

Node.js

Install the SDK if you have not already:

npm install openai

Then configure the client with the Noirdoc base URL and proxy key, just as in Python:

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.noirdoc.de/v1",
  apiKey: "px-your-noirdoc-key",
});

const response = await client.chat.completions.create({
  model: "gpt-5.4-mini",
  messages: [
    {
      role: "user",
      content: "Send a reminder to julia.weber@example.com about her appointment on 2025-04-10.",
    },
  ],
});

console.log(response.choices[0].message.content);

Streaming

Streaming is fully supported. The proxy processes the response stream in real time, replacing pseudonyms with the original values as tokens arrive. See the Streaming page for SDK examples and SSE details.
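A minimal Python sketch using the client configured above (the prompt is illustrative):

```python
def stream_reply(client) -> str:
    """Collect a streamed chat completion chunk by chunk.
    Noirdoc restores original values in each chunk before it reaches you."""
    parts: list[str] = []
    stream = client.chat.completions.create(
        model="gpt-5.4-mini",
        messages=[{"role": "user", "content": "Greet Max Mustermann."}],
        stream=True,
    )
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk may carry no content
            parts.append(delta)
    return "".join(parts)
```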

Supported models

Any model available through the OpenAI API works through Noirdoc, including gpt-5.4-mini, gpt-5.4, gpt-5.4-nano, o3, and future releases. Noirdoc does not inspect or filter the model parameter — it simply forwards the pseudonymized request to OpenAI.

Your provider API key is stored securely in the Noirdoc portal and never exposed in your application code.