50+ Integrations

Just change the base URL and use PrivacyProxy with every tool that supports OpenAI: Python, JavaScript, CLI, no-code platforms, AI agents, chat UIs, and IDE extensions.

# Change just 1 line!
client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
)

Quick Start in 3 Steps

1. Change the base URL: api.openai.com → api.privacyproxy.dev
2. Use your PrivacyProxy key: api_key="sk-your-privacyproxy-key"
3. Add the model prefix: model="u_ID/openai/gpt-4o"

Python Libraries

OpenAI Python SDK (recommended)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
)

response = client.chat.completions.create(
    model="u_123/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}]
)
LangChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)

response = llm.invoke("What is GDPR?")
LlamaIndex
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    api_base="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)
Instructor (Structured Output)
import instructor
from openai import OpenAI

client = instructor.from_openai(OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
))
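
A typical structured-output call with this client might then look like the following sketch; the UserInfo model and its fields are illustrative, not part of PrivacyProxy.

from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int

# instructor patches create() to accept response_model and return a parsed object
user = client.chat.completions.create(
    model="u_123/openai/gpt-4o-mini",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}]
)
print(user.name, user.age)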

Also supported: LiteLLM, Haystack, DSPy, Guidance, Marvin, Outlines
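
For example, LiteLLM can be pointed at PrivacyProxy through its OpenAI-compatible provider; a minimal sketch (model name and key are placeholders):

from litellm import completion

# The "openai/" prefix tells LiteLLM to use its OpenAI-compatible provider;
# the rest is the PrivacyProxy model name.
response = completion(
    model="openai/u_123/openai/gpt-4o-mini",
    api_base="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    messages=[{"role": "user", "content": "Hello!"}]
)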

JavaScript / TypeScript

OpenAI Node.js SDK
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.privacyproxy.dev/v1',
  apiKey: 'sk-your-privacyproxy-key',
});

const response = await client.chat.completions.create({
  model: 'u_123/openai/gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }],
});
Vercel AI SDK
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const openai = createOpenAI({
  baseURL: 'https://api.privacyproxy.dev/v1',
  apiKey: 'sk-your-privacyproxy-key',
});

const { text } = await generateText({
  model: openai('u_123/openai/gpt-4o'),
  prompt: 'What is TypeScript?',
});

CLI Tools

cURL
curl https://api.privacyproxy.dev/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "u_123/openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hi"}]
  }'
Environment Variables
# .env or shell
export OPENAI_API_KEY="sk-your-privacyproxy-key"
export OPENAI_BASE_URL="https://api.privacyproxy.dev/v1"

# Then works with:
# - OpenAI CLI
# - aider
# - sgpt
# - llm (Simon Willison)
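
With these variables set, the OpenAI Python SDK (and most tools built on it) needs no explicit configuration, since it reads OPENAI_API_KEY and OPENAI_BASE_URL from the environment:

from openai import OpenAI

# Picks up OPENAI_API_KEY and OPENAI_BASE_URL from the environment
client = OpenAI()

response = client.chat.completions.create(
    model="u_123/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hi"}]
)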

No-Code / Low-Code

n8n: OpenAI Node → Custom Base URL
Flowise: Credentials → Base Path
Dify: Model Provider → OpenAI-compatible
Langflow: Custom OpenAI Component
Make (Integromat): HTTP Module (see the request sketch below)
Zapier: Webhooks by Zapier (see the request sketch below)
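
Under the hood, the HTTP and webhook modules only need to send a standard chat-completions request; a sketch of the equivalent call in Python (all values are placeholders):

import requests

# The same request an n8n/Make/Zapier HTTP module would send
response = requests.post(
    "https://api.privacyproxy.dev/v1/chat/completions",
    headers={
        "Authorization": "Bearer sk-your-privacyproxy-key",
        "Content-Type": "application/json",
    },
    json={
        "model": "u_123/openai/gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hi"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])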

AI Agents & Frameworks

CrewAI
from crewai import Agent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)

agent = Agent(
    role='Researcher',
    goal='Research a topic',          # goal and backstory are required fields
    backstory='An experienced researcher',
    llm=llm
)
AutoGen (Microsoft)
import autogen

config_list = [{
    "model": "u_123/openai/gpt-4o",
    "base_url": "https://api.privacyproxy.dev/v1",
    "api_key": "sk-your-privacyproxy-key"
}]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list}
)

Also supported: AutoGPT, BabyAGI, Semantic Kernel, Pydantic AI, Smolagents

Chat Interfaces

Open WebUI: Settings → Connections → OpenAI Base URL
LibreChat: librechat.yaml → endpoints → custom
LobeChat: Settings → API Proxy Address
ChatBox: Settings → API Host

IDE Extensions

Continue.dev (VS Code / JetBrains)
// ~/.continue/config.json
{
  "models": [{
    "title": "PrivacyProxy GPT-4o",
    "provider": "openai",
    "model": "u_123/openai/gpt-4o",
    "apiBase": "https://api.privacyproxy.dev/v1",
    "apiKey": "sk-your-privacyproxy-key"
  }]
}
Cursor

Settings → Models → OpenAI API Key:

  • Override Base URL: https://api.privacyproxy.dev/v1
  • API Key: sk-your-privacyproxy-key
  • Model: u_123/openai/gpt-4o
Aider
export OPENAI_API_KEY="sk-your-privacyproxy-key"
export OPENAI_API_BASE="https://api.privacyproxy.dev/v1"

aider --model u_123/openai/gpt-4o
Cody (Sourcegraph)
// settings.json
{
  "cody.autocomplete.advanced.provider": "openai",
  "cody.autocomplete.advanced.serverEndpoint":
    "https://api.privacyproxy.dev/v1",
  "cody.autocomplete.advanced.accessToken":
    "sk-your-privacyproxy-key"
}

Other Languages

PHP (Laravel)
$client = OpenAI::factory()
    ->withApiKey('sk-...')
    ->withBaseUri('https://api.privacyproxy.dev/v1')
    ->make();
Ruby
client = OpenAI::Client.new(
  access_token: "sk-...",
  uri_base: "https://api.privacyproxy.dev/v1"
)
Go
config := openai.DefaultConfig("sk-...")
config.BaseURL = "https://api.privacyproxy.dev/v1"
client := openai.NewClientWithConfig(config)
C# / .NET
var client = new OpenAIClient(
    new ApiKeyCredential("sk-..."),
    new OpenAIClientOptions { Endpoint = new Uri("https://api.privacyproxy.dev/v1") }
);
Java (Spring AI)
spring:
  ai:
    openai:
      api-key: sk-...
      base-url: https://api.privacyproxy.dev
Rust
let config = OpenAIConfig::new()
    .with_api_key("sk-...")
    .with_api_base("https://api.privacyproxy.dev/v1");
let client = Client::with_config(config);

Model Name Format

{user_id}/{provider}/{model}
Provider     Example
OpenAI       u_123/openai/gpt-4o-mini
Anthropic    u_123/anthropic/claude-3-5-sonnet-20241022
Google       u_123/gemini/gemini-1.5-pro
Groq         u_123/groq/llama-3.1-70b-versatile
Mistral      u_123/mistral/mistral-large-latest

New models from providers work automatically thanks to wildcard routing!
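
Because the provider is part of the model name, switching providers is just a string change on the same client. A sketch, assuming both providers are configured for your account:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
)

# Same client, different providers: only the model string changes
for model in ["u_123/openai/gpt-4o-mini", "u_123/anthropic/claude-3-5-sonnet-20241022"]:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(model, "->", response.choices[0].message.content[:40])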