50+ Integrations

Simply change the base URL and use PrivacyProxy with every OpenAI-compatible tool: Python, JavaScript, CLI, no-code platforms, AI agents, chat UIs, and IDE extensions.

# Change just one line!
client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
)

Getting Started in 3 Steps

1. Change the base URL
   api.openai.com → api.privacyproxy.dev

2. Use your PrivacyProxy key
   api_key="sk-your-privacyproxy-key"

3. Use the model prefix
   model="u_ID/openai/gpt-4o"

Python Libraries

OpenAI Python SDK (Recommended)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
)

response = client.chat.completions.create(
    model="u_123/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}]
)
LangChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)

response = llm.invoke("What is GDPR?")
LlamaIndex
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    api_base="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)
Instructor (Structured Output)
import instructor
from openai import OpenAI

client = instructor.from_openai(OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key"
))
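
The Instructor snippet above stops at creating the patched client. A minimal usage sketch follows, assuming a hypothetical UserInfo Pydantic model, to show how response_model returns validated structured output through the proxy:

from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int

# Instructor validates and parses the completion into the Pydantic model.
user = client.chat.completions.create(
    model="u_123/openai/gpt-4o-mini",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
print(user.name, user.age)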

Others: LiteLLM, Haystack, DSPy, Guidance, Marvin, Outlines
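
As one example from that list, here is a minimal LiteLLM sketch. It assumes LiteLLM's convention of picking the provider from the prefix of the model string, so the PrivacyProxy model ID is itself prefixed with openai/; check your LiteLLM version's docs for the exact format.

from litellm import completion

response = completion(
    # "openai/" tells LiteLLM to use the OpenAI-compatible route (assumed convention);
    # the remainder is the PrivacyProxy model ID.
    model="openai/u_123/openai/gpt-4o-mini",
    api_base="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)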

JavaScript / TypeScript

OpenAI Node.js SDK
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.privacyproxy.dev/v1',
  apiKey: 'sk-your-privacyproxy-key',
});

const response = await client.chat.completions.create({
  model: 'u_123/openai/gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello!' }],
});
Vercel AI SDK
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

const openai = createOpenAI({
  baseURL: 'https://api.privacyproxy.dev/v1',
  apiKey: 'sk-your-privacyproxy-key',
});

const { text } = await generateText({
  model: openai('u_123/openai/gpt-4o'),
  prompt: 'What is TypeScript?',
});

CLI Tools

cURL
curl https://api.privacyproxy.dev/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "u_123/openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hi"}]
  }'
Environment Variables
# .env or shell
export OPENAI_API_KEY="sk-your-privacyproxy-key"
export OPENAI_BASE_URL="https://api.privacyproxy.dev/v1"

# Then works with:
# - OpenAI CLI
# - aider
# - sgpt
# - llm (Simon Willison)
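
Once these variables are exported, the official OpenAI Python SDK picks them up automatically, so code that creates a client with no arguments already routes through PrivacyProxy. A quick check:

from openai import OpenAI

# No arguments: the SDK reads OPENAI_API_KEY and OPENAI_BASE_URL from the environment.
client = OpenAI()
print(client.base_url)  # https://api.privacyproxy.dev/v1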

No-Code / Low-Code

• n8n: OpenAI node → custom base URL
• Flowise: Credentials → Base Path
• Dify: Model Provider → OpenAI-compatible
• Langflow: custom OpenAI component
• Make (Integromat): HTTP module (see the request sketch after this list)
• Zapier: Webhooks by Zapier
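
For the generic HTTP modules (Make, Zapier, or any other webhook tool), the request to reproduce is the same as the cURL call shown earlier. Here it is as a Python requests sketch for reference, assuming the requests package is available:

import requests

# Same payload and headers an HTTP module or webhook step would send.
resp = requests.post(
    "https://api.privacyproxy.dev/v1/chat/completions",
    headers={
        "Authorization": "Bearer sk-your-privacyproxy-key",
        "Content-Type": "application/json",
    },
    json={
        "model": "u_123/openai/gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hi"}],
    },
)
print(resp.json()["choices"][0]["message"]["content"])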

AI Agents & Frameworks

CrewAI
from crewai import Agent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
    model="u_123/openai/gpt-4o"
)

agent = Agent(role='Researcher', llm=llm)
AutoGen (Microsoft)
import autogen

config_list = [{
    "model": "u_123/openai/gpt-4o",
    "base_url": "https://api.privacyproxy.dev/v1",
    "api_key": "sk-your-privacyproxy-key"
}]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list}
)

Others: AutoGPT, BabyAGI, Semantic Kernel, Pydantic AI, Smolagents
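
As one example from that list, here is a hedged Smolagents sketch. It assumes Smolagents' OpenAIServerModel wrapper, which accepts a custom api_base; verify the exact class name and signature against the current Smolagents docs.

from smolagents import CodeAgent, OpenAIServerModel

# Point the OpenAI-compatible model wrapper at PrivacyProxy (assumed signature).
model = OpenAIServerModel(
    model_id="u_123/openai/gpt-4o",
    api_base="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
)

agent = CodeAgent(tools=[], model=model)
agent.run("Summarize the GDPR in one sentence.")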

Chat Interfaces

• Open WebUI: Settings → Connections → OpenAI base URL
• LibreChat: librechat.yaml → endpoints → custom
• LobeChat: Settings → API proxy address
• ChatBox: Settings → API Host

IDE Extensions

Continue.dev (VS Code / JetBrains)
// ~/.continue/config.json
{
  "models": [{
    "title": "PrivacyProxy GPT-4o",
    "provider": "openai",
    "model": "u_123/openai/gpt-4o",
    "apiBase": "https://api.privacyproxy.dev/v1",
    "apiKey": "sk-your-privacyproxy-key"
  }]
}
Cursor

Settings → Models → OpenAI API Key:

  • Base URL: https://api.privacyproxy.dev/v1
  • API Key: sk-your-privacyproxy-key
  • Model: u_123/openai/gpt-4o
Aider
export OPENAI_API_KEY="sk-your-privacyproxy-key"
export OPENAI_API_BASE="https://api.privacyproxy.dev/v1"

aider --model u_123/openai/gpt-4o
Cody (Sourcegraph)
// settings.json
{
  "cody.autocomplete.advanced.provider": "openai",
  "cody.autocomplete.advanced.serverEndpoint":
    "https://api.privacyproxy.dev/v1",
  "cody.autocomplete.advanced.accessToken":
    "sk-your-privacyproxy-key"
}

Other Languages

PHP (Laravel)
$client = OpenAI::factory()
    ->withApiKey('sk-...')
    ->withBaseUri('https://api.privacyproxy.dev/v1')
    ->make();
Ruby
client = OpenAI::Client.new(
  access_token: "sk-...",
  uri_base: "https://api.privacyproxy.dev/v1"
)
Go
config := openai.DefaultConfig("sk-...")
config.BaseURL = "https://api.privacyproxy.dev/v1"
client := openai.NewClientWithConfig(config)
C# / .NET
var client = new OpenAIClient(
    new Uri("https://api.privacyproxy.dev/v1"),
    new AzureKeyCredential("sk-...")
);
Java (Spring AI)
spring:
  ai:
    openai:
      api-key: sk-...
      base-url: https://api.privacyproxy.dev
Rust
let config = OpenAIConfig::new()
    .with_api_key("sk-...")
    .with_api_base("https://api.privacyproxy.dev/v1");
let client = Client::with_config(config);

Model Name Format

{user_id}/{provider}/{model}
Provider    Example
OpenAI      u_123/openai/gpt-4o-mini
Anthropic   u_123/anthropic/claude-3-5-sonnet-20241022
Google      u_123/gemini/gemini-1.5-pro
Groq        u_123/groq/llama-3.1-70b-versatile
Mistral     u_123/mistral/mistral-large-latest

New models from these providers work automatically thanks to wildcard routing!
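
Because the provider is encoded in the model string, switching providers never requires a different client or base URL; only the model name changes. A small sketch:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.privacyproxy.dev/v1",
    api_key="sk-your-privacyproxy-key",
)

# Same client, different providers: only the {provider}/{model} part changes.
for model in ("u_123/openai/gpt-4o-mini", "u_123/anthropic/claude-3-5-sonnet-20241022"):
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(model, "→", reply.choices[0].message.content)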