PrivacyProxy API Documentation
GDPR-compliant LLM proxy. Change one line of code and keep using OpenAI, Anthropic, and 60+ other providers.
https://api.privacyproxy.dev
Quickstart
Get started in 3 steps:
1. Get your API key
Sign up at privacyproxy.dev and get your API key from the dashboard.
2. Add your provider key
In the dashboard, connect your OpenAI, Anthropic, or other provider API key. We encrypt and store it securely.
3. Change your base URL
Replace your provider's API URL with PrivacyProxy:
# Before (OpenAI direct)
from openai import OpenAI

client = OpenAI(api_key="sk-your-openai-key")

# After (via PrivacyProxy)
client = OpenAI(
    api_key="sk-your-privacyproxy-key",
    base_url="https://api.privacyproxy.dev/v1"
)
That's it. All requests now go through PrivacyProxy, and PII is automatically masked before it reaches OpenAI.
Authentication
All API requests require a Bearer token in the Authorization header:
Authorization: Bearer sk-your-privacyproxy-key
Get your API key from the Dashboard → API Keys.
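If you call the API over raw HTTP rather than through an SDK, set the header yourself. A minimal sketch using Python's requests library (the key and model ID are placeholders):

import requests

# Call the OpenAI-compatible chat completions endpoint directly,
# passing the PrivacyProxy key as a Bearer token.
response = requests.post(
    "https://api.privacyproxy.dev/v1/chat/completions",
    headers={
        "Authorization": "Bearer sk-your-privacyproxy-key",
        "Content-Type": "application/json",
    },
    json={
        "model": "u_1/openai/gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])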
Chat Completions
Create a chat completion. OpenAI-compatible endpoint.
Request Body
| Parameter | Type | Description |
|---|---|---|
| model | string | Model ID in format {user_id}/{provider}/{model} |
| messages | array | Array of message objects with role and content |
| temperature | number | Optional. Sampling temperature (0-2) |
| max_tokens | integer | Optional. Maximum tokens to generate |
| stream | boolean | Optional. Enable streaming responses |
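Because the endpoint is OpenAI-compatible, these parameters map directly onto the OpenAI Python SDK's keyword arguments. A minimal sketch with illustrative values:

from openai import OpenAI

client = OpenAI(
    api_key="sk-your-privacyproxy-key",
    base_url="https://api.privacyproxy.dev/v1"
)

# temperature and max_tokens are optional; the values here are examples only.
response = client.chat.completions.create(
    model="u_1/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a one-line greeting."}],
    temperature=0.2,
    max_tokens=100,
)
print(response.choices[0].message.content)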
Example Request
curl -X POST https://api.privacyproxy.dev/v1/chat/completions \
  -H "Authorization: Bearer sk-your-privacyproxy-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "u_1/openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Hello, my name is Max Mustermann and my email is max@example.com"}
    ]
  }'
What happens
- PrivacyProxy receives: "Hello, my name is Max Mustermann and my email is max@example.com"
- PII is masked: "Hello, my name is [NAME_a1b2] and my email is [EMAIL_c3d4]"
- The masked request is sent to OpenAI
- The response is unmasked before returning to you
Example Response
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1703123456,
  "model": "gpt-4o-mini",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello Max Mustermann! I see your email is max@example.com."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 25,
    "completion_tokens": 15,
    "total_tokens": 40
  }
}
Models
Model IDs follow the format:
{user_id}/{provider}/{model}
Your user ID is shown in your dashboard. Examples:
| Provider | Model ID |
|---|---|
| OpenAI | u_1/openai/gpt-4o |
| OpenAI | u_1/openai/gpt-4o-mini |
| Anthropic | u_1/anthropic/claude-3-5-sonnet-20241022 |
| Mistral | u_1/mistral/mistral-large-latest |
| Groq | u_1/groq/llama-3.1-70b-versatile |
Wildcard patterns are supported: u_1/openai/* allows any OpenAI model.
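Since the provider is encoded in the model ID, the same client can target any connected provider by changing only the model string. A sketch, assuming the corresponding provider keys are connected in your dashboard:

from openai import OpenAI

client = OpenAI(
    api_key="sk-your-privacyproxy-key",
    base_url="https://api.privacyproxy.dev/v1"
)

# One client, several providers: only the model ID changes between calls.
for model in ("u_1/openai/gpt-4o-mini", "u_1/anthropic/claude-3-5-sonnet-20241022"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(model, "->", response.choices[0].message.content)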
Supported Providers
64 providers supported, including:
EU Providers (GDPR-optimized)
- Mistral AI - Paris, France
- Aleph Alpha - Heidelberg, Germany
- OVHcloud AI - France
- Azure OpenAI (EU regions)
- AWS Bedrock (EU regions)
US Major Providers
- OpenAI
- Anthropic
- Google Gemini
- Groq
- Perplexity
See all providers: Integrations
Python Example
from openai import OpenAI
client = OpenAI(
    api_key="sk-your-privacyproxy-key",
    base_url="https://api.privacyproxy.dev/v1"
)

response = client.chat.completions.create(
    model="u_1/openai/gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Summarize this: Max Mustermann (max@example.com) ordered 5 items."}
    ]
)

print(response.choices[0].message.content)
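Because the endpoint is OpenAI-compatible, the SDK's streaming interface should also work unchanged when stream is set. A minimal sketch, reusing the client defined above:

# Stream tokens as they arrive instead of waiting for the full response.
stream = client.chat.completions.create(
    model="u_1/openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a haiku about privacy."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()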
Node.js Example
import OpenAI from 'openai';
const client = new OpenAI({
  apiKey: 'sk-your-privacyproxy-key',
  baseURL: 'https://api.privacyproxy.dev/v1'
});

const response = await client.chat.completions.create({
  model: 'u_1/openai/gpt-4o-mini',
  messages: [
    { role: 'user', content: 'Summarize this: Max Mustermann (max@example.com) ordered 5 items.' }
  ]
});

console.log(response.choices[0].message.content);
cURL Example
curl -X POST https://api.privacyproxy.dev/v1/chat/completions \
  -H "Authorization: Bearer sk-your-privacyproxy-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "u_1/openai/gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Hello, my name is Max Mustermann"}
    ]
  }'
Error Handling
| Status | Error | Description |
|---|---|---|
| 401 | auth_error | Invalid or missing API key |
| 400 | invalid_request | Missing required parameters |
| 404 | model_not_found | Model not available or provider not connected |
| 429 | rate_limit | Too many requests |
| 500 | server_error | Internal error, please retry |
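The official OpenAI SDKs map these status codes to typed exceptions, so the transient errors (429 and 500) can be retried with backoff. A hedged sketch using the openai Python package (v1+):

import time
import openai
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-privacyproxy-key",
    base_url="https://api.privacyproxy.dev/v1"
)

def create_with_retry(messages, retries=3):
    # Retry on 429 (rate_limit) and 500 (server_error) with exponential backoff;
    # 401, 400 and 404 are not retryable and propagate immediately.
    for attempt in range(retries):
        try:
            return client.chat.completions.create(
                model="u_1/openai/gpt-4o-mini",
                messages=messages,
            )
        except (openai.RateLimitError, openai.InternalServerError):
            if attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)

response = create_with_retry([{"role": "user", "content": "Hello"}])
print(response.choices[0].message.content)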
Rate Limits
| Plan | Requests/min | Budget/month |
|---|---|---|
| Developer | 60 | $100 |
| Professional | 300 | $500 |
| Enterprise | Unlimited | Custom |
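If you regularly approach your plan's requests-per-minute limit, a simple client-side throttle avoids hitting 429s in the first place. A sketch sized for the Developer plan's 60 requests/min (the class and its parameters are illustrative, not part of the API):

import time

class MinuteRateLimiter:
    """Block before each call so at most `limit` requests start per 60-second window."""

    def __init__(self, limit=60):
        self.limit = limit
        self.calls = []

    def wait(self):
        now = time.monotonic()
        # Keep only the timestamps inside the current 60-second window.
        self.calls = [t for t in self.calls if now - t < 60]
        if len(self.calls) >= self.limit:
            # Sleep until the oldest request leaves the window.
            time.sleep(60 - (now - self.calls[0]))
        self.calls.append(time.monotonic())

limiter = MinuteRateLimiter(limit=60)
# Call limiter.wait() before each client.chat.completions.create(...) request.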
Questions? hello@privacyproxy.dev