Quickstart

Get up and running with LLMrouter.eu in less than 5 minutes.

LLMrouter.eu is a secure gateway for managing Large Language Models (LLMs) from the EU, designed for organizations that require data privacy, cost optimization, and EU compliance. It helps avoid vendor lock-in, ensures data protection, and enables integration of the latest AI technologies into business workflows. LLMrouter is ideal for scalable, compliant AI solutions in regulated environments.

Get started with just a few lines of code, or use your preferred SDK.

Using the LLMrouter.eu API directly

You can use the LLMrouter.eu API directly with any HTTP client, as its endpoints are fully OpenAI API compatible. Simply point your requests to https://proxy.llmrouter.eu/v1 and use your API key for authentication.

Replace YOUR_API_KEY with your actual LLMrouter.eu API key.

import requests

url = "https://proxy.llmrouter.eu/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}

data = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "user",
            "content": "Hello, LLMrouter!"
        }
    ]
}

response = requests.post(url, headers=headers, json=data)

print(response.json())
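Because the endpoint follows the OpenAI chat-completions schema, the assistant's reply sits under `choices[0].message.content` in the returned JSON. A minimal sketch of pulling it out, using an illustrative sample payload in place of a live response (field values are made up; the field names come from the OpenAI schema):

```python
# Illustrative chat-completions response in the OpenAI schema.
# Values are placeholders, not real API output.
sample_response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "model": "gpt-3.5-turbo",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I help?"},
            "finish_reason": "stop"
        }
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 7, "total_tokens": 12}
}

def extract_reply(payload: dict) -> str:
    """Return the first assistant message from a chat-completions response."""
    return payload["choices"][0]["message"]["content"]

print(extract_reply(sample_response))
```

With a live call, you would pass `response.json()` to `extract_reply` instead of the sample dict.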

Using the OpenAI SDK

from openai import OpenAI
client = OpenAI(
    base_url="https://proxy.llmrouter.eu/v1",
    api_key="YOUR_API_KEY",
)
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "Hello, LLMrouter!"
        }
    ]
)
print(completion.choices[0].message.content)

Listing Available Models

You can retrieve all models available for your API key using the /models endpoint.

import requests

url = "https://proxy.llmrouter.eu/v1/models"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}

response = requests.get(url, headers=headers)

print(response.json())
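In the OpenAI schema, a /models response wraps the available models in a "data" list, each entry carrying an "id". A short sketch of collecting those ids, again with an illustrative sample payload (the model ids shown are placeholders; the models actually returned depend on your API key):

```python
# Illustrative /models response in the OpenAI schema.
# Model ids are placeholders, not a statement of what LLMrouter.eu offers.
sample_models = {
    "object": "list",
    "data": [
        {"id": "gpt-3.5-turbo", "object": "model"},
        {"id": "example-model-2", "object": "model"}
    ]
}

def model_ids(payload: dict) -> list[str]:
    """Collect the model ids from a /models listing."""
    return [m["id"] for m in payload["data"]]

print(model_ids(sample_models))
```

With a live call, pass `response.json()` to `model_ids`; any id in the list can then be used as the "model" field in a chat-completions request.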

Using the OpenAI SDK for Models

from openai import OpenAI

client = OpenAI(
base_url="https://proxy.llmrouter.eu/v1",
api_key="YOUR_API_KEY"
)

# List all available models
models = client.models.list()

print("Available models:")
for model in models.data:
    print(f"- {model.id}")

Using third-party SDKs

More coming soon.