Unified access to world-class AI models. OpenAI-compatible API — one key for all models.
https://tokenhub.store/api/v1
All API requests use this base URL. Fully compatible with OpenAI SDKs.
Click on a provider to view available models and code examples.
Leading AI research company, creators of the GPT series. Known for state-of-the-art language models with exceptional reasoning and coding capabilities.
AI safety company known for the Claude models, which excel at nuanced conversation, code, and complex reasoning with strong safety features.
Google's Gemini family offers cutting-edge multimodal capabilities with industry-leading context windows up to 2M tokens.
xAI, Elon Musk's AI company. Grok models are known for real-time knowledge, witty responses, and excellent coding assistance.
Chinese AI lab known for highly cost-effective models. DeepSeek-V3 offers exceptional performance at a fraction of the cost.
Alibaba's Qwen series offers powerful multilingual models with strong performance in both English and Chinese tasks.
Chinese AI company focused on GLM models. Strong performance in Chinese language tasks with competitive pricing.
Moonshot AI specializes in long-context processing with models supporting up to 128K tokens at competitive prices.
Create a chat completion with streaming support
Authorization: Bearer th-your-api-key
Content-Type: application/json

{
  "model": "openai/gpt-4.1",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "temperature": 0.7,
  "max_tokens": 1000,
  "stream": false
}

List all available models
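For reference, the chat-completion request body shown above is plain JSON; in Python it can be assembled as a dict and serialized with the standard library. A minimal sketch (it only builds the payload, no request is sent):

```python
import json

# Mirror of the example request body above
payload = {
    "model": "openai/gpt-4.1",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
    "max_tokens": 1000,
    "stream": False,  # set True to receive server-sent chunks instead
}

# Serialize for use as the POST body of /chat/completions
body = json.dumps(payload)
print(body)
```

The same dict is what the OpenAI SDKs construct for you from their keyword arguments.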
curl https://tokenhub.store/api/v1/chat/completions \
-H "Authorization: Bearer th-your-api-key" \
-H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4.1",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

from openai import OpenAI
client = OpenAI(
    api_key="th-your-api-key",
    base_url="https://tokenhub.store/api/v1"
)

# Use any supported model
response = client.chat.completions.create(
    model="openai/gpt-4.1",  # Or any other model
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

import OpenAI from 'openai';
const client = new OpenAI({
  apiKey: 'th-your-api-key',
  baseURL: 'https://tokenhub.store/api/v1',
});

// Use any supported model
const response = await client.chat.completions.create({
  model: 'anthropic/claude-sonnet-4-6',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(response.choices[0].message.content);

from openai import OpenAI
client = OpenAI(
    api_key="th-your-api-key",
    base_url="https://tokenhub.store/api/v1"
)

stream = client.chat.completions.create(
    model="google/gemini-2.5-pro",
    messages=[{"role": "user", "content": "Write a short story."}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Contact us: support@tokenhub.store