Tool Calls / Function Calling
Let the model call functions you define. The gateway handles format conversion between providers — define tools once in OpenAI format and it works everywhere.
Base URL & authentication
Base URL: https://api.openadapter.in
All requests need an Authorization: Bearer sk-cv-... header. Generate or copy your API key from the Dashboard → API Keys page.
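For illustration, the required headers can be assembled like this (Python; `auth_headers` is a name chosen here for the sketch, not an SDK function):

```python
def auth_headers(api_key: str) -> dict:
    """Bearer-token headers that every gateway request must carry."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

# e.g. requests.post("https://api.openadapter.in/v1/chat/completions",
#                    headers=auth_headers("sk-cv-..."), json=payload)
```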
curl https://api.openadapter.in/v1/chat/completions \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-4.7",
    "messages": [{"role": "user", "content": "What is the weather in Tokyo?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["city"]
        }
      }
    }],
    "tool_choice": "auto"
  }'

from openai import OpenAI

client = OpenAI(
    base_url="https://api.openadapter.in/v1",
    api_key="sk-cv-...",  # your key from Dashboard → API Keys
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["city"],
            },
        },
    }
]
response = client.chat.completions.create(
    model="glm-4.7",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    tools=tools,
    tool_choice="auto",
)
# Check whether the model wants to call a tool
msg = response.choices[0].message
if msg.tool_calls:
    for call in msg.tool_calls:
        print(f"Call: {call.function.name}({call.function.arguments})")

import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.openadapter.in/v1',
  apiKey: 'sk-cv-...',  // your key from Dashboard → API Keys
});

const tools = [{
  type: 'function',
  function: {
    name: 'get_weather',
    description: 'Get weather for a city',
    parameters: {
      type: 'object',
      properties: {
        city: { type: 'string', description: 'City name' },
        unit: { type: 'string', enum: ['celsius', 'fahrenheit'] },
      },
      required: ['city'],
    },
  },
}];

const response = await client.chat.completions.create({
  model: 'glm-4.7',
  messages: [{ role: 'user', content: "What's the weather in Tokyo?" }],
  tools,
  tool_choice: 'auto',
});
const msg = response.choices[0].message;
if (msg.tool_calls) {
  for (const call of msg.tool_calls) {
    console.log(`Call: ${call.function.name}(${call.function.arguments})`);
  }
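  // Sketch of the second turn (assumption: `runTool` is your own dispatcher
  // that executes the named function and returns a JSON-serializable result;
  // it is not part of any SDK). Append the assistant message plus one
  // role:'tool' message per call, then ask the model again:
  const toolMessages = msg.tool_calls.map((call) => ({
    role: 'tool',
    tool_call_id: call.id,
    content: JSON.stringify(runTool(call)),
  }));
  const followUp = await client.chat.completions.create({
    model: 'glm-4.7',
    messages: [
      { role: 'user', content: "What's the weather in Tokyo?" },
      msg,
      ...toolMessages,
    ],
  });
  console.log(followUp.choices[0].message.content);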
}

Embeddings
Create vector embeddings for search, RAG, clustering, and semantic similarity. The gateway exposes a standard **OpenAI Embeddings** API at `POST /v1/embeddings`.
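A minimal sketch, assuming the OpenAI Python SDK client from the examples above; the model name in the commented request is a placeholder, not a guaranteed gateway model. The cosine helper shows the similarity computation that embeddings typically feed into:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Request sketch (placeholder model name — use one your account exposes):
#   resp = client.embeddings.create(
#       model="your-embedding-model",
#       input=["reset my password", "how do I change my password?"],
#   )
#   a, b = (d.embedding for d in resp.data)
#   print(cosine(a, b))  # closer to 1.0 = more similar
```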
Structured Output
Force the model to respond in valid JSON. Use `response_format` to get predictable, parseable output for your applications.
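As a sketch (assuming the Python SDK client from the tool-calling examples; whether JSON mode is honored depends on the upstream model), the request and a small parsing helper look like:

```python
import json

def parse_json_reply(content: str) -> dict:
    """Parse a JSON-mode reply, insisting on a top-level object."""
    data = json.loads(content)
    if not isinstance(data, dict):
        raise ValueError("expected a JSON object in the reply")
    return data

# Request sketch:
#   response = client.chat.completions.create(
#       model="glm-4.7",
#       messages=[{"role": "user", "content": "Give three primary colors as JSON."}],
#       response_format={"type": "json_object"},
#   )
#   print(parse_json_reply(response.choices[0].message.content))
```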