# Function Calling

Function Calling allows models to automatically select and invoke your predefined tool functions based on user needs, enabling data queries, API calls, task execution, and more.
## Core Concepts
The complete Function Calling flow:

1. **Define tools**: describe the available functions and their parameters in the request
2. **Model decides**: the model determines whether a tool call is needed
3. **Return call**: the model returns the function name and arguments
4. **Execute function**: your code runs the function and collects the result
5. **Continue conversation**: send the result back to the model to generate the final response
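Steps 3–5 reduce to a small dispatch loop: parse the JSON arguments the model returned, look up the named function in a registry, and package the result as a `"tool"` message. A minimal sketch (the `get_weather` stub and `TOOL_REGISTRY` are illustrative helpers, not part of any SDK):

```python
import json

# Hypothetical local implementation of the "get_weather" tool
def get_weather(city, unit="celsius"):
    return {"city": city, "temp": 18, "unit": unit}

# Registry mapping tool names advertised to the model onto Python callables
TOOL_REGISTRY = {"get_weather": get_weather}

def dispatch_tool_call(name, arguments_json):
    """Parse the model-supplied JSON arguments and invoke the matching function."""
    args = json.loads(arguments_json)
    result = TOOL_REGISTRY[name](**args)
    # The serialized result goes back to the model as a "tool" message
    return {"role": "tool", "content": json.dumps(result)}

print(dispatch_tool_call("get_weather", '{"city": "London"}'))
```

The complete round trip with a real client is shown in the protocol examples below.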
## OpenAI Protocol

`function_calling.py`

```python
from openai import OpenAI
import json

client = OpenAI(
    base_url="https://api.ofox.ai/v1",
    api_key="<your OFOXAI_API_KEY>"
)

# 1. Define tools
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get real-time weather information for a given city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. New York, London"
                },
                "unit": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "Temperature unit"
                }
            },
            "required": ["city"]
        }
    }
}]

# 2. Send request
messages = [{"role": "user", "content": "What's the weather like in San Francisco today?"}]
response = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=messages,
    tools=tools,
    tool_choice="auto"
)
message = response.choices[0].message

# 3. Handle tool calls
if message.tool_calls:
    # Append the assistant turn once, before the tool results
    messages.append(message)
    for tool_call in message.tool_calls:
        args = json.loads(tool_call.function.arguments)
        # 4. Execute your function
        result = get_weather(**args)  # Your own implementation
        # 5. Send the result back to the model
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(result)
        })

    # Get the final response
    final = client.chat.completions.create(
        model="openai/gpt-4o",
        messages=messages,
        tools=tools
    )
    print(final.choices[0].message.content)
```

## Anthropic Protocol
Anthropic also uses a `tools` parameter, with a slightly different format (`input_schema` in place of `parameters`):
`anthropic_tools.py`

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.ofox.ai/anthropic",
    api_key="<your OFOXAI_API_KEY>"
)

response = client.messages.create(
    model="anthropic/claude-sonnet-4.5",
    max_tokens=1024,
    tools=[{
        "name": "get_weather",
        "description": "Get real-time weather information for a given city",
        "input_schema": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }],
    messages=[{"role": "user", "content": "What's the weather like in San Francisco today?"}]
)

# Handle tool_use content blocks
for block in response.content:
    if block.type == "tool_use":
        print(f"Calling tool: {block.name}, args: {block.input}")
```

## Parallel Function Calling
The model can return multiple tool calls in a single response. You should execute them in parallel:
```python
import asyncio
import json

# The model may request multiple tool calls at once
if message.tool_calls:
    async def execute_tools(tool_calls):
        tasks = []
        for tc in tool_calls:
            args = json.loads(tc.function.arguments)
            tasks.append(execute_function(tc.function.name, args))  # Your async dispatcher
        # Run all tool calls concurrently
        return await asyncio.gather(*tasks)

    results = asyncio.run(execute_tools(message.tool_calls))
```

## `tool_choice` Parameter
| Value | Description |
|---|---|
| `"auto"` | Model automatically decides whether to call tools (default) |
| `"none"` | Disable tool calling |
| `"required"` | Force the model to call at least one tool |
| `{"type": "function", "function": {"name": "xxx"}}` | Force calling a specific tool |
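For example, to force a call to `get_weather`, pass the object form of `tool_choice` (a sketch of the request shape only; the tool definition here is abbreviated from the example above):

```python
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"]
        }
    }
}]

# Forcing a specific tool: the model must respond with a get_weather call
request_kwargs = {
    "model": "openai/gpt-4o",
    "tools": tools,
    "tool_choice": {"type": "function", "function": {"name": "get_weather"}},
}
print(request_kwargs["tool_choice"]["function"]["name"])
```

These kwargs would be passed to `client.chat.completions.create(**request_kwargs, messages=...)` as in the OpenAI example above.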
## Supported Models

The following models support Function Calling:

- OpenAI: `gpt-4o`, `gpt-4o-mini`, `o1`, `o3-mini`
- Anthropic: `claude-opus-4`, `claude-sonnet-4`, `claude-3-5-haiku`
- Google: `gemini-3.1-pro-preview`, `gemini-3-flash-preview`, `gemini-3-pro-preview`
- Others: `deepseek-chat`, `qwen-max`, `glm-4`