[GH-ISSUE #305] 在 OpenAI 兼容接口下丢失 Function Calling #219

Closed
opened 2026-02-27 07:18:31 +03:00 by kerem · 2 comments

Originally created by @fhp88888 on GitHub (Feb 4, 2026).
Original GitHub issue: https://github.com/justlovemaki/AIClient-2-API/issues/305

Version: v2.8.9, deployed via Docker.

When calling the gemini-3-flash-preview model through the OpenAI-compatible interface, the model's native response contains the correct functionCall data, but after the gateway's conversion the tool_calls field is dropped from the response returned to the client, so the client never receives the tool-call instruction.

Relevant log entries:

```
[2026-02-04 08:17:15.671] [INFO] [AI Monitor][1770193034203jm10zfyv] [Res Native]: {"candidates":[{"content":{"role":"model","parts":[{"thoughtSignature":"Et4CCtsCVGXM32NqABE----------------------9/s8ororuQh/nDf04zFy1Bj4etebQBmz+qNMymZRK0M6AFA5PNKJ4=","functionCall":{"name":"get_weather","args":{"location":"Beijing"}}}]},"finishReason":"STOP","finishMessage":"Model generated function call(s)."}],"usageMetadata":{"promptTokenCount":38,"candidatesTokenCount":16,"totalTokenCount":130,"thoughtsTokenCount":76}}

[2026-02-04 08:17:15.671] [INFO] [AI Monitor][1770193034203jm10zfyv] [Res Converted]: {"id":"chatcmpl-5d4af4cc-03bf-42b2-b910-106884ed2766","object":"chat.completion","created":1770193035,"model":"gemini-3-flash-preview","choices":[{"index":0,"message":{"role":"assistant","content":""},"finish_reason":"stop"}],"usage":{"prompt_tokens":38,"completion_tokens":16,"total_tokens":130,"cached_tokens":0,"prompt_tokens_details":{"cached_tokens":0},"completion_tokens_details":{"reasoning_tokens":76}}}
```

As the logs show, the function call is dropped during conversion: the native response carries a functionCall part, while the converted response has only an empty content string and finish_reason "stop".
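For reference, the missing mapping can be sketched as follows. This is a hypothetical converter (function name and structure are my assumptions, not the gateway's actual code) that turns Gemini `functionCall` parts into OpenAI-style `tool_calls`:

```python
import json
import uuid

def gemini_parts_to_openai_message(parts):
    """Map Gemini candidate `parts` to an OpenAI-style assistant message.

    Hypothetical sketch of the missing conversion step; the gateway's
    real implementation may differ.
    """
    content = ""
    tool_calls = []
    for part in parts:
        if "functionCall" in part:
            fc = part["functionCall"]
            tool_calls.append({
                "id": f"call_{uuid.uuid4().hex[:24]}",
                "type": "function",
                "function": {
                    "name": fc["name"],
                    # OpenAI expects `arguments` as a JSON string, not an object
                    "arguments": json.dumps(fc.get("args", {})),
                },
            })
        elif "text" in part:
            content += part["text"]

    message = {"role": "assistant", "content": content or None}
    if tool_calls:
        # When tool calls are present, the choice's finish_reason should
        # also be "tool_calls" rather than the "stop" seen in the log.
        message["tool_calls"] = tool_calls
    return message

# The functionCall part from the native log entry above:
parts = [{
    "thoughtSignature": "...",
    "functionCall": {"name": "get_weather", "args": {"location": "Beijing"}},
}]
msg = gemini_parts_to_openai_message(parts)
```

Running this on the logged native part yields a message with one `tool_calls` entry for `get_weather`, which is what the repro script below checks for.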

Reproduction script:


```python
import requests
import json

def test_llm():
    # OpenAI Style API Configuration
    url = "http://192.168.123.1:8015/gemini-antigravity/v1/chat/completions"
    api_token = "123"
    model_name = "gemini-3-flash-preview"

    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_token}"
    }

    # Define a tool for testing
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "获取指定城市的实时天气情况",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {
                            "type": "string",
                            "description": "城市名称,例如:北京, 上海"
                        }
                    },
                    "required": ["city"]
                }
            }
        }
    ]

    payload = {
        "model": model_name,
        "messages": [
            {
                "role": "user",
                "content": "北京天气如何?请调用 get_weather 工具进行查询。"
            }
        ],
        "tools": tools,
        "tool_choice": "auto"
    }

    print(f"Connecting to {url} for {model_name} Tool Call Test...")
    print(f"Request Payload: {json.dumps(payload, ensure_ascii=False, indent=2)}")
    try:
        response = requests.post(url, headers=headers, data=json.dumps(payload), timeout=30)
        print(f"HTTP Status Code: {response.status_code}")
        response.raise_for_status()

        raw_response = response.json()
        print(f"Raw Response: {json.dumps(raw_response, ensure_ascii=False, indent=2)}")

        message = raw_response['choices'][0]['message']

        print("\n--- [LLM Response Status] ---")
        if 'tool_calls' in message:
            print("SUCCESS: Tool call detected!")
            for tool_call in message['tool_calls']:
                print(f"Function Name: {tool_call['function']['name']}")
                print(f"Arguments: {tool_call['function']['arguments']}")
        else:
            print("No tool call detected.")
            print(f"Content: {message.get('content', 'None')}")
        print("-----------------------------\n")

    except requests.exceptions.RequestException as e:
        print(f"API Request Failed: {e}")
    except (KeyError, IndexError):
        print("Received unexpected response format from the server.")
        print(f"Raw Response: {response.text}")

if __name__ == "__main__":
    test_llm()
```

kerem closed this issue 2026-02-27 07:18:32 +03:00

@justlovemaki commented on GitHub (Feb 4, 2026):

Fixed.


@fhp88888 commented on GitHub (Feb 4, 2026):

AMAZING! That was a fast fix!
