[GH-ISSUE #29] Upstream 400 Bad Request (Schema.ref 'QuestionOption') dropped with "no channel to forward it" #26

Closed
opened 2026-02-27 15:37:59 +03:00 by kerem · 5 comments
Owner

Originally created by @hwang-pixel on GitHub (Feb 20, 2026).
Original GitHub issue: https://github.com/NikkeTryHard/zerogravity/issues/29

What happened?

When sending a request through the zerogravity MITM proxy, the upstream API (daily-cloudcode-pa.googleapis.com) returns a 400 Bad Request due to a schema validation error: it rejects the request because `Schema.ref 'QuestionOption'` was set alongside unsupported fields.

Instead of propagating this 400 back to the client gracefully, the proxy logs the warning `upstream error but no channel to forward it` and drops the error. This points to a gap in error handling/forwarding: the client is likely left hanging, or sees an abrupt connection closure, without ever learning the actual API error.
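For illustration, the failure mode can be sketched like this: the proxy holds a per-request channel to hand the parsed response back to the client handler, and when that channel is already gone it currently only logs. A minimal sketch of a fallback (all names here are hypothetical, not zerogravity's actual internals) would synthesize a client-visible error instead of discarding it:

```python
import json
import queue

def forward_upstream_error(channel, status, body):
    """Deliver an upstream error to the waiting client handler.

    `channel` stands in for the per-request response queue (hypothetical
    name); it is None when the response side has already been torn down,
    which is the situation the "no channel to forward it" warning
    describes. Instead of only logging, fall back to building an
    SSE-style error event the proxy could still write to the client.
    """
    payload = {"error": {"code": status, "message": body}}
    if channel is not None:
        # Normal path: hand the error to the handler that owns the client.
        channel.put(("error", payload))
        return "forwarded"
    # Fallback path: serialize the error as an event-stream chunk so the
    # client at least sees the real upstream failure before close.
    return "data: " + json.dumps(payload) + "\n\n"
```

This only illustrates the shape of a fix; where the real channel lives and why it is already closed on the second turn is the actual bug to chase.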

How to reproduce

Just basic prompting; I don't know exactly how it triggers, but this is my prompt.

Image: https://github.com/user-attachments/assets/4787b7ec-e991-45ac-825f-c77107ee226e

Diagnostic Report

[logs]

2026-02-20T02:04:21.967112Z  INFO zerogravity::mitm::proxy: MITM: forwarding LLM request domain="daily-cloudcode-pa.googleapis.com" req_path=POST /v1internal:streamGenerateContent?alt=sse body_len=64206 cascade=Some("[REDACTED_SESSION_ID]")
2026-02-20T02:04:21.968079Z  INFO zerogravity::mitm::modify: MITM: request modified [remove 1/2 content messages, replace dummy prompt in USER_REQUEST wrapper (34605 chars), strip all 23 LS tools, inject 1 custom tool group(s), override toolConfig VALIDATED → AUTO, inject toolConfig, inject generationConfig: maxOutputTokens=32000] original=64193 modified=97164 saved_bytes=-32971 saved_pct=-51
2026-02-20T02:04:25.167511Z  INFO zerogravity::mitm::proxy: MITM: streaming response domain="daily-cloudcode-pa.googleapis.com" content_type=text/event-stream status=200
2026-02-20T02:04:25.922768Z  INFO zerogravity::mitm::proxy: MITM: response complete — sending via channel response_text_len=64 thinking_text_len=71
2026-02-20T02:04:25.969300Z  INFO zerogravity::mitm::proxy: MITM: forwarding LLM request domain="daily-cloudcode-pa.googleapis.com" req_path=POST /v1internal:streamGenerateContent?alt=sse body_len=3074 cascade=Some("[REDACTED_SESSION_ID]")
2026-02-20T02:04:25.969625Z  INFO zerogravity::mitm::modify: MITM: request modified [remove 1/4 content messages, replace dummy prompt in USER_REQUEST wrapper (34605 chars), strip all 1 LS tools, inject 1 custom tool group(s), inject toolConfig, inject generationConfig: maxOutputTokens=32000] original=3062 modified=75064 saved_bytes=-72002 saved_pct=-2351
2026-02-20T02:04:25.971868Z  INFO zerogravity::mitm::proxy: MITM: transparent redirect (iptables) domain="daily-cloudcode-pa.googleapis.com"
2026-02-20T02:04:25.971898Z  INFO zerogravity::mitm::proxy: MITM: intercepting TLS domain="daily-cloudcode-pa.googleapis.com"
2026-02-20T02:04:25.973659Z  INFO zerogravity::mitm::proxy: MITM: TLS handshake successful ✓ domain="daily-cloudcode-pa.googleapis.com" alpn=None
2026-02-20T02:04:25.973691Z  INFO zerogravity::mitm::proxy: MITM: routing to HTTP/1.1 handler domain="daily-cloudcode-pa.googleapis.com"
2026-02-20T02:04:25.974416Z  INFO zerogravity::mitm::proxy: MITM: connecting upstream (BoringSSL) domain="daily-cloudcode-pa.googleapis.com" addr=daily-cloudcode-pa.googleapis.com:443
2026-02-20T02:04:26.014535Z  INFO zerogravity::mitm::proxy: MITM: upstream TLS connected ✓ (BoringSSL) domain="daily-cloudcode-pa.googleapis.com"
2026-02-20T02:04:26.983008Z  INFO zerogravity::mitm::proxy: MITM: streaming response domain="daily-cloudcode-pa.googleapis.com" content_type=text/event-stream status=400
2026-02-20T02:04:26.983036Z  WARN zerogravity::mitm::proxy: MITM: upstream error response domain="daily-cloudcode-pa.googleapis.com" status=400 body={
  "error": {
    "code": 400,
    "message": "Schema.ref 'QuestionOption' was set alongside unsupported fields.  If a schema node has Schema.ref set, then only description and default can be set alongside it; other fields they would be replaced by the expanded reference.",
    "status": "INVALID_ARGUMENT"
  }
}

2026-02-20T02:04:26.983046Z  WARN zerogravity::mitm::proxy: MITM: upstream error but no channel to forward it
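The upstream message pins down the constraint: when a schema node sets `Schema.ref`, only `description` and `default` may sit alongside it, and the reference `QuestionOption` likely comes from one of the injected tool schemas (a `question` tool is among the eleven injected). One possible proxy-side workaround would be to sanitize tool schemas before forwarding. A minimal sketch, assuming JSON-Schema-style `$ref` nodes (not zerogravity's actual code):

```python
# Fields the upstream API allows next to a schema reference,
# per the 400 error message quoted in the log above.
ALLOWED_WITH_REF = {"$ref", "description", "default"}

def strip_ref_siblings(node):
    """Recursively drop fields that may not accompany a $ref node.

    The key name `$ref` is an assumption about how the injected tool
    schemas encode Schema.ref; adjust to the actual wire format.
    """
    if isinstance(node, dict):
        if "$ref" in node:
            node = {k: v for k, v in node.items() if k in ALLOWED_WITH_REF}
        return {k: strip_ref_siblings(v) for k, v in node.items()}
    if isinstance(node, list):
        return [strip_ref_siblings(v) for v in node]
    return node
```

Whether sanitizing is correct (versus fixing the schema at the source) depends on why the extra sibling fields appear after the request-modification step; this sketch only shows what the upstream rule permits.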

Additional Context

My opencode.json

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "ZG",
      "options": {
        "baseURL": "http://localhost:8741/v1"
      },
      "models": {
        "opus-4.6": {
          "name": "Opus 4.6 Thinking"
        },
        "sonnet-4.6": {
          "name": "Sonnet 4.6 Thinking"
        },
        "gemini-3.1-pro-high": {
          "name": "Gemini 3.1 Pro High"
        },
        "gemini-3.1-pro": {
          "name": "Gemini 3.1 Pro"
        },
        "gemini-3.1-pro-low": {
          "name": "Gemini 3.1 Pro Low"
        },
        "gemini-3-flash": {
          "name": "Gemini 3 Flash"
        }
      }
    }
  }
}
kerem 2026-02-27 15:37:59 +03:00
  • closed this issue
  • added the
    bug
    label
Author
Owner

@NikkeTryHard commented on GitHub (Feb 20, 2026):

give me ur `zg report`

<!-- gh-comment-id:3931235641 -->
Author
Owner

@hwang-pixel commented on GitHub (Feb 20, 2026):

generated_at: 2026-02-20T02:26:38Z
format: TOON/1.0
system:
  os: Linux
  kernel: 6.6.87.2-microsoft-standard-WSL2
  arch: x86_64
  distro: Debian GNU/Linux 13 (trixie)
  glibc: ldd (Debian GLIBC 2.41-12+deb13u1) 2.41
  memory_total: 7.6Gi
  memory_available: 6.8Gi
  in_docker: true
zerogravity:
  version: 1.1.8
  dev_build: false
  zg_binary: <install_dir>/bin/zg
  main_binary: <install_dir>/bin/zerogravity
  main_binary_size: 9516056
  main_binary_max_glibc: GLIBC_2.17
service:
  running: false
  proxy_port: 8741
  mitm_port: 8742
config:
  config_dir: ~/.config/zerogravity
  token_file_exists: false
  token_env_set: true
  api_key_set: true
  env:
    rust_log: info
    zerogravity_ls_path: /usr/local/bin/language_server_linux_x64
    zerogravity_upstream_proxy: ""
ls_binary:
  path: /usr/local/bin/language_server_linux_x64
  size_bytes: 176920000
  executable: true
api:
  health: service_not_running
  models: service_not_running
  quota: service_not_running
  usage: service_not_running
logs:
  exists: false
traces:
  date: 2026-02-20
  total_today: 4
  showing_last: 4
  recent:
    02-02-58.127_c388f49f:
      files[7]: request.json(487b),modified_request.json.gz(44495b),meta.txt(114b),summary.md(2274b),turns.json(716b),response.json(118b),tools.json(8899b)
      meta.txt: cascade=c388f49f endpoint=POST /v1/chat/completions model=sonnet-4.6 outcome=completed duration=3177ms stream=true
      summary[70]:
        - "# Trace: c388f49f — POST /v1/chat/completions"
        - ""
        - | Field | Value |
        - |-------|-------|
        - | Cascade ID | `c388f49f-ad87-45b7-b835-3867a1c04dfb` |
        - | Model | sonnet-4.6 |
        - | Stream | true |
        - "| Started | 2026-02-20T02:02:58.127Z |"
        - "| Finished | 2026-02-20T02:03:01.305Z |"
        - | Duration | 3177ms |
        - | Outcome | **completed** |
        - ""
        - ## Client Request
        - ""
        - "- **Messages:** 2 (user text: 34222 chars)"
        - ""
        - "IMPORTANT: Refuse to write co`"
        - "- **Tools:** 11 | **Tool rounds:** 0"
        - "- **System prompt:** yes"
        - "- **Image:** no"
        - "- **Params:** model=sonnet-4.6, max_tokens=32000"
        - ""
        - "→ Full details in [request.json](./request.json)"
        - ""
        - ## Turns (1 total)
        - ""
        - ### Turn 0
        - ""
        - "- **MITM matched:** ✓ (gate wait: 43ms)"
        - "- **Modify:** remove 1/2 content messages, replace dummy prompt in USER_REQUEST wrapper (34222 chars), strip all 23 LS tools, inject 1 custom tool group(s), override toolConfig VALIDATED → AUTO, inject toolConfig, inject generationConfig: maxOutputTokens=32000 (63755B → 96773B)"
        - "- **Response:** 132 chars text, 28 chars thinking, finish_reason=stop"
        - ""
        - "→ Full turn details in [turns.json](./turns.json)"
        - ""
        - ## Usage
        - ""
        - | Metric | Tokens |
        - |--------|--------|
        - | Input | 23845 |
        - | Output | 54 |
        - ""
        - "→ Full details in [response.json](./response.json)"
        - ""
        - ## Injected Tools (11)
        - ""
        - | # | Name | Params |
        - |---|------|--------|
        - | 0 | `question` | 1 |
        - | 1 | `bash` | 4 |
        - | 2 | `read` | 3 |
        - | 3 | `glob` | 2 |
        - | 4 | `grep` | 3 |
        - | 5 | `edit` | 4 |
        - | 6 | `write` | 2 |
        - | 7 | `task` | 5 |
        - | 8 | `webfetch` | 3 |
        - | 9 | `todowrite` | 1 |
        - | 10 | `skill` | 1 |
        - ""
        - "→ Full schemas in [tools.json](./tools.json)"
        - ""
        - ## Files
        - ""
        - | File | Contains |
        - |------|----------|
        - "| [request.json](./request.json) | Client request summary |"
        - "| [turns.json](./turns.json) | Per-turn MITM/gate/modify/response data |"
        - "| [response.json](./response.json) | Response summaries + token usage |"
        - "| [tools.json](./tools.json) | Injected tool names + schemas |"
        - "| modified_request.json[.gz] | Full MITM-modified request body |"
      response: "{\n  \"usage\": {\n    \"input_tokens\": 23845,\n    \"output_tokens\": 54,\n    \"thinking_tokens\": 0,\n    \"cache_read\": 0\n  }\n}"
      request: "{\n  \"message_count\": 2,\n  \"tool_count\": 11,\n  \"tool_round_count\": 0,\n  \"user_text_len\": 34222,\n  \"system_prompt\": true,\n  \"has_image\": false,\n  \"generation_params\": {\n    \"model\": \"sonnet-4.6\",\n    \"max_output_tokens\": 32000,\n    \"google_search\": false\n  }\n}"
    02-02-58.129_c6d03777:
      files[6]: request.json(512b),modified_request.json(26155b),meta.txt(118b),summary.md(1799b),turns.json(611b),response.json(118b)
      meta.txt: cascade=c6d03777 endpoint=POST /v1/chat/completions model=gemini-3-flash outcome=completed duration=4294ms stream=true
      summary[56]:
        - "# Trace: c6d03777 — POST /v1/chat/completions"
        - ""
        - | Field | Value |
        - |-------|-------|
        - | Cascade ID | `c6d03777-ebdd-4a3d-bd87-d2c11d50a66d` |
        - | Model | gemini-3-flash |
        - | Stream | true |
        - "| Started | 2026-02-20T02:02:58.129Z |"
        - "| Finished | 2026-02-20T02:03:02.424Z |"
        - | Duration | 4294ms |
        - | Outcome | **completed** |
        - ""
        - ## Client Request
        - ""
        - "- **Messages:** 3 (user text: 2175 chars)"
        - ""
        - <task>
        - Generate a brief title that would help the user find this conversation later.
        - ""
        - Follow all rules in <rules>
        - Use the <exa`
        - "- **Tools:** 0 | **Tool rounds:** 0"
        - "- **System prompt:** yes"
        - "- **Image:** no"
        - "- **Params:** model=gemini-3-flash, max_tokens=32000, top_p=0.95"
        - ""
        - "→ Full details in [request.json](./request.json)"
        - ""
        - ## Turns (1 total)
        - ""
        - ### Turn 0
        - ""
        - "- **MITM matched:** ✓ (gate wait: 47ms)"
        - "- **Modify:** remove 1/2 content messages, replace dummy prompt in USER_REQUEST wrapper (2175 chars), strip all 23 LS tools, remove empty tools array, remove toolConfig (no tools), inject generationConfig: temperature=1.0(default-g3), topP=0.95, maxOutputTokens=32000 (63748B → 26155B)"
        - "- **Response:** 22 chars text, 449 chars thinking, finish_reason=stop"
        - ""
        - "→ Full turn details in [turns.json](./turns.json)"
        - ""
        - ## Usage
        - ""
        - | Metric | Tokens |
        - |--------|--------|
        - | Input | 5856 |
        - | Output | 3 |
        - | Thinking | 280 |
        - ""
        - "→ Full details in [response.json](./response.json)"
        - ""
        - ## Files
        - ""
        - | File | Contains |
        - |------|----------|
        - "| [request.json](./request.json) | Client request summary |"
        - "| [turns.json](./turns.json) | Per-turn MITM/gate/modify/response data |"
        - "| [response.json](./response.json) | Response summaries + token usage |"
        - "| modified_request.json[.gz] | Full MITM-modified request body |"
      response: "{\n  \"usage\": {\n    \"input_tokens\": 5856,\n    \"output_tokens\": 3,\n    \"thinking_tokens\": 280,\n    \"cache_read\": 0\n  }\n}"
      request: "{\n  \"message_count\": 3,\n  \"tool_count\": 0,\n  \"tool_round_count\": 0,\n  \"user_text_len\": 2175,\n  \"system_prompt\": true,\n  \"has_image\": false,\n  \"generation_params\": {\n    \"model\": \"gemini-3-flash\",\n    \"top_p\": 0.95,\n    \"max_output_tokens\": 32000,\n    \"google_search\": false\n  }\n}"
    02-03-36.925_b486fbc5:
      files[7]: request.json(487b),modified_request.json.gz(44567b),meta.txt(114b),summary.md(2179b),turns.json(621b),response.json(118b),tools.json(8899b)
      meta.txt: cascade=b486fbc5 endpoint=POST /v1/chat/completions model=sonnet-4.6 outcome=completed duration=3861ms stream=true
      summary[70]:
        - "# Trace: b486fbc5 — POST /v1/chat/completions"
        - ""
        - | Field | Value |
        - |-------|-------|
        - | Cascade ID | `b486fbc5-ff95-495c-b483-aebbe9474e66` |
        - | Model | sonnet-4.6 |
        - | Stream | true |
        - "| Started | 2026-02-20T02:03:36.925Z |"
        - "| Finished | 2026-02-20T02:03:40.787Z |"
        - | Duration | 3861ms |
        - | Outcome | **completed** |
        - ""
        - ## Client Request
        - ""
        - "- **Messages:** 4 (user text: 34420 chars)"
        - ""
        - "IMPORTANT: Refuse to write co`"
        - "- **Tools:** 11 | **Tool rounds:** 0"
        - "- **System prompt:** yes"
        - "- **Image:** no"
        - "- **Params:** model=sonnet-4.6, max_tokens=32000"
        - ""
        - "→ Full details in [request.json](./request.json)"
        - ""
        - ## Turns (1 total)
        - ""
        - ### Turn 0
        - ""
        - "- **MITM matched:** ✓ (gate wait: 4ms)"
        - "- **Modify:** remove 1/2 content messages, replace dummy prompt in USER_REQUEST wrapper (34420 chars), strip all 23 LS tools, inject 1 custom tool group(s), override toolConfig VALIDATED → AUTO, inject toolConfig, inject generationConfig: maxOutputTokens=32000 (64063B → 96975B)"
        - "- **Response:** 39 chars text, 71 chars thinking, finish_reason=stop"
        - ""
        - "→ Full turn details in [turns.json](./turns.json)"
        - ""
        - ## Usage
        - ""
        - | Metric | Tokens |
        - |--------|--------|
        - | Input | 23896 |
        - | Output | 38 |
        - ""
        - "→ Full details in [response.json](./response.json)"
        - ""
        - ## Injected Tools (11)
        - ""
        - | # | Name | Params |
        - |---|------|--------|
        - | 0 | `question` | 1 |
        - | 1 | `bash` | 4 |
        - | 2 | `read` | 3 |
        - | 3 | `glob` | 2 |
        - | 4 | `grep` | 3 |
        - | 5 | `edit` | 4 |
        - | 6 | `write` | 2 |
        - | 7 | `task` | 5 |
        - | 8 | `webfetch` | 3 |
        - | 9 | `todowrite` | 1 |
        - | 10 | `skill` | 1 |
        - ""
        - "→ Full schemas in [tools.json](./tools.json)"
        - ""
        - ## Files
        - ""
        - | File | Contains |
        - |------|----------|
        - "| [request.json](./request.json) | Client request summary |"
        - "| [turns.json](./turns.json) | Per-turn MITM/gate/modify/response data |"
        - "| [response.json](./response.json) | Response summaries + token usage |"
        - "| [tools.json](./tools.json) | Injected tool names + schemas |"
        - "| modified_request.json[.gz] | Full MITM-modified request body |"
      response: "{\n  \"usage\": {\n    \"input_tokens\": 23896,\n    \"output_tokens\": 38,\n    \"thinking_tokens\": 0,\n    \"cache_read\": 0\n  }\n}"
      request: "{\n  \"message_count\": 4,\n  \"tool_count\": 11,\n  \"tool_round_count\": 0,\n  \"user_text_len\": 34420,\n  \"system_prompt\": true,\n  \"has_image\": false,\n  \"generation_params\": {\n    \"model\": \"sonnet-4.6\",\n    \"max_output_tokens\": 32000,\n    \"google_search\": false\n  }\n}"
    02-04-21.840_f8238fca:
      files[7]: request.json(487b),modified_request.json.gz(44621b),meta.txt(114b),summary.md(2204b),turns.json(646b),response.json(118b),tools.json(8899b)
      meta.txt: cascade=f8238fca endpoint=POST /v1/chat/completions model=sonnet-4.6 outcome=completed duration=4085ms stream=true
      summary[70]:
        - "# Trace: f8238fca — POST /v1/chat/completions"
        - ""
        - | Field | Value |
        - |-------|-------|
        - | Cascade ID | `f8238fca-fade-4f7b-b75e-b70c65dc0003` |
        - | Model | sonnet-4.6 |
        - | Stream | true |
        - "| Started | 2026-02-20T02:04:21.840Z |"
        - "| Finished | 2026-02-20T02:04:25.923Z |"
        - | Duration | 4085ms |
        - | Outcome | **completed** |
        - ""
        - ## Client Request
        - ""
        - "- **Messages:** 6 (user text: 34605 chars)"
        - ""
        - "IMPORTANT: Refuse to write co`"
        - "- **Tools:** 11 | **Tool rounds:** 0"
        - "- **System prompt:** yes"
        - "- **Image:** no"
        - "- **Params:** model=sonnet-4.6, max_tokens=32000"
        - ""
        - "→ Full details in [request.json](./request.json)"
        - ""
        - ## Turns (1 total)
        - ""
        - ### Turn 0
        - ""
        - "- **MITM matched:** ✓ (gate wait: 6ms)"
        - "- **Modify:** remove 1/2 content messages, replace dummy prompt in USER_REQUEST wrapper (34605 chars), strip all 23 LS tools, inject 1 custom tool group(s), override toolConfig VALIDATED → AUTO, inject toolConfig, inject generationConfig: maxOutputTokens=32000 (64193B → 97164B)"
        - "- **Response:** 64 chars text, 71 chars thinking, finish_reason=stop"
        - ""
        - "→ Full turn details in [turns.json](./turns.json)"
        - ""
        - ## Usage
        - ""
        - | Metric | Tokens |
        - |--------|--------|
        - | Input | 23941 |
        - | Output | 41 |
        - ""
        - "→ Full details in [response.json](./response.json)"
        - ""
        - ## Injected Tools (11)
        - ""
        - | # | Name | Params |
        - |---|------|--------|
        - | 0 | `question` | 1 |
        - | 1 | `bash` | 4 |
        - | 2 | `read` | 3 |
        - | 3 | `glob` | 2 |
        - | 4 | `grep` | 3 |
        - | 5 | `edit` | 4 |
        - | 6 | `write` | 2 |
        - | 7 | `task` | 5 |
        - | 8 | `webfetch` | 3 |
        - | 9 | `todowrite` | 1 |
        - | 10 | `skill` | 1 |
        - ""
        - "→ Full schemas in [tools.json](./tools.json)"
        - ""
        - ## Files
        - ""
        - | File | Contains |
        - |------|----------|
        - "| [request.json](./request.json) | Client request summary |"
        - "| [turns.json](./turns.json) | Per-turn MITM/gate/modify/response data |"
        - "| [response.json](./response.json) | Response summaries + token usage |"
        - "| [tools.json](./tools.json) | Injected tool names + schemas |"
        - "| modified_request.json[.gz] | Full MITM-modified request body |"
      response: "{\n  \"usage\": {\n    \"input_tokens\": 23941,\n    \"output_tokens\": 41,\n    \"thinking_tokens\": 0,\n    \"cache_read\": 0\n  }\n}"
      request: "{\n  \"message_count\": 6,\n  \"tool_count\": 11,\n  \"tool_round_count\": 0,\n  \"user_text_len\": 34605,\n  \"system_prompt\": true,\n  \"has_image\": false,\n  \"generation_params\": {\n    \"model\": \"sonnet-4.6\",\n    \"max_output_tokens\": 32000,\n    \"google_search\": false\n  }\n}"
<!-- gh-comment-id:3931239117 -->
}\n}" request: "{\n \"message_count\": 6,\n \"tool_count\": 11,\n \"tool_round_count\": 0,\n \"user_text_len\": 34605,\n \"system_prompt\": true,\n \"has_image\": false,\n \"generation_params\": {\n \"model\": \"sonnet-4.6\",\n \"max_output_tokens\": 32000,\n \"google_search\": false\n }\n}" ````
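For context on the "upstream error but no channel to forward it" failure mode, here is a minimal, stdlib-only sketch of the handling pattern the warning suggests is missing. All names (`UpstreamError`, `forward_or_surface`) are hypothetical illustrations, not zerogravity's actual types: the point is that when the per-request response channel is absent or already dropped, the error should be returned to the caller so it can synthesize an HTTP error response for the client, rather than only being logged.

```rust
use std::sync::mpsc;

// Hypothetical stand-in for the upstream 400 seen in the report.
#[derive(Debug, Clone, PartialEq)]
struct UpstreamError {
    status: u16,
    message: String,
}

/// Try to forward an upstream error to the client-facing task.
/// If no channel is registered (or the receiver was dropped), the
/// error is handed back to the caller instead of vanishing, so the
/// caller can still answer the client with a proper error response.
fn forward_or_surface(
    tx: Option<&mpsc::Sender<UpstreamError>>,
    err: UpstreamError,
) -> Result<(), UpstreamError> {
    match tx {
        // send() fails only when the receiving side is gone.
        Some(tx) => tx.send(err.clone()).map_err(|_| err),
        // No channel was ever registered for this request.
        None => Err(err),
    }
}

fn main() {
    let err = UpstreamError {
        status: 400,
        message: "Schema.ref 'QuestionOption' set alongside unsupported fields".into(),
    };

    // No channel registered: the error is surfaced, not dropped.
    assert_eq!(forward_or_surface(None, err.clone()), Err(err.clone()));

    // Channel present: the error reaches the client-facing task.
    let (tx, rx) = mpsc::channel();
    assert!(forward_or_surface(Some(&tx), err.clone()).is_ok());
    assert_eq!(rx.recv().unwrap(), err);
}
```

Whether v1.1.9 implements this shape or something else, the observable fix is the same: the client receives the upstream 400 instead of a hang or an abrupt close.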

@hwang-pixel commented on GitHub (Feb 20, 2026):

Here you go sir ^

<!-- gh-comment-id:3931240483 -->

@NikkeTryHard commented on GitHub (Feb 20, 2026):

see if fixed with v1.1.9

<!-- gh-comment-id:3931522333 -->

@NikkeTryHard commented on GitHub (Feb 20, 2026):

assume its fixed lol

<!-- gh-comment-id:3931804019 -->