mirror of
https://github.com/jwadow/kiro-gateway.git
synced 2026-04-25 01:15:57 +03:00
[GH-ISSUE #63] BUG: use opencode #43
Originally created by @qinjw404 on GitHub (Feb 2, 2026).
Original GitHub issue: https://github.com/jwadow/kiro-gateway/issues/63
Kiro Gateway Version
latest
What happened?
Debug Logs
2026-02-02 13:51:38 | ERROR | kiro.routes_openai:chat_completions:292 - Error from Kiro API: 400 - {"message":"Input is too long.","reason":"CONTENT_LENGTH_EXCEEDS_THRESHOLD"}
2026-02-02 13:51:38 | WARNING | kiro.routes_openai:chat_completions:306 - HTTP 400 - POST /v1/chat/completions - Input is too long. (reason: CONTENT_LENGTH_EXCEEDS_THRESHOLD)
@jwadow commented on GitHub (Feb 2, 2026):
Hi, this isn't a Gateway error; it happens because the model's 200K-token context window is full. The only thing I can do is display a more understandable error.
This is most likely a shortcoming of OpenCode, which manages context inefficiently.
I could, of course, trim unnecessary messages by modifying the context before it reaches the model, but that's beyond the scope of the Gateway, which should be as transparent as possible and not modify content. I'll see what I can do about it.
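A "more understandable error" could be produced at the gateway boundary without touching the content at all. The sketch below is purely illustrative and not the actual kiro-gateway code: the function name, payload shape, and `context_length_exceeded` type are assumptions; only the `CONTENT_LENGTH_EXCEEDS_THRESHOLD` reason string comes from the logs above.

```python
# Hypothetical sketch: translate Kiro's opaque 400 into a clearer
# client-facing error. Names and payload layout are illustrative.
import json

CONTEXT_LIMIT_REASON = "CONTENT_LENGTH_EXCEEDS_THRESHOLD"

def friendly_error(status: int, body: str) -> dict:
    """Wrap an upstream Kiro API error in a more understandable payload."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        payload = {"message": body}
    if status == 400 and payload.get("reason") == CONTEXT_LIMIT_REASON:
        return {
            "error": {
                "type": "context_length_exceeded",
                "message": (
                    "The conversation exceeds the model's 200K-token context "
                    "window. Start a new session or let your client compact "
                    "older messages."
                ),
            }
        }
    # Any other upstream error is passed through unchanged.
    return {
        "error": {
            "type": "upstream_error",
            "message": payload.get("message", body),
        }
    }
```

This keeps the gateway transparent: request and response content are untouched, and only the error envelope is rewritten.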
@qinjw404 commented on GitHub (Feb 2, 2026):
Perhaps this is a limitation of Kiro? How can we work around it? Currently, my OpenCode setup uses a custom provider.
@jwadow commented on GitHub (Feb 2, 2026):
No, this is a limitation of the Claude model itself. Even if you used the official Anthropic API, you would get the same error.
Unfortunately, I can't tell you how to configure OpenCode, because I don't use it. But a properly working agent (and OpenCode is one) should automatically compact the context as it nears the limit, for example once 180K tokens are used. Why that didn't happen in your case, I can't say.
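The compaction behavior described above can be sketched roughly as follows. This is not OpenCode's actual implementation: the 180K threshold comes from the comment, but the 4-characters-per-token estimate, function names, and drop-oldest strategy are all illustrative assumptions (real agents typically summarize rather than drop, and use a proper tokenizer).

```python
# Hypothetical sketch of agent-side context compaction: when the estimated
# token count nears the 200K limit (here, at 180K), drop the oldest
# non-system messages until the conversation fits again.
CONTEXT_LIMIT = 200_000
COMPACT_AT = 180_000  # compact well before the hard limit

def estimate_tokens(messages: list[dict]) -> int:
    # Rough heuristic: ~4 characters per token. A real agent would use
    # the model's tokenizer instead.
    return sum(len(m.get("content", "")) for m in messages) // 4

def compact_if_needed(messages: list[dict]) -> list[dict]:
    """Keep the system prompt and the newest messages under the threshold."""
    if estimate_tokens(messages) < COMPACT_AT:
        return messages
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    # Drop the oldest non-system messages until we fit again.
    while rest and estimate_tokens(system + rest) >= COMPACT_AT:
        rest.pop(0)
    return system + rest
```

The key design point is that this belongs in the client/agent, not in a transparent gateway: only the agent knows which messages are safe to drop or summarize.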