[GH-ISSUE #1272] Proxy settings for LLM tagging? #819

Closed
opened 2026-03-02 11:53:00 +03:00 by kerem · 5 comments

Originally created by @dzx-dzx on GitHub (Apr 17, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/1272

Describe the feature you'd like

By the title this looks similar to https://github.com/karakeep-app/karakeep/issues/1240 , but while that issue discussed sending requests to a custom LLM endpoint URL, I want to ask about proxying such requests.

Describe the benefits this would bring to existing Karakeep users

It would be helpful in corporations or regions where outbound connections are limited.

Can the goal of this request already be achieved via other means?

I can host a reverse proxy for the LLM APIs, but it would make things less manageable.

Have you searched for an existing open/closed issue?

  • I have searched for existing issues and none cover my fundamental request

Additional context

No response


@MohamedBassem commented on GitHub (Apr 20, 2025):

I think it's a reasonable feature request. The openai API library allows you to pass a proxy host: https://github.com/openai/openai-node/tree/2785c1186b528e4ab3a2a7c9282e041aaa4c13f6?tab=readme-ov-file#configuring-an-https-agent-eg-for-proxies

We can introduce a new `OPENAI_PROXY_URL` env variable and pass it there if set.


@dzx-dzx commented on GitHub (Apr 20, 2025):

Thank you!


@zil commented on GitHub (May 17, 2025):

Deepwiki provides guidance on the changes:

https://deepwiki.com/search/how-inferenceclient-use-openai_fd6f971e-989c-4b81-9e2f-aed0ae0c9d56#3

If it's good, I may create a PR.


@perdakovich commented on GitHub (Oct 12, 2025):

💯


@rzxczxc commented on GitHub (Dec 9, 2025):

trying in #2231
