[GH-ISSUE #331] Please add to support instead ChatGPT also ollama local LLM #5957

Closed
opened 2026-03-01 17:08:20 +03:00 by kerem · 2 comments

Originally created by @bitsky6 on GitHub (Mar 18, 2024).
Original GitHub issue: https://github.com/0xJacky/nginx-ui/issues/331

Is your feature request related to a problem? Please describe.
I would like the option, if possible, to use a private Ollama LLM instead of only ChatGPT.

Describe the solution you'd like
Add an option to select the LLM type, where Ollama (open-source) can be chosen,
plus a field to enter the host and port where it can be reached.

The chat box can stay as it is, since Ollama is normally used with Open WebUI in the same format.
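For reference, Ollama exposes an OpenAI-compatible API (under `/v1` on its default port 11434), so a host-and-port setting is enough to point a chat client at it. A minimal sketch, assuming the local server speaks the OpenAI chat-completions wire format; the host, port, model name, and helper function below are illustrative, not part of Nginx UI:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str):
    """Compose the URL and JSON body for an OpenAI-compatible
    /v1/chat/completions call against a local server."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body).encode()

url, payload = build_chat_request(
    "http://127.0.0.1:11434/v1",   # Ollama's default port (assumption: local install)
    "llama3",                      # example model name
    "Explain this nginx directive: gzip on;",
)

# Actually sending the request is left to the caller, e.g.:
# req = urllib.request.Request(url, data=payload,
#                              headers={"Content-Type": "application/json"})
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```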


@0xJacky commented on GitHub (May 1, 2024):

Hi @bitsky6, I just added this feature in 3b116b3, and it will be released in v2.0.0-beta.20. You can serve your LLM locally as an OpenAI-compatible server (https://lmdeploy.readthedocs.io/en/latest/serving/api_server.html). Next, set the API base URL and model on the Preference > OpenAI page. Finally, you can use your local LLM in Nginx UI.
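In config-file terms, this maps onto Nginx UI's existing OpenAI settings. A hypothetical `app.ini` fragment is sketched below; the section and key names are assumptions, not verified against the Nginx UI source, and the supported path is the Preference > OpenAI page:

```ini
; Hypothetical sketch — key names are assumptions;
; prefer configuring via Preference > OpenAI in the UI.
[openai]
BaseUrl = http://127.0.0.1:23333/v1   ; LMDeploy api_server's default port
Model   = internlm2-chat-7b           ; example model name
Token   = sk-local                    ; local servers typically accept any token
```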


@bitsky6 commented on GitHub (May 1, 2024):

Cool. thx
