mirror of
https://github.com/0xJacky/nginx-ui.git
synced 2026-04-25 08:45:58 +03:00
[GH-ISSUE #331] Please add support for Ollama local LLMs in addition to ChatGPT #4612
Originally created by @bitsky6 on GitHub (Mar 18, 2024).
Original GitHub issue: https://github.com/0xJacky/nginx-ui/issues/331
Is your feature request related to a problem? Please describe.
I would like the option, if possible, to use a private Ollama LLM, not only ChatGPT.
Describe the solution you'd like
Have an option to select the LLM type, where Ollama (open-source) can be chosen as the type.
A box to enter the host and port where it can be reached.
The chat box can stay as it is, since Ollama normally uses Open WebUI in the same format.
@0xJacky commented on GitHub (May 1, 2024):
Hi @bitsky6, I just added this feature in 3b116b3, and it will be released in v2.0.0-beta.20. You can serve your LLM as an OpenAI-compatible server locally (https://lmdeploy.readthedocs.io/en/latest/serving/api_server.html). Next, set the API base URL and model on the Preference > OpenAI page. Finally, you can use your local LLM in Nginx UI.

@bitsky6 commented on GitHub (May 1, 2024):
Cool. thx
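For reference, the setup described above boils down to pointing an OpenAI-style client at a locally served model. The sketch below builds such a request; the base URL and model name are assumptions (Ollama's OpenAI-compatible endpoint defaults to `http://localhost:11434/v1`, and `llama3` is just an example model), not values taken from the thread.

```python
import json

# Assumed values: Ollama's default OpenAI-compatible endpoint and an
# example model name. Adjust both to match your local setup.
API_BASE = "http://localhost:11434/v1"
MODEL = "llama3"

def build_chat_request(prompt: str):
    """Build an OpenAI-compatible /chat/completions request (URL + JSON body)."""
    url = f"{API_BASE}/chat/completions"
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

url, payload = build_chat_request("Explain this nginx directive: gzip on;")
print(url)  # the endpoint an OpenAI-compatible client would POST to
```

In Nginx UI's Preference > OpenAI page, the equivalent settings would be the API base URL (`http://localhost:11434/v1` here) and the model name; any OpenAI-compatible server (Ollama, LMDeploy, etc.) exposing that request shape should work.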