Mirror of https://github.com/ownpilot/OwnPilot.git, synced 2026-04-25 07:15:51 +03:00
[GH-ISSUE #2] GUI-Add Local AI Provider problem #3
Originally created by @leventorcan on GitHub (Feb 19, 2026).
Original GitHub issue: https://github.com/ownpilot/OwnPilot/issues/2
Issue 1:
When I tried to add my local Ollama server, I got the error message below.
"API Error Validation failed: name: Invalid input: expected string, received undefined; providerType: Invalid option: expected one of "lmstudio"|"ollama"|"localai"|"vllm"|"custom"; baseUrl: Invalid input: expected string, received undefined"
I then tried the curl command below in the terminal, using "local-providers" instead of "providers", and the provider was added successfully. I can even see the model list in the GUI afterwards.
curl -X POST http://localhost:8080/api/v1/local-providers \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Ollama",
    "providerType": "ollama",
    "baseUrl": "http://127.0.0.1:11434"
  }'
Issue 2:
When I try to send a test message in the chat window, I get the error message below, even though I have set my local model.
"Demo Mode Response
I received your message: "test"
I'm currently running in demo mode with a98be91d-fa5d-4425-8977-5f2ee7412f74 (qwen2.5-coder:7b). To get real AI responses, please configure your API key in the Settings page.
This is a simulated response. Configure your API key for actual AI capabilities."
Well done, it turned out really nice by the way :)
Levent
@Cayrop commented on GitHub (Feb 19, 2026):
I ran into the same issue and found the root cause in the source code.
The problem comes from the demo mode check:
Demo mode is only disabled if one of the providers in this whitelist is configured.
Since LM Studio is not included, the app stays in demo mode even when LM Studio is properly configured and working.
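A minimal sketch of the kind of check being described (the whitelist contents, type, and function names here are assumptions for illustration, not the actual OwnPilot source):

```typescript
// Hypothetical reconstruction of the demo-mode check described above.
// The whitelist contents and all names are assumptions, not OwnPilot code.
const DEMO_MODE_WHITELIST = ["openai", "anthropic", "google"];

interface ProviderConfig {
  providerType: string;
  baseUrl?: string;
  apiKey?: string;
}

function isDemoMode(configured: ProviderConfig[]): boolean {
  // Demo mode stays on unless at least one whitelisted provider is configured.
  return !configured.some((p) => DEMO_MODE_WHITELIST.includes(p.providerType));
}

// A configured local provider such as LM Studio is not on the whitelist,
// so demo mode remains enabled even though the provider works:
const local: ProviderConfig[] = [
  { providerType: "lmstudio", baseUrl: "http://127.0.0.1:1234/v1" },
];
console.log(isDemoMode(local)); // true — app stays in demo mode
```

With a check shaped like this, configuring any provider outside the whitelist has no effect on demo mode, which matches the behavior reported above.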
Workarounds
1. Override the OpenAI provider (recommended)
LM Studio exposes an OpenAI-compatible API, so you can configure it as an OpenAI provider:
Set the OpenAI provider's base URL to http://127.0.0.1:1234/v1.
This makes openai appear as configured and disables demo mode. This approach is cleaner because it uses the intended provider flow.
2. Set a dummy OpenAI API key
Another way to bypass demo mode is simply configuring the OpenAI provider with an invalid API key.
Since the demo check only verifies that a supported provider is configured (not that it actually works), this also disables demo mode. However, this is more of a workaround than a proper solution.
If local providers like LM Studio are meant to be supported, it may be better for the demo mode logic to check whether any provider is configured rather than relying on a fixed whitelist.
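The suggested fix could be sketched like this (names are illustrative, not actual OwnPilot code):

```typescript
// Hypothetical sketch of the fix suggested above: treat demo mode as off
// whenever ANY provider is configured, instead of consulting a whitelist.
interface ProviderConfig {
  providerType: string;
  baseUrl?: string;
  apiKey?: string;
}

function isDemoModeFixed(configured: ProviderConfig[]): boolean {
  // No whitelist: a configured Ollama or LM Studio entry counts too.
  return configured.length === 0;
}

console.log(
  isDemoModeFixed([{ providerType: "ollama", baseUrl: "http://127.0.0.1:11434" }])
); // false — any configured provider disables demo mode
```

This would make local providers first-class for the demo-mode decision without special-casing each provider type.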
@ersinkoc commented on GitHub (Feb 19, 2026):
I hope these screenshots will help you better understand.
@leventorcan commented on GitHub (Feb 19, 2026):
I am running Ollama (no API key), not LM Studio :)
@Cayrop's second workaround (setting a dummy OpenAI API key) worked.
Issue 1 still needs to be fixed, though...