[GH-ISSUE #923] No inference client configured #609

Closed
opened 2026-03-02 11:51:17 +03:00 by kerem · 2 comments
Owner

Originally created by @ZenoBell on GitHub (Jan 21, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/923

Describe the Bug

Cannot use the "Summarize with AI" function.

Steps to Reproduce

Every time I try to use "Summarize with AI", I get the following error:

```
p [TRPCError]: No inference client configured
    at /app/apps/web/.next/server/chunks/6815.js:1:41479
    at h.middlewares (/app/apps/web/.next/server/chunks/480.js:4:46521)
    at a (/app/apps/web/.next/server/chunks/480.js:4:45915)
    at Object.next (/app/apps/web/.next/server/chunks/480.js:4:46027)
    at /app/apps/web/.next/server/chunks/6815.js:1:26397
    at async a (/app/apps/web/.next/server/chunks/480.js:4:45909)
    at async t (/app/apps/web/.next/server/chunks/480.js:4:45282)
    at async a (/app/apps/web/.next/server/chunks/480.js:4:45909)
    at async a (/app/apps/web/.next/server/chunks/480.js:4:45909)
    at async a (/app/apps/web/.next/server/chunks/480.js:4:45909) {
  code: 'BAD_REQUEST',
  [cause]: undefined
```

From inside the Docker container I can reach the API:

```
/app # curl http://192.168.10.146:11434
Ollama is running

/app # printenv | grep OLLAMA
OLLAMA_BASE_UR=http://192.168.10.146:11434

/app # printenv | grep INFERENCE
INFERENCE_IMAGE_MODEL=llava-phi3:latest
INFERENCE_CONTEXT_LENGTH=2048
INFERENCE_JOB_TIMEOUT_SEC=300
INFERENCE_TEXT_MODEL=llama3.1:8b
INFERENCE_LANG=english
```

The models on the Ollama host:

```
C:\Users\sessho>ollama list
NAME                 ID              SIZE      MODIFIED
llava-phi3:latest    c7edd7b87593    2.9 GB    7 weeks ago
llama3.1:8b          46e0c10c039e    4.9 GB    7 weeks ago
```

So the model names are correctly defined.
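One way to catch this class of problem is to check the variable *names* rather than the values. A minimal sketch (the expected names below are taken from this report plus the documented `OLLAMA_BASE_URL`; adjust to your setup):

```shell
# Sanity check: loop over the variable names Hoarder expects and report any
# that are not set in the container -- a one-letter typo in a variable name
# shows up here as "MISSING" even though printenv | grep looks plausible.
for name in OLLAMA_BASE_URL INFERENCE_TEXT_MODEL INFERENCE_IMAGE_MODEL; do
  if printenv "$name" >/dev/null; then
    echo "$name is set"
  else
    echo "$name is MISSING (check for a typo in the compose file)"
  fi
done
```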

Expected Behaviour

The AI call should work.

Screenshots or Additional Context

No response

Device Details

Hoarder is running in Docker (via Docker Compose) on a Linux host; Ollama runs on a separate Windows host on the same network.

Exact Hoarder Version

v0.21.0

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem

EDIT: I also tested with OpenAI, defining the API key, and I get the same error, so I must be doing something else wrong at this point.

kerem closed this issue 2026-03-02 11:51:17 +03:00

@MohamedBassem commented on GitHub (Jan 21, 2025):

`OLLAMA_BASE_UR` seems to be missing an L?


@ZenoBell commented on GitHub (Jan 21, 2025):

> `OLLAMA_BASE_UR` seems to be missing an L?

Oh, you're absolutely right! I spent the whole afternoon troubleshooting, but I couldn't spot the misspelling! Thank you!
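For anyone hitting the same error: the fix is the missing trailing `L`. A minimal Docker Compose environment sketch with the corrected spelling (the service name and layout are assumptions; the values are copied from this issue):

```yaml
services:
  hoarder:                                            # service name is an assumption
    environment:
      OLLAMA_BASE_URL: http://192.168.10.146:11434    # note the trailing L
      INFERENCE_TEXT_MODEL: llama3.1:8b
      INFERENCE_IMAGE_MODEL: llava-phi3:latest
```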
