[GH-ISSUE #585] ollama error #372

Closed
opened 2026-03-02 11:49:17 +03:00 by kerem · 3 comments

Originally created by @divemasterjm on GitHub (Oct 24, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/585

Describe the Bug

I get an error when calling Ollama on an external computer. My docker-compose.yml:
```yaml
version: "3.8"
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    ports:
      - 3000:3000
    env_file:
      - .env
    environment:
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      # OPENAI_API_KEY: ...
      OLLAMA_BASE_URL: http://192.168.8.184:11434
      INFERENCE_TEXT_MODEL: mistral:7b
      INFERENCE_IMAGE_MODEL: llava:latest
      DATA_DIR: /data
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
  meilisearch:
    image: getmeili/meilisearch:v1.6
    restart: unless-stopped
    env_file:
      - .env
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - meilisearch:/meili_data

volumes:
  meilisearch:
  data:
```

Steps to Reproduce

When creating or recreating tags.

Expected Behaviour

```
2024-10-24T06:14:30.189Z error: [inference][1016] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:13185:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/inference.ts:2:3086)
    at async OllamaInferenceClient.inferFromImage (/app/apps/workers/inference.ts:2:3915)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3014)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6316)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/@hoarder+queue@file+packages+queue/node_modules/@hoarder/queue/runner.ts:2:2567)
2024-10-24T06:14:30.234Z info: [inference][1017] Starting an inference job for bookmark with id "kfj9neum3ki0u9hzgfmga4u9"
```

Screenshots or Additional Context

Same log as in "Expected Behaviour" above.

Device Details

Linux, Debian 12

Exact Hoarder Version

v0.18.0

kerem 2026-03-02 11:49:17 +03:00
  • closed this issue
  • added the
    question
    label

@MohamedBassem commented on GitHub (Oct 24, 2024):

This probably means that Ollama is not reachable from Hoarder's container. Are any network policies preventing the call? Can you exec into the container and check whether you can hit the Ollama address from inside the container?
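One way to run that check (a sketch; the service name `web` and the Ollama address are taken from the compose file above, and the exact tools available inside the image may differ — busybox `wget` is a common fallback when `curl` is absent):

```shell
# Open a shell inside the Hoarder web container
# (service name "web" in the compose file; check `docker ps` if unsure).
docker compose exec web sh

# From inside the container, probe the Ollama root endpoint.
# A reachable Ollama server responds with "Ollama is running".
wget -qO- http://192.168.8.184:11434/
```

If the probe times out while a plain `ping` succeeds, the port is likely filtered or Ollama is bound to loopback only on the remote host.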


@kamtschatka commented on GitHub (Oct 24, 2024):

Additionally: by default, Ollama does not allow access from external hosts; you have to set that up: https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network
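Following the linked FAQ, on a systemd-based install this amounts to overriding the service environment so Ollama binds to all interfaces rather than `127.0.0.1` only (a sketch, assuming Ollama was installed as a systemd service on the remote host):

```shell
# On the machine running Ollama: create a drop-in override for the service.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload units and restart Ollama so the new binding takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

After restarting, Ollama listens on port 11434 on all interfaces instead of loopback only.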


@divemasterjm commented on GitHub (Oct 24, 2024):

> This probably means that Ollama is not reachable from Hoarder's container. Are any network policies preventing the call? Can you exec into the container and check whether you can hit the Ollama address from inside the container?

I can ping Ollama. I use Open WebUI on a Proxmox LXC; using Open WebUI in Docker it works, so maybe it is a problem with the Ollama setup on the Proxmox LXC.
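Since ping succeeds but HTTP fetches fail, a common culprit is Ollama listening on loopback only inside the LXC. A quick way to check, assuming shell access to the LXC (the address below comes from the compose file above):

```shell
# On the Proxmox LXC running Ollama: see which address port 11434 is bound to.
# "127.0.0.1:11434" means loopback only (unreachable from other hosts);
# "0.0.0.0:11434" or "*:11434" means it accepts external connections.
ss -tlnp | grep 11434

# Then confirm the API answers on the LAN address other hosts will use:
curl http://192.168.8.184:11434/api/tags
```

ICMP (ping) only proves the host is up; it says nothing about whether the TCP port is open and bound to a reachable interface.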
