[GH-ISSUE #936] Problem with tag and Ollama #620

Closed
opened 2026-03-02 11:51:22 +03:00 by kerem · 4 comments
Owner

Originally created by @ilnonno70 on GitHub (Jan 25, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/936

Hi all, this is my docker compose (Docker Desktop with WSL):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - c:/docker/container/ollama:/root/.ollama
    restart: unless-stopped
    networks:
      - my_custom_bridge

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "6079:8080"
    depends_on:
      - ollama
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434/api
    networks:
      - my_custom_bridge

  web:
    image: ghcr.io/hoarder-app/hoarder:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - c:/docker/container/hoarder:/data
    ports:
      - "3000:3000"
    environment:
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      NEXTAUTH_SECRET: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
      MEILI_MASTER_KEY: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
      OLLAMA_BASE_URL: http://ollama:11434/api
      INFERENCE_TEXT_MODEL: llama3.1
      INFERENCE_IMAGE_MODEL: llava
      INFERENCE_CONTEXT_LENGTH: 2048
      INFERENCE_LANG: italian
    networks:
      - my_custom_bridge

  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
    networks:
      - my_custom_bridge

  meilisearch:
    image: getmeili/meilisearch:v1.11.1
    restart: unless-stopped
    environment:
      MEILI_NO_ANALYTICS: "true"
      NEXTAUTH_SECRET: xxxxxxxxxxxxxxxxxxxxxxxxx
      MEILI_MASTER_KEY: xxxxxxxxxxxxxxxxxxxxxxxxx
      OLLAMA_BASE_URL: http://ollama:11434/api
      INFERENCE_TEXT_MODEL: llama3.1:8b
      INFERENCE_IMAGE_MODEL: llava:7b
      INFERENCE_CONTEXT_LENGTH: 2048
      INFERENCE_LANG: english
    volumes:
      - c:/docker/container/meilisearch:/meili_data
    networks:
      - my_custom_bridge

networks:
  my_custom_bridge:
    driver: bridge
```

When I start Hoarder and add links, it can't apply AI tags. The log reports this error:

```
2025-01-25T19:05:28.687Z error: [inference][123] inference job failed: ResponseError: 404 page not found
ResponseError: 404 page not found
    at checkOk (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:72:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:120:3)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3482)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:4232)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3029)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3264)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6686)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)
2025-01-25T19:05:28.716Z info: [search][124] Completed successfullys
```

Can someone help me fix it?

Thanks!

kerem closed this issue 2026-03-02 11:51:22 +03:00
Author
Owner

@kamtschatka commented on GitHub (Jan 26, 2025):

Use `OLLAMA_BASE_URL: http://ollama:11434/` (drop the `/api` suffix).
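For anyone landing on the same 404: the bundled `ollama` client library appends the `/api/...` paths itself, so a base URL that already ends in `/api` makes requests hit `/api/api/...`, which Ollama answers with `404 page not found`. A minimal sketch of the corrected `web` service environment (only the changed key shown):

```yaml
web:
  environment:
    # Base URL only -- the ollama client library adds the /api/... paths itself
    OLLAMA_BASE_URL: http://ollama:11434/
```

You can sanity-check the endpoint from another container on `my_custom_bridge` with `curl http://ollama:11434/api/tags`, which should list the pulled models.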

Author
Owner

@ilnonno70 commented on GitHub (Jan 26, 2025):

Thanks, it now seems to be working properly. The only thing I noticed is that tagging only works for some bookmarks; for others it doesn't seem to work.

The log returns this message:

```
2025-01-26T09:56:53.519Z error: [inference][166] inference job failed: Error: Timeout
Error: Timeout
    at Timeout._onTimeout (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2672)
    at listOnTimeout (node:internal/timers:594:17)
    at process.processTimers (node:internal/timers:529:7)
```

Does it depend on the type of bookmark?

Author
Owner

@kamtschatka commented on GitHub (Jan 26, 2025):

Try increasing `INFERENCE_JOB_TIMEOUT_SEC`; the model takes too long to return a response for those bookmarks.
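That timeout is the worker's per-job limit (the stack trace above comes from the `liteque` job runner, not from Ollama). A hedged sketch of the change in the compose file; `300` is an illustrative value, not a recommendation, so pick something long enough for your hardware:

```yaml
web:
  environment:
    INFERENCE_JOB_TIMEOUT_SEC: 300  # give slow local models more time per inference job
```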

Author
Owner

@ilnonno70 commented on GitHub (Jan 26, 2025):

Now it works well, thank you.
