[GH-ISSUE #821] Only some Ollama model names work, others give exceptions #536

Closed
opened 2026-03-02 11:50:40 +03:00 by kerem · 6 comments

Originally created by @ssuukk on GitHub (Jan 4, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/821

Describe the Bug

It seems only very plain model names work with Hoarder. I couldn't determine which character causes the exception, but it might be the colon. For example, if I put my model like this in docker-compose:

      OLLAMA_BASE_URL: http://192.168.1.133:11434
      INFERENCE_TEXT_MODEL: huihui_ai/qwen2.5-abliterate:0.5b

The inference will keep failing with:

2025-01-04T07:59:06.832Z info: [inference][166] Starting an inference job for bookmark with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:59:06.841Z error: [inference][166] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T08:00:00.035Z info: [feed] Scheduling feed refreshing jobs ...

2025-01-04T09:00:00.582Z info: [feed] Scheduling feed refreshing jobs ...

If the model name is as simple as "abc", the inference runs.

Steps to Reproduce

  1. Update docker-compose.yml with the Ollama config from the bug description
  2. Rebuild
  3. Try to infer

Expected Behaviour

Inference works with any legal Ollama model name.

Screenshots or Additional Context

2025-01-04T07:53:07.475Z error: [inference][161] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T07:53:08.537Z info: [inference][161] Starting an inference job for bookmark with id "qcr1u84pyvlugywemizatrqy"

2025-01-04T07:53:08.547Z error: [inference][161] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T07:58:46.480Z info: [search][165] Attempting to index bookmark with id xj2r95hzs5hv0bzs1e2dbnq7 ...

2025-01-04T07:58:46.825Z info: [Crawler][164] Will crawl "https://itsfoss.com/homelab-dashboard/?ref=dailydev" for link with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:58:46.825Z info: [Crawler][164] Attempting to determine the content-type for the url https://itsfoss.com/homelab-dashboard/?ref=dailydev

2025-01-04T07:58:46.992Z info: [Crawler][164] Content-type for the url https://itsfoss.com/homelab-dashboard/?ref=dailydev is "text/html; charset=utf-8"

2025-01-04T07:58:49.425Z info: [search][165] Completed successfully

2025-01-04T07:58:50.699Z info: [Crawler][164] Successfully navigated to "https://itsfoss.com/homelab-dashboard/?ref=dailydev". Waiting for the page to load ...

2025-01-04T07:58:52.614Z info: [Crawler][164] Finished waiting for the page to load.

2025-01-04T07:58:52.637Z info: [Crawler][164] Successfully fetched the page content.

2025-01-04T07:58:53.086Z info: [Crawler][164] Finished capturing page content and a screenshot. FullPageScreenshot: false

2025-01-04T07:58:53.113Z info: [Crawler][164] Will attempt to extract metadata from page ...

2025-01-04T07:58:54.147Z info: [Crawler][164] Will attempt to extract readable content ...

2025-01-04T07:58:55.163Z info: [Crawler][164] Done extracting readable content.

2025-01-04T07:58:55.184Z info: [Crawler][164] Stored the screenshot as assetId: 69b8ffa7-bbfa-42d8-beda-6239ca440aad

2025-01-04T07:58:55.306Z info: [Crawler][164] Done extracting metadata from the page.

2025-01-04T07:58:55.306Z info: [Crawler][164] Downloading image from "https://itsfoss.com/content/images/2024/12/tools-for-homelab-server.png"

2025-01-04T07:58:55.413Z info: [Crawler][164] Downloaded image as assetId: f23bc710-684e-4c73-a58f-907ebbc800ac

2025-01-04T07:58:58.511Z info: [Crawler][164] Completed successfully

2025-01-04T07:59:00.072Z info: [search][167] Attempting to index bookmark with id xj2r95hzs5hv0bzs1e2dbnq7 ...

2025-01-04T07:59:00.717Z info: [VideoCrawler][168] Skipping video download from "https://itsfoss.com/homelab-dashboard/?ref=dailydev", because it is disabled in the config.

2025-01-04T07:59:00.717Z info: [VideoCrawler][168] Video Download Completed successfully

2025-01-04T07:59:01.923Z info: [inference][166] Starting an inference job for bookmark with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:59:02.334Z error: [inference][166] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T07:59:04.234Z info: [inference][166] Starting an inference job for bookmark with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:59:04.244Z error: [inference][166] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T07:59:05.201Z info: [inference][166] Starting an inference job for bookmark with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:59:05.204Z info: [search][167] Completed successfully

2025-01-04T07:59:05.679Z error: [inference][166] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T07:59:06.832Z info: [inference][166] Starting an inference job for bookmark with id "xj2r95hzs5hv0bzs1e2dbnq7"

2025-01-04T07:59:06.841Z error: [inference][166] inference job failed: TypeError: fetch failed

TypeError: fetch failed

    at node:internal/deps/undici/undici:13484:13

    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)

    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:114:20)

    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.9/node_modules/ollama/dist/shared/ollama.9c897541.cjs:232:25)

    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3206)

    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:3956)

    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:4125)

    at async inferTags (/app/apps/workers/openaiWorker.ts:6:4360)

    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:7782)

    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.0_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2578)

2025-01-04T08:00:00.035Z info: [feed] Scheduling feed refreshing jobs ...

2025-01-04T09:00:00.582Z info: [feed] Scheduling feed refreshing jobs ...

Device Details

Linux container

Exact Hoarder Version

0.20.0

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem
kerem closed this issue and added the question label (2026-03-02 11:50:40 +03:00).

@MohamedBassem commented on GitHub (Jan 5, 2025):

@ssuukk just to rule out the common mistakes, did you run ollama pull on that model already?


@ssuukk commented on GitHub (Jan 5, 2025):

of course



@Yasand123 commented on GitHub (Jan 6, 2025):

I have similar errors; all AI-related tasks fail. I'm using INFERENCE_TEXT_MODEL=dolphin-mistral:latest. I tried multiple values for OLLAMA_BASE_URL, like http://host.docker.internal:11434 and http://127.0.0.1:11434, but all of them fail, so I don't think the issue is there. And yes, that's the correct port; I made sure. Ollama's logs don't show any connections made to it at all. If it had connected but hit a configuration problem, it would have logged an error or a warning, but it acts as if its API was never used.

web-1  | 2025-01-05T19:23:51.566Z info: Starting asset preprocessing worker ...
web-1  | TypeError: fetch failed
web-1  |     at node:internal/deps/undici/undici:13484:13
web-1  |     ... 8 lines matching cause stack trace ...
web-1  |     at async a (/app/apps/web/.next/server/chunks/480.js:4:45909) {
web-1  |   code: 'INTERNAL_SERVER_ERROR',
web-1  |   name: 'TRPCError',
web-1  |   [cause]: TypeError: fetch failed
web-1  |       at node:internal/deps/undici/undici:13484:13
web-1  |       at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
web-1  |       at async O (/app/apps/web/.next/server/chunks/480.js:19:51834)
web-1  |       at async z.processStreamableRequest (/app/apps/web/.next/server/chunks/480.js:19:53345)
web-1  |       at async J.runModel (/app/apps/web/.next/server/chunks/6815.js:1:20391)
web-1  |       at async J.inferFromText (/app/apps/web/.next/server/chunks/6815.js:1:20943)
web-1  |       at async /app/apps/web/.next/server/chunks/6815.js:7:183
web-1  |       at async h.middlewares (/app/apps/web/.next/server/chunks/480.js:4:46515)
web-1  |       at async a (/app/apps/web/.next/server/chunks/480.js:4:45909)
web-1  |       at async a (/app/apps/web/.next/server/chunks/480.js:4:45909) {
web-1  |     [cause]: Error: getaddrinfo ENOTFOUND host.docker.internal
web-1  |         at GetAddrInfoReqWrap.onlookupall [as oncomplete] (node:dns:120:26)
web-1  |         at GetAddrInfoReqWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
web-1  |       errno: -3008,
web-1  |       code: 'ENOTFOUND',
web-1  |       syscall: 'getaddrinfo',
web-1  |       hostname: 'host.docker.internal'
web-1  |     }
web-1  |   }
web-1  | }

@MohamedBassem commented on GitHub (Jan 6, 2025):

@Yasand123 your problem is different. Check bullet point number 4 in the troubleshooting guide (https://docs.hoarder.app/troubleshooting). The Stack Overflow answer it links says you have to add an extra_hosts entry to your docker-compose file. Did you do that?
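For reference, the extra_hosts fix usually looks like the sketch below. This is an assumption based on the standard Docker workaround for ENOTFOUND host.docker.internal on Linux (the service name web is taken from the log prefix above; adjust to your compose file):

```yaml
services:
  web:
    # Map host.docker.internal to the Docker host's gateway IP so the
    # container can reach an Ollama server listening on the host.
    extra_hosts:
      - "host.docker.internal:host-gateway"
```

This requires Docker Engine 20.10 or later, where the special host-gateway value is supported.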


@kamtschatka commented on GitHub (Jan 6, 2025):

this is definitely an issue with encoding/special meaning of characters in the docker-compose file.
I tried the name in an env file (with my local development setup and with my production setup) and "huihui_ai/qwen2.5-abliterate:0.5b" was not an issue (INFERENCE_TEXT_MODEL=huihui_ai/qwen2.5-abliterate:0.5b).
https://docs.docker.com/compose/how-tos/environment-variables/set-environment-variables/ shows that you can also provide environment variables like this:

    environment:
      - INFERENCE_TEXT_MODEL=huihui_ai/qwen2.5-abliterate:0.5b

If that doesn't help, I would recommend asking in a Docker-related forum, where they have more experience.
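If the compose file really is misparsing the value, another low-effort check (a sketch, not a confirmed fix) is to quote the scalars explicitly so YAML cannot reinterpret the colon, using the same keys from the bug report:

```yaml
services:
  web:
    environment:
      # Quoting forces YAML to treat the whole value as a plain string,
      # ruling out any ambiguity around ':' or '/' in the model name.
      OLLAMA_BASE_URL: "http://192.168.1.133:11434"
      INFERENCE_TEXT_MODEL: "huihui_ai/qwen2.5-abliterate:0.5b"
```

Running `docker compose config` afterwards prints the fully resolved configuration, which shows whether the model name survived parsing intact.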


@MohamedBassem commented on GitHub (Jan 11, 2025):

Closing the issue as per @kamtschatka's triage. Feel free to reopen if that doesn't work for you.
