[GH-ISSUE #1160] AI tags not working after 0.23 update #758

Closed
opened 2026-03-02 11:52:28 +03:00 by kerem · 24 comments
Owner

Originally created by @maidou-00 on GitHub (Mar 26, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/1160

Describe the Bug

Hello, I recently updated Hoarder from 0.22 to 0.23, and it appears that AI tags have stopped working (AI summarize still works, however). I've tried redeploying, re-indexing, and the other usual troubleshooting steps, but I'm still unable to fix it.

~~Additionally, smart lists are not working either.~~ My bad, smart lists are working and I love them!
Also, I'm wondering if it would be possible to add a TZ env variable, so that I can fix the timestamps on the logs I see in Portainer.

Steps to Reproduce

OPENAI_BASE_URL=https://api.deepseek.com
OPENAI_API_KEY=sk-88****** (OpenAI-compatible; worked in Hoarder 0.22)
INFERENCE_TEXT_MODEL=deepseek-chat
INFERENCE_JOB_TIMEOUT_SEC=120 (was 60 previously)
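
On the TZ question above: in a typical Docker deployment, passing `TZ` into the container is usually enough to localize log timestamps, provided the image ships tzdata. A minimal sketch of the relevant compose `environment` section, assuming a standard docker-compose setup (the image tag and the `Asia/Shanghai` value are illustrative, not the reporter's actual config):

```yaml
services:
  web:
    image: ghcr.io/hoarder-app/hoarder:0.23.0   # match your existing compose file
    environment:
      - TZ=Asia/Shanghai                        # example value; localizes container log timestamps
      - OPENAI_BASE_URL=https://api.deepseek.com
      - OPENAI_API_KEY=sk-...                   # redacted
      - INFERENCE_TEXT_MODEL=deepseek-chat
      - INFERENCE_JOB_TIMEOUT_SEC=120
```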

Logs related to AI Tag


    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-03-26T01:02:22.150Z info: [search][2566] Attempting to index bookmark with id nwpmszgbm7b1xhcdsadnljfp ...
2025-03-26T01:02:23.005Z info: [Crawler][2565] Will crawl "https://www.baidu.com" for link with id "nwpmszgbm7b1xhcdsadnljfp"
2025-03-26T01:02:23.005Z info: [Crawler][2565] Attempting to determine the content-type for the url https://www.baidu.com
2025-03-26T01:02:23.818Z info: [webhook][2567] Starting a webhook job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp for operation "created"
2025-03-26T01:02:23.818Z info: [webhook][2567] Completed successfully
2025-03-26T01:02:25.814Z info: [Crawler][2565] Content-type for the url https://www.baidu.com is "text/html"
2025-03-26T01:02:31.246Z error: [search][2566] search job failed: MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1290 when waiting a task to be resolved.
MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1290 when waiting a task to be resolved.
    at TaskClient.<anonymous> (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:717:19)
    at Generator.next (<anonymous>)
    at fulfilled (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:298:58)
2025-03-26T01:02:35.204Z info: [search][2566] Attempting to index bookmark with id nwpmszgbm7b1xhcdsadnljfp ...
2025-03-26T01:02:35.228Z info: [Crawler][2565] Successfully navigated to "https://www.baidu.com". Waiting for the page to load ...
2025-03-26T01:02:37.331Z info: [Crawler][2565] Finished waiting for the page to load.
2025-03-26T01:02:37.743Z info: [Crawler][2565] Successfully fetched the page content.
2025-03-26T01:02:38.024Z info: [Crawler][2565] Finished capturing page content and a screenshot. FullPageScreenshot: true
2025-03-26T01:02:38.028Z info: [Crawler][2565] Will attempt to extract metadata from page ...
2025-03-26T01:02:39.444Z info: [Crawler][2565] Will attempt to extract readable content ...
2025-03-26T01:02:40.650Z info: [Crawler][2565] Done extracting readable content.
2025-03-26T01:02:40.778Z info: [Crawler][2565] Stored the screenshot as assetId: 04eb1286-efc8-4678-87af-b3df5072e48a
2025-03-26T01:02:41.040Z info: [Crawler][2565] Done extracting metadata from the page.
2025-03-26T01:02:41.040Z info: [Crawler][2565] Downloading image from "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAE4AAAAqCAMAAAAqEZ1jAAAAAXNSR0IArs4c6QAAAAlwSFlzAAAhOAAAITgBRZYxYAAAAKJQTFRFAAAAenb/RpP/k2j/XXf7cXX6P5z7i2n8W3j7ZHP7jmn9VID7Qpf8jWn8Xnr6iWr8YHb7i2n8RJb9dWz8WXr8mWj9RpP8W3n8bW/8l2j+QJz9Q5f9ToT8fWv9ZW/8lWj9VID8YnX7Pp79QZr9pWb+RJT8m2f9k2j9SYz8jWn9hmr8UIT8fmv8VX78Xnn7d2z8ZHT7WHr7bm77Xnb7Z3D7YHH7RJOQRAAAACJ0Uk5TABAgICAwQEBAWF5gZXBwgICbn5+fo7+/vsLP39/f3urv73XwOA8AAAKfSURBVHja7dbJcuIwFIXhIzCxMTMNcdwMDoMZY4NxeP9X66srEckYQlPVvcvPBhZ8dVSIAvz03xNV/Luq4Xq9Hgo8nfDqjXq97hS1yXZNhXg2Z3o+f1KnLqx62y17Pp5smp+158EUaW6I52rkxLF3eoNpt1Ne+bR+Fd8U5Oc8Px6P5M0Evgq11ytpy0jgbk6e5cSxd2pb7yKOikpTouWyhbsNMsnpefZpOzsCI7c8brkM74/LqFSC7HkwVTu9lsB14ZJy745LlDdV5+3iQTXC7s9z3pNEggOh9s3woOFqJT2BmzUSLnMQEEeeh2+rkraiOrjZeL+XXEBwyp71Ybg+hWL+iltGt8ftKQI9QMzSlDxz9VofssIMdxitdJEvbo3jxqAGKXtt6N4OkpvgkmiFa50Ch1UU8zab/Ya4Br/IUnnewHCHg+FEh7/ExUIXdq+KG4N7z3hfzXDkaa5FmGpdMENrYWWjaoDrJtJLu4ajFFfbUduryN1ue4brb2KpjSvgvETe53R24U6Ggz+RYJnsCTMujmMJ9qELErkv9y7cyXAGtNCoI2Dqx9ymAl1bcYMLJ2NO5fZ2dmHLxiBGrJlxEAl7M1HkTFUzseeiWHOxYK+CrwLltRX3qTk7/4PbTXDdaBHLxytMdcllaaA56U1RSEyU56M0jotfYBLvynM091ni0GGtPO73nLkR7AaK6yqOKnG1g+wXrnqZz9nrw87bszdjjn/dDGffbgdX9YmT4HykaoLTp/UABGfyypx7a1yFNb0wpl7BddW8geSO5JU5TE63xhmOPb2uok8reN1Nrn4qjQNBlmfdvkB5beaoMgdP4LpFgbNuX3svtSwABoZ7WPOKa0Inxgl7DurMdfFXVV7savjK8WQ1/cz5+Q9e7A/jUZeiPQO0fwAAAABJRU5ErkJggg=="
2025-03-26T01:02:41.043Z info: [Crawler][2565] Downloaded image as assetId: a4a3b860-7231-451f-b4d9-aa5bb932789a
2025-03-26T01:02:56.384Z info: [Crawler][2565] Will attempt to archive page ...
2025-03-26T01:02:56.396Z error: [search][2566] search job failed: MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1291 when waiting a task to be resolved.
MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1291 when waiting a task to be resolved.
    at TaskClient.<anonymous> (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:717:19)
    at Generator.next (<anonymous>)
    at fulfilled (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:298:58)
    at runNextTicks (node:internal/process/task_queues:65:5)
    at process.processTimers (node:internal/timers:526:9)
2025-03-26T01:02:58.820Z info: [search][2566] Attempting to index bookmark with id nwpmszgbm7b1xhcdsadnljfp ...
2025-03-26T01:03:01.606Z info: [webhook][2571] Starting a webhook job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp for operation "crawled"
2025-03-26T01:03:01.606Z info: [webhook][2571] Completed successfully
2025-03-26T01:03:07.157Z info: [inference][2568] Starting an inference job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp"
2025-03-26T01:03:08.948Z info: [VideoCrawler][2570] Skipping video download from "https://www.baidu.com", because it is disabled in the config.
2025-03-26T01:03:08.948Z info: [VideoCrawler][2570] Video Download Completed successfully
2025-03-26T01:03:10.703Z info: [search][2566] Completed successfully
2025-03-26T01:03:14.701Z info: [search][2569] Attempting to index bookmark with id nwpmszgbm7b1xhcdsadnljfp ...
2025-03-26T01:03:14.814Z error: [inference][2568] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-03-26T01:03:21.543Z info: [inference][2568] Starting an inference job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp"
2025-03-26T01:03:21.595Z info: [Crawler][2565] Done archiving the page as assetId: 175972f4-06ca-42f1-b04b-ff5895af346f
2025-03-26T01:03:24.035Z info: [Crawler][2565] Completed successfully
2025-03-26T01:03:26.612Z error: [search][2569] search job failed: MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1293 when waiting a task to be resolved.
MeiliSearchTimeOutError: timeout of 5000ms has exceeded on process 1293 when waiting a task to be resolved.
    at TaskClient.<anonymous> (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:717:19)
    at Generator.next (<anonymous>)
    at fulfilled (/app/apps/workers/node_modules/.pnpm/meilisearch@0.37.0/node_modules/meilisearch/dist/bundles/meilisearch.cjs.js:298:58)
    at runNextTicks (node:internal/process/task_queues:65:5)
    at process.processTimers (node:internal/timers:526:9)
2025-03-26T01:03:28.988Z info: [search][2569] Attempting to index bookmark with id nwpmszgbm7b1xhcdsadnljfp ...
2025-03-26T01:03:29.003Z error: [inference][2568] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-03-26T01:03:34.842Z info: [inference][2568] Starting an inference job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp"
2025-03-26T01:03:34.858Z info: [search][2569] Completed successfully
2025-03-26T01:03:35.758Z error: [inference][2568] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-03-26T01:03:40.241Z info: [inference][2568] Starting an inference job for bookmark with id "nwpmszgbm7b1xhcdsadnljfp"
2025-03-26T01:03:40.300Z error: [inference][2568] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)

Expected Behaviour

AI tags should be generated for the bookmark.

Screenshots or Additional Context

No response

Device Details

Running on a Synology NAS (Celeron; possibly a low-power CPU).

Exact Hoarder Version

0.23

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem
kerem 2026-03-02 11:52:28 +03:00
Author
Owner

@jeheda commented on GitHub (Mar 26, 2025):

Same issue here. If I downgrade to 0.22.0 it works fine; I am also using DeepSeek.

2025-03-26T04:35:20.174Z error: [inference][33] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)

Author
Owner

@maidou-00 commented on GitHub (Mar 26, 2025):

Hey jeheda, thanks for sharing. DeepSeek released the chat V3 model in the past few days, and I was worried that might be the cause. But since you mentioned everything works fine after downgrading to 0.22, I think it's safe to assume the problem isn't caused by the DeepSeek update.

Has anyone else using a different LLM encountered a similar problem?
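
The 422 message itself hints at what changed between versions: the request body now carries `response_format.type = "json_schema"` (the OpenAI SDK's structured-output mode), which DeepSeek's OpenAI-compatible endpoint rejected at the time of this report. A minimal sketch of a provider-aware fallback to plain JSON mode; the helper name `buildResponseFormat` and the capability flag are hypothetical, illustrative only, and not Karakeep's actual code:

```typescript
// Shape of the Chat Completions `response_format` field (OpenAI wire format).
type ResponseFormat =
  | { type: "json_object" }
  | {
      type: "json_schema";
      json_schema: { name: string; schema: object; strict?: boolean };
    };

// Hypothetical helper: prefer strict JSON-schema output, but downgrade to
// plain JSON mode for providers that reject `json_schema` with a 422.
function buildResponseFormat(providerSupportsJsonSchema: boolean): ResponseFormat {
  if (providerSupportsJsonSchema) {
    return {
      type: "json_schema",
      json_schema: {
        name: "tags",
        schema: {
          type: "object",
          properties: { tags: { type: "array", items: { type: "string" } } },
          required: ["tags"],
        },
        strict: true,
      },
    };
  }
  // Fallback: ask for "any valid JSON" and validate the shape client-side.
  return { type: "json_object" };
}

console.log(buildResponseFormat(false).type); // → "json_object"
```

With `json_object` mode the schema is no longer enforced server-side, so the response still needs client-side validation (e.g. with zod, which the worker already depends on per the stack traces).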

Author
Owner

@jeheda commented on GitHub (Mar 26, 2025):

> Has anyone else using a different LLM encountered a similar problem?

Using DeepInfra with 0.23.0 throws the following, and, as with DeepSeek, downgrading to 0.22.0 works fine:

2025-03-26T05:00:29.377Z error: [inference][69] inference job failed: Error: 500 inference exception
Error: 500 inference exception
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:66:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)

@jeheda commented on GitHub (Mar 26, 2025):

The issue is in the following lines:

https://github.com/hoarder-app/hoarder/blob/b3417d87a0565b3536029e3b1e347611b5c6679b/packages/shared/inference.ts#L79-L83

Previously it was like this:

https://github.com/hoarder-app/hoarder/blob/ed8fd793f2c3ee0135c9fa230af7ead867a614bc/packages/shared/inference.ts#L74-L76

So maybe set INFERENCE_SUPPORTS_STRUCTURED_OUTPUT to false by default and use { type: "json_object" } instead of undefined? Not sure, just throwing this out there in case it helps:

        response_format:
          optsWithDefaults.schema &&
          serverConfig.inference.supportsStructuredOutput
            ? zodResponseFormat(optsWithDefaults.schema, "schema")
            : { type: "json_object" },

@MohamedBassem commented on GitHub (Mar 26, 2025):

It seems that DeepSeek doesn't support structured outputs. Can you try INFERENCE_SUPPORTS_STRUCTURED_OUTPUT=false?


@maidou-00 commented on GitHub (Mar 26, 2025):

> it seems that deepseek doesn't support structured outputs. Can you try: INFERENCE_SUPPORTS_STRUCTURED_OUTPUT=false?

No luck so far. Any luck on your side? @jeheda

2025-03-26T07:45:34.436Z error: [inference][2578] inference job failed: Error: [inference][2578] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response: ```json
{
  "tags": 
Error: [inference][2578] The model ignored our prompt and didn't respond with the expected JSON: {}. Here's a sneak peak from the response: ```json
{
  "tags": 
    at inferTags (/app/apps/workers/openaiWorker.ts:6:4346)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)

@jeheda commented on GitHub (Mar 26, 2025):

Same issue as @maidou-00 when only using INFERENCE_SUPPORTS_STRUCTURED_OUTPUT=false. If I change undefined to { type: "json_object" }, it works:

2025-03-26T07:51:04.518Z info: [inference][65] Inferring tag for bookmark "ilh0kyirto2n824agmmb2obm" used 3079 tokens and inferred: Pierre Boulez,Contemporary Classical Music,Avant-garde Music,Music Conductors,IRCAM
<!-- gh-comment-id:2753507239 --> @jeheda commented on GitHub (Mar 26, 2025): same issue as @maidou-00 when only using `INFERENCE_SUPPORTS_STRUCTURED_OUTPUT=false` if i change `undefined` to `{ type: "json_object" }` it works ``` 2025-03-26T07:51:04.518Z info: [inference][65] Inferring tag for bookmark "ilh0kyirto2n824agmmb2obm" used 3079 tokens and inferred: Pierre Boulez,Contemporary Classical Music,Avant-garde Music,Music Conductors,IRCAM ```
Author
Owner

@MohamedBassem commented on GitHub (Mar 26, 2025):

OK, over the weekend I'll send a hotfix to allow enabling JSON mode again. So basically we'll have three modes: 1. structured outputs, 2. JSON mode, 3. no schema at all.
Would happily accept a PR if someone wants to send it earlier.
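Sketched out, that three-way selection might look like the following. This is only an illustrative sketch, not karakeep's actual code: the mode names and the responseFormatFor helper are made up here, and the real client would build the structured branch with the OpenAI SDK's zodResponseFormat helper rather than by hand.

```typescript
// Illustrative sketch only (not karakeep's actual implementation).
// Maps a hypothetical output-schema mode onto the OpenAI chat API's
// `response_format` parameter.
type OutputSchemaMode = "structured" | "json" | "plain";

type ResponseFormat =
  | { type: "json_schema"; json_schema: { name: string } } // structured outputs
  | { type: "json_object" } // classic JSON mode
  | undefined; // no constraint at all

function responseFormatFor(
  mode: OutputSchemaMode,
  schemaName = "schema",
): ResponseFormat {
  switch (mode) {
    case "structured":
      // The real client would use zodResponseFormat(schema, "schema") from
      // the openai SDK; a bare json_schema object stands in for it here.
      return { type: "json_schema", json_schema: { name: schemaName } };
    case "json":
      // Widely supported by OpenAI-compatible APIs, including DeepSeek.
      return { type: "json_object" };
    case "plain":
      // Rely on the prompt alone to coax out valid JSON.
      return undefined;
  }
}
```

Defaulting to "json", or falling back to it when a backend rejects json_schema with a 422, would match jeheda's suggestion above.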


@maidou-00 commented on GitHub (Mar 26, 2025):

> same issue as @maidou-00 when only using INFERENCE_SUPPORTS_STRUCTURED_OUTPUT=false if i change undefined to { type: "json_object" } it works
>
> 2025-03-26T07:51:04.518Z info: [inference][65] Inferring tag for bookmark "ilh0kyirto2n824agmmb2obm" used 3079 tokens and inferred: Pierre Boulez,Contemporary Classical Music,Avant-garde Music,Music Conductors,IRCAM

Glad to hear it! I am going to have to wait for the dev team to release a new update, as I am terrible with coding. In the meantime, the "change layout" function also seems problematic; I will open a new issue.


@maidou-00 commented on GitHub (Mar 26, 2025):

> ok, over the weekend I'll send a hotfix to allow enabling the json mode back. So basically we'll have three modes: 1. structured outputs. 2. Json mode. 3. No schema at all. Would happily accept a PR if someone wants to send it earlier.

Thanks Mohamed and the team for this great project and the continuous effort!


@MohamedBassem commented on GitHub (Mar 26, 2025):

let's keep the issue open until the fix is there


@maidou-00 commented on GitHub (Mar 30, 2025):

> let's keep the issue open until the fix is there

Hi Mohamed, I am wondering if the hotfix has been released?


@wouldntyouknow commented on GitHub (Mar 31, 2025):

I won't open another thread as this might be related, so putting it here for now: I am having the same issue with Ollama. In the past it worked perfectly, and I didn't change anything in my compose file; it still points properly to Ollama. My last two hoards from the past days fail to get AI tags. Here is the latest output:

2025-03-31T21:07:10.698Z info: [inference][481] Starting an inference job for bookmark with id "v7glx3aqni320g912r0gypxy"
2025-03-31T21:07:10.702Z error: [inference][481] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:13502:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:135:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:252:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:4271)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:5305)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)


@maidou-00 commented on GitHub (Apr 9, 2025):

@MohamedBassem just upgraded to the new Karakeep and loving it! Congrats on the rebrand! However, I noticed that AI tagging is still not functioning. Wondering if it will be fixed in the next update? Thanks!


@MohamedBassem commented on GitHub (Apr 13, 2025):

I'll get to this today, sorry for the delay folks.


@MohamedBassem commented on GitHub (Apr 13, 2025):

Hey folks, sorry this took so long. The nightly build will contain the fix (it'll be ready in ~15 mins). To restore the old behavior, remove INFERENCE_SUPPORTS_STRUCTURED_OUTPUT (important) from the env file and replace it with:

INFERENCE_OUTPUT_SCHEMA=json

The docs for this new flag are here: https://docs.karakeep.app/next/configuration#inference-configs-for-automatic-tagging
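For reference, my reading of the new flag, with the three values assumed from the three modes described above (check the linked docs for the authoritative list):

```
# structured: OpenAI structured outputs (json_schema) - the 0.23 default
# json:       classic JSON mode (json_object) - the pre-0.23 behavior
# plain:      no response_format constraint at all
INFERENCE_OUTPUT_SCHEMA=json
```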


@maidou-00 commented on GitHub (Apr 14, 2025):

@MohamedBassem unfortunately it is still not working on my end...

  1. I have redeployed Karakeep and "Server Stats" is showing "Server Version" 0.23.2.
     (screenshot of the Server Stats panel)

  2. I have never included INFERENCE_SUPPORTS_STRUCTURED_OUTPUT as an env variable, but I have included INFERENCE_OUTPUT_SCHEMA=json in Portainer as you suggested.
     (screenshot of the Portainer env variables)

  3. AI summarize is working. I use DeepSeek for both summarize and tagging; it was fine before.

  4. Restarted multiple times; tagging is still not working. The following are the logs:

Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)

2025-04-14T01:22:38.589Z info: [inference][2763] Starting an inference job for bookmark with id "nxm18x5arkthrq4e5gulscqk"
2025-04-14T01:22:38.652Z error: [inference][2763] inference job failed: Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
Error: 422 Failed to deserialize the JSON body into the target type: response_format: response_format.type `json_schema` is unavailable now at line 32 column 1
    at APIError.generate (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/error.js:60:20)
    at OpenAI.makeStatusError (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:299:33)
    at OpenAI.makeRequest (/app/apps/workers/node_modules/.pnpm/openai@4.86.1_zod@3.22.4/node_modules/openai/core.js:343:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async OpenAIInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:2157)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)


@maidou-00 commented on GitHub (Apr 14, 2025):

@jeheda @wouldntyouknow any luck on your end?


@stark1tty commented on GitHub (Apr 14, 2025):

No luck here either. Was going a bit mad, but it looks like others have the issue too. @maidou-00


@maidou-00 commented on GitHub (Apr 14, 2025):

@stark1tty Thanks for the reply, guess we will have to wait for the dev team to look into this again


@MohamedBassem commented on GitHub (Apr 14, 2025):

Folks, the fix is in the nightly build; it's not in the stable release yet. This means you need to change the image to use the "latest" tag instead of the "release" tag.
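For anyone on Docker Compose, that's a one-line change followed by a pull and restart. The service name and image path below are my assumption of a typical karakeep compose file, not copied from the official one; adjust to match yours:

```yaml
services:
  web:
    # "release" tracks the stable tag; "latest" tracks the nightly build
    image: ghcr.io/karakeep-app/karakeep:latest
```

Then run docker compose pull && docker compose up -d to pick up the new image.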


@stark1tty commented on GitHub (Apr 14, 2025):

Edit: Got it, just had to increase the timeout length. Cheers! Thanks.


@maidou-00 commented on GitHub (Apr 14, 2025):

My bad. It is working now :) Thanks Mohamed!

> Folks, the fix is in the nightly build. It's not in the stable release yet. This means that you need to change the image to use the "latest" tag instead of "release" tag.


@wouldntyouknow commented on GitHub (Apr 14, 2025):

> @jeheda @wouldntyouknow any luck on your end?

No, unfortunately. Might be a different error, I don't know :)

2025-04-14T11:44:31.075Z info: [inference][585] Starting an inference job for bookmark with id "iqa1jj77jli8yyr0kvkglu54"
2025-04-14T11:44:31.079Z error: [inference][585] inference job failed: TypeError: fetch failed
TypeError: fetch failed
    at node:internal/deps/undici/undici:13502:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:135:20)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:252:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@karakeep+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@karakeep/shared/inference.ts:2:4500)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@karakeep+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@karakeep/shared/inference.ts:2:5585)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)