[GH-ISSUE #1904] [0.27][Proxmox/Linux Installation] Pages not refreshing anymore / Workers are not running after upgrading to 0.27 #1180

Closed
opened 2026-03-02 11:55:34 +03:00 by kerem · 23 comments
Owner

Originally created by @CrazyWolf13 on GitHub (Sep 7, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/1904

Describe the Bug

@MohamedBassem
After updating to 0.27, my pages are not updating, they just stay in queued mode and nothing more happens.

I have a local AI (ollama) connected, but that config worked perfectly fine before the update

Steps to Reproduce

  1. Add a new bookmark
  2. See that it never gets crawled and stays in queued (it does not fail)

Expected Behaviour

I expect it to be crawled and to appear as before.

Screenshots or Additional Context

(Two screenshots attached; see the original GitHub issue.)

Device Details

No response

Exact Karakeep Version

0.27

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem
kerem 2026-03-02 11:55:34 +03:00
Author
Owner

@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 can you share the logs? Also how are you deploying karakeep? Is it proxmox?

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem

Sep 07 20:02:42 karakeep systemd[1]: karakeep-workers.service: Scheduled restart job, restart counter is at 3.
Sep 07 20:02:42 karakeep systemd[1]: Started karakeep-workers.service - karakeep Workers.
Sep 07 20:02:42 karakeep node[60687]: node:internal/modules/cjs/loader:1386
Sep 07 20:02:42 karakeep node[60687]:   throw err;
Sep 07 20:02:42 karakeep node[60687]:   ^
Sep 07 20:02:42 karakeep node[60687]: Error: Cannot find module '/opt/karakeep/apps/workers/dist/index.mjs'
Sep 07 20:02:42 karakeep node[60687]:     at Function._resolveFilename (node:internal/modules/cjs/loader:1383:15)
Sep 07 20:02:42 karakeep node[60687]:     at defaultResolveImpl (node:internal/modules/cjs/loader:1025:19)
Sep 07 20:02:42 karakeep node[60687]:     at resolveForCJSWithHooks (node:internal/modules/cjs/loader:1030:22)
Sep 07 20:02:42 karakeep node[60687]:     at Function._load (node:internal/modules/cjs/loader:1192:37)
Sep 07 20:02:42 karakeep node[60687]:     at TracingChannel.traceSync (node:diagnostics_channel:322:14)
Sep 07 20:02:42 karakeep node[60687]:     at wrapModuleLoad (node:internal/modules/cjs/loader:237:24)
Sep 07 20:02:42 karakeep node[60687]:     at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:171:5)
Sep 07 20:02:42 karakeep node[60687]:     at node:internal/main/run_main_module:36:49 {
Sep 07 20:02:42 karakeep node[60687]:   code: 'MODULE_NOT_FOUND',
Sep 07 20:02:42 karakeep node[60687]:   requireStack: []
Sep 07 20:02:42 karakeep node[60687]: }
Sep 07 20:02:42 karakeep node[60687]: Node.js v22.19.0
Sep 07 20:02:42 karakeep systemd[1]: karakeep-workers.service: Main process exited, code=exited, status=1/FAILURE
Sep 07 20:02:42 karakeep systemd[1]: karakeep-workers.service: Failed with result 'exit-code'

Yes, I'm from community-scripts team :)

Author
Owner

@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 oh yeah, I gave a heads up to the community-scripts team an hour ago (https://github.com/community-scripts/ProxmoxVE/issues/7457) about this. Really sorry for the hassle. Let me pin the issue in this repo until it's fixed.

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem Oh well, guess I looked too early at our repo :P

Will look into a fix :)

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem Quickfix was just merged, thanks for the heads up :)

This means that from now on, community-scripts users who update won't see this issue, so you can safely unpin it.

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem For me the issue does not seem resolved; the links still don't get processed:

Sep 07 21:44:05 karakeep node[140]: page.goto: Timeout 30000ms exceeded.
Sep 07 21:44:05 karakeep node[140]: Call log:
Sep 07 21:44:05 karakeep node[140]:   - navigating to "https://google.com/", waiting until "domcontentloaded"
Sep 07 21:44:05 karakeep node[140]:     at crawlPage (/opt/karakeep/apps/workers/dist/index.js:43304:31)
Sep 07 21:44:05 karakeep node[140]:     at async crawlAndParseUrl (/opt/karakeep/apps/workers/dist/index.js:43588:18)
Sep 07 21:44:05 karakeep node[140]:     at async Object.runCrawler (/opt/karakeep/apps/workers/dist/index.js:43705:25)
Sep 07 21:44:05 karakeep node[140]:     at async Runner.runOnce (/opt/karakeep/node_modules/liteque/dist/index.js:229:4)
Sep 07 21:44:05 karakeep node[140]: 2025-09-07T19:44:05.528Z info: [Crawler][2899] Will crawl "https://google.com" for link with id "ocbyi6j5kz38614r2px4qnyy"
Sep 07 21:44:05 karakeep node[140]: 2025-09-07T19:44:05.528Z info: [Crawler][2899] Attempting to determine the content-type for the url https://google.com
Sep 07 21:44:05 karakeep node[140]: 2025-09-07T19:44:05.701Z info: [Crawler][2899] Content-type for the url https://google.com is "text/html"
Sep 07 21:44:05 karakeep node[140]: 2025-09-07T19:44:05.934Z info: [Crawler][2899] Navigating to "https://google.com"
Sep 07 21:44:23 karakeep node[140]: 2025-09-07T19:44:23.818Z info: [ruleEngine][2916] Completed successfully
Sep 07 21:44:24 karakeep node[140]: 2025-09-07T19:44:24.635Z info: [webhook][2918] Starting a webhook job for bookmark with id "eef0izzmlz7j72gccrpfl09y for operation "created"
Sep 07 21:44:24 karakeep node[140]: 2025-09-07T19:44:24.635Z info: [webhook][2918] Completed successfully
Sep 07 21:44:24 karakeep node[140]: 2025-09-07T19:44:24.703Z info: [search][2917] Attempting to index bookmark with id eef0izzmlz7j72gccrpfl09y ...
Sep 07 21:44:24 karakeep node[140]: 2025-09-07T19:44:24.920Z info: [search][2917] Completed successfully
Sep 07 21:44:29 karakeep node[140]: 2025-09-07T19:44:29.735Z error: [inference][2914] inference job failed: Error: Timeout
Sep 07 21:44:29 karakeep node[140]: Error: Timeout
Sep 07 21:44:29 karakeep node[140]:     at Timeout._onTimeout (file:///opt/karakeep/node_modules/liteque/dist/index.js:231:28)
Sep 07 21:44:29 karakeep node[140]:     at listOnTimeout (node:internal/timers:588:17)
Sep 07 21:44:29 karakeep node[140]:     at process.processTimers (node:internal/timers:523:7)
Sep 07 21:44:29 karakeep node[140]: 2025-09-07T19:44:29.786Z info: [inference][2914] Starting a summary job for bookmark with id "b3uhzeynr10s2qrqiiusv19o"
Sep 07 21:44:36 karakeep node[140]: 2025-09-07T19:44:36.056Z error: [Crawler][2899] Crawling job failed: TimeoutError: page.goto: Timeout 30000ms exceeded.
Sep 07 21:44:36 karakeep node[140]: Call log:
Sep 07 21:44:36 karakeep node[140]:   - navigating to "https://google.com/", waiting until "domcontentloaded"
Sep 07 21:44:36 karakeep node[140]: page.goto: Timeout 30000ms exceeded.
Sep 07 21:44:36 karakeep node[140]: Call log:
Sep 07 21:44:36 karakeep node[140]:   - navigating to "https://google.com/", waiting until "domcontentloaded"
Sep 07 21:44:36 karakeep node[140]:     at crawlPage (/opt/karakeep/apps/workers/dist/index.js:43304:31)
Sep 07 21:44:36 karakeep node[140]:     at async crawlAndParseUrl (/opt/karakeep/apps/workers/dist/index.js:43588:18)
Sep 07 21:44:36 karakeep node[140]:     at async Object.runCrawler (/opt/karakeep/apps/workers/dist/index.js:43705:25)
Sep 07 21:44:36 karakeep node[140]:     at async Runner.runOnce (/opt/karakeep/node_modules/liteque/dist/index.js:229:4)
Sep 07 21:44:36 karakeep node[140]: 2025-09-07T19:44:36.101Z info: [Crawler][2899] Will crawl "https://google.com" for link with id "ocbyi6j5kz38614r2px4qnyy"
Sep 07 21:44:36 karakeep node[140]: 2025-09-07T19:44:36.101Z info: [Crawler][2899] Attempting to determine the content-type for the url https://google.com
Sep 07 21:44:36 karakeep node[140]: 2025-09-07T19:44:36.417Z info: [Crawler][2899] Content-type for the url https://google.com is "text/html"
Sep 07 21:44:36 karakeep node[140]: 2025-09-07T19:44:36.674Z info: [Crawler][2899] Navigating to "https://google.com"
Sep 07 21:44:59 karakeep node[140]: 2025-09-07T19:44:59.787Z error: [inference][2914] inference job failed: Error: Timeout
Sep 07 21:44:59 karakeep node[140]: Error: Timeout
Sep 07 21:44:59 karakeep node[140]:     at Timeout._onTimeout (file:///opt/karakeep/node_modules/liteque/dist/index.js:231:28)
Sep 07 21:44:59 karakeep node[140]:     at listOnTimeout (node:internal/timers:588:17)
Sep 07 21:44:59 karakeep node[140]:     at process.processTimers (node:internal/timers:523:7)
Sep 07 21:44:59 karakeep node[140]: 2025-09-07T19:44:59.833Z info: [inference][2914] Starting a summary job for bookmark with id "b3uhzeynr10s2qrqiiusv19o"
Sep 07 21:45:06 karakeep node[140]: 2025-09-07T19:45:06.791Z error: [Crawler][2899] Crawling job failed: TimeoutError: page.goto: Timeout 30000ms exceeded.
Sep 07 21:45:06 karakeep node[140]: Call log:
Sep 07 21:45:06 karakeep node[140]:   - navigating to "https://google.com/", waiting until "domcontentloaded"
Sep 07 21:45:06 karakeep node[140]: page.goto: Timeout 30000ms exceeded.
Sep 07 21:45:06 karakeep node[140]: Call log:
Sep 07 21:45:06 karakeep node[140]:   - navigating to "https://google.com/", waiting until "domcontentloaded"
Sep 07 21:45:06 karakeep node[140]:     at crawlPage (/opt/karakeep/apps/workers/dist/index.js:43304:31)
Sep 07 21:45:06 karakeep node[140]:     at async crawlAndParseUrl (/opt/karakeep/apps/workers/dist/index.js:43588:18)
Sep 07 21:45:06 karakeep node[140]:     at async Object.runCrawler (/opt/karakeep/apps/workers/dist/index.js:43705:25)
Sep 07 21:45:06 karakeep node[140]:     at async Runner.runOnce (/opt/karakeep/node_modules/liteque/dist/index.js:229:4)
Sep 07 21:45:06 karakeep node[140]: 2025-09-07T19:45:06.841Z info: [Crawler][2899] Will crawl "https://google.com" for link with id "ocbyi6j5kz38614r2px4qnyy"
Sep 07 21:45:06 karakeep node[140]: 2025-09-07T19:45:06.841Z info: [Crawler][2899] Attempting to determine the content-type for the url https://google.com
Sep 07 21:45:07 karakeep node[140]: 2025-09-07T19:45:07.009Z info: [Crawler][2899] Content-type for the url https://google.com is "text/html"
Sep 07 21:45:07 karakeep node[140]: 2025-09-07T19:45:07.238Z info: [Crawler][2899] Navigating to "https://google.com"
root@karakeep:~# 
Author
Owner

@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 is the server that this is running on healthy (not saturated on CPU or anything)?

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem Not really no

(Screenshot of resource usage attached; see the original GitHub issue.)

Though I made sure to bump resources to 5 GB RAM and 3 cores, but it's still not working :/

Author
Owner

@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 and is the chrome service healthy? The timeouts are very weird, especially with the inference job. Are you setting any proxy settings in the env variables? Anything special in your setup? And given that you bumped the resources, I assume restarts don't help?

Author
Owner

@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem
It is reported as healthy:

root@karakeep:~# systemctl status karakeep-browser
● karakeep-browser.service - karakeep Headless Browser
     Loaded: loaded (/etc/systemd/system/karakeep-browser.service; enabled; preset: enabled)
     Active: active (running) since Sun 2025-09-07 22:13:17 CEST; 5min ago
 Invocation: 4c6183ce6ead44a691040cb5f103a028
   Main PID: 131 (chromium)
      Tasks: 100 (limit: 38255)
     Memory: 623.7M (peak: 628.7M)
        CPU: 20.561s
     CGroup: /system.slice/karakeep-browser.service

Here the recent logs:

Sep 07 22:13:17 karakeep systemd[1]: Started karakeep-browser.service - karakeep Headless Browser.
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.513447:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.664504:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: DevTools listening on ws://127.0.0.1:9222/devtools/browser/da405ac0-9746-4760-95fe-da7a41d8ab88
Sep 07 22:13:17 karakeep chromium[131]: [131:131:0907/221317.904084:ERROR:dbus/object_proxy.cc:573] Failed to call method: org.freedesktop.DBus.NameHasOwner: object_path= /org/freedesktop/DBus: unknown error type:
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904403:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904426:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904436:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904443:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904450:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:248:0907/221317.904458:ERROR:dbus/bus.cc:408] Failed to connect to the bus: Could not parse server address: Unknown address type (examples of valid types are "tcp" and on UNIX "unix")
Sep 07 22:13:17 karakeep chromium[131]: [131:131:0907/221317.909821:ERROR:dbus/object_proxy.cc:573] Failed to call method: org.freedesktop.DBus.NameHasOwner: object_path= /org/freedesktop/DBus: unknown error type:
Sep 07 22:13:17 karakeep chromium[131]: [131:131:0907/221317.933966:ERROR:dbus/object_proxy.cc:573] Failed to call method: org.freedesktop.DBus.NameHasOwner: object_path= /org/freedesktop/DBus: unknown error type:
Sep 07 22:13:17 karakeep chromium[131]: [131:131:0907/221317.933994:ERROR:dbus/object_proxy.cc:573] Failed to call method: org.freedesktop.DBus.NameHasOwner: object_path= /org/freedesktop/DBus: unknown error type:
Sep 07 22:13:17 karakeep chromium[131]: [131:131:0907/221317.934256:ERROR:dbus/object_proxy.cc:573] Failed to call method: org.freedesktop.DBus.NameHasOwner: object_path= /org/freedesktop/DBus: unknown error type:
Sep 07 22:13:21 karakeep chromium[131]: [131:249:0907/221321.199759:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: PHONE_REGISTRATION_ERROR
Sep 07 22:13:21 karakeep chromium[131]: [131:249:0907/221321.200669:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: PHONE_REGISTRATION_ERROR
Sep 07 22:13:21 karakeep chromium[131]: [131:249:0907/221321.202067:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: PHONE_REGISTRATION_ERROR
Sep 07 22:13:46 karakeep chromium[131]: [131:249:0907/221346.897811:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: DEPRECATED_ENDPOINT
Sep 07 22:14:45 karakeep chromium[131]: [131:249:0907/221445.795465:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: DEPRECATED_ENDPOINT
Sep 07 22:16:17 karakeep chromium[795]: Warning: maxDynamicUniformBuffersPerPipelineLayout artificially reduced from 1000000 to 16 to fit dynamic offset allocation limit.
Sep 07 22:16:17 karakeep chromium[795]: Warning: maxDynamicStorageBuffersPerPipelineLayout artificially reduced from 1000000 to 16 to fit dynamic offset allocation limit.
Sep 07 22:16:17 karakeep chromium[131]: [131:131:0907/221617.730558:ERROR:services/on_device_model/public/cpp/service_client.cc:36] Unexpected on_device_model service disconnect: Error loading backend.
Sep 07 22:16:30 karakeep chromium[131]: [131:249:0907/221630.722373:ERROR:google_apis/gcm/engine/registration_request.cc:291] Registration response error message: DEPRECATED_ENDPOINT

We use this config:
https://github.com/community-scripts/ProxmoxVE/blob/main/install/karakeep-install.sh#L140C1-L152C4
It was done by @vhsdream maybe he has an idea?

I don't see anything special, given that it worked perfectly fine before the 0.27 upgrade.

No, restarting doesn't fix the issue.


@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 I meant in your own env variables. Maybe a bit too much to ask, but does the same problem repro on a fresh installation?


@MohamedBassem commented on GitHub (Sep 7, 2025):

@CrazyWolf13 also maybe do you want to join discord so that we can debug there faster?


@CrazyWolf13 commented on GitHub (Sep 7, 2025):

@MohamedBassem sure, I joined the Discord and started a thread.


@CrazyWolf13 commented on GitHub (Sep 7, 2025):

Resolved by adding the following:

```bash
CRAWLER_JOB_TIMEOUT_SEC=120
CRAWLER_NAVIGATE_TIMEOUT_SEC=100
```
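For readers hitting the same thing, a minimal sketch of applying those overrides, assuming your install reads dotenv-style environment variables (the `ENV_FILE` default below is a scratch path for demonstration only; point it at the env file your Karakeep services actually load):

```shell
#!/bin/sh
# Append the crawler timeout overrides to Karakeep's environment file.
# ENV_FILE defaults to a throwaway demo path; replace it with the real
# env file used by your install before running this for real.
ENV_FILE="${ENV_FILE:-/tmp/karakeep.env.example}"
cat >> "$ENV_FILE" <<'EOF'
CRAWLER_JOB_TIMEOUT_SEC=120
CRAWLER_NAVIGATE_TIMEOUT_SEC=100
EOF
echo "Wrote crawler overrides to $ENV_FILE"
# Then restart the workers so the new values take effect:
# systemctl restart karakeep-workers.service
```

After the restart, re-queueing a previously stuck bookmark from the UI is a quick way to confirm crawling works again.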

@vhsdream commented on GitHub (Sep 7, 2025):

@MohamedBassem @CrazyWolf13 For what it's worth, I'm seeing the same output with chromium but web crawling is still working for me, so I don't think the problem is with Chromium.

PS: you can suppress all that nasty output by adding `--log-level=3` to the `ExecStart=` line in the service file.

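If you'd rather not edit the unit file directly, the same suppression can go in a systemd drop-in; a sketch under the assumption that you substitute the name of whichever unit launches chromium on your install (the path below is a placeholder):

```ini
# /etc/systemd/system/<chromium-unit>.service.d/override.conf  (placeholder path)
[Service]
# An empty ExecStart= clears the inherited one; then re-declare it by pasting
# your unit's existing ExecStart line with --log-level=3 appended, e.g.:
ExecStart=
#ExecStart=/usr/bin/chromium <your existing flags> --log-level=3
```

Run `systemctl daemon-reload` afterwards so the drop-in is picked up.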

@abjeroen commented on GitHub (Sep 8, 2025):

same issue here. `systemctl` shows: `● karakeep-workers.service loaded failed failed`


@MohamedBassem commented on GitHub (Sep 8, 2025):

@abjeroen are you also using proxmox? The upgrade script should be fixed by now


@abjeroen commented on GitHub (Sep 8, 2025):

@MohamedBassem yes. I updated 3 hours ago. When I run the update script again, it says it's already on the correct version. Things were fine before the update process. Restarting was no use.


@MohamedBassem commented on GitHub (Sep 8, 2025):

@abjeroen what's the content of `/etc/systemd/system/karakeep-workers.service`?

If it says `ExecStart=/usr/bin/node dist/index.mjs`, you'll need to update this to `ExecStart=/usr/bin/node dist/index.js`, then run `systemctl daemon-reload` followed by `systemctl restart karakeep-workers.service`. This should hopefully fix it.

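That edit can also be scripted; a hedged sketch that rehearses the `.mjs` → `.js` substitution on a scratch copy first (the scratch file is purely for demonstration; the real unit path is the one quoted in this thread):

```shell
#!/bin/sh
# Rehearse the ExecStart fix on a scratch copy of the unit file.
SCRATCH=/tmp/karakeep-workers.service.demo
printf 'ExecStart=/usr/bin/node dist/index.mjs\n' > "$SCRATCH"
sed -i 's|dist/index\.mjs|dist/index.js|' "$SCRATCH"
cat "$SCRATCH"

# Apply the same substitution for real, then reload and restart:
# sudo sed -i 's|dist/index\.mjs|dist/index.js|' /etc/systemd/system/karakeep-workers.service
# sudo systemctl daemon-reload
# sudo systemctl restart karakeep-workers.service
```

Note that GNU `sed -i` (as on a Proxmox/Debian LXC) edits in place; always inspect the unit file afterwards before restarting.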

@abjeroen commented on GitHub (Sep 8, 2025):

hmm, mine already says `ExecStart=/usr/bin/node dist/index.js`. Still did a reload and restart, but no luck. Also, when I load Karakeep in the browser, it says: "Application error: a server-side exception has occurred while loading [ip] (see the server logs for more information)."


@MohamedBassem commented on GitHub (Sep 8, 2025):

@abjeroen if the web is not working as well then you have another problem. Can you share your logs maybe in a separate issue?


@abjeroen commented on GitHub (Sep 8, 2025):

update: I removed the LXC and installed it fresh again. Now the web works, but I still had the above issue. The workers file indeed said `*.mjs`, which I replaced with `*.js`. Now the AI tagging and the workers are both fixed. Thanks!


@CrazyWolf13 commented on GitHub (Sep 8, 2025):

@MohamedBassem @abjeroen Just submitted a fix for our install script.
https://github.com/community-scripts/ProxmoxVE/pull/7482
