[GH-ISSUE #1067] Failed to fetch link content #701

Closed
opened 2026-03-02 11:52:01 +03:00 by kerem · 1 comment

Originally created by @maidou-00 on GitHub (Feb 24, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/1067

Describe the Bug

Hello, first of all I'd like to thank the team for creating this awesome project; I enjoy using it very much.

I've noticed that most webpages take more than a couple of hours to save successfully. Some of them were eventually saved after a couple of hours, but others are still stuck on the "Failed to fetch link content" page. I've also tried Wallabag, and there the same pages save instantly. Any tips would be appreciated.

Steps to Reproduce

  1. Go to Hoarder.
  2. Try to save webpages such as "https://blog.ddsrem.com/archives/alist-xiaoya" or "https://blog.csdn.net/LinkSLA/article/details/129946162". I've tried saving various pages; most of them were problematic.
  3. Click on a saved page; nothing but "Failed to fetch link content ..." is shown.
  4. It takes hours for the pages to save successfully, and some pages are still stuck at "failed to fetch" even after hours of waiting.

Expected Behaviour

I expect the pages to save relatively quickly.

Screenshots or Additional Context

Here are some logs:

2025-02-24T08:46:12.271Z error: [Crawler][88] Crawling job failed: Error: The bookmark either doesn't exist or is not a link
Error: The bookmark either doesn't exist or is not a link
    at getBookmarkDetails (/app/apps/workers/workerUtils.ts:2:1575)
    at runNextTicks (node:internal/process/task_queues:65:5)
    at listOnTimeout (node:internal/timers:555:9)
    at process.processTimers (node:internal/timers:529:7)
    at async runCrawler (/app/apps/workers/crawlerWorker.ts:3:12089)
    at async Object.run (/app/apps/workers/utils.ts:2:940)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-02-24T08:46:14.147Z error: [Crawler][88] Crawling job failed: Error: The bookmark either doesn't exist or is not a link
    (stack trace identical to the first occurrence)
2025-02-24T08:46:15.490Z error: [Crawler][88] Crawling job failed: Error: The bookmark either doesn't exist or is not a link
    (stack trace identical to the first occurrence)
2025-02-24T08:46:16.804Z error: [Crawler][88] Crawling job failed: Error: The bookmark either doesn't exist or is not a link
    (stack trace identical to the first occurrence)
2025-02-24T08:46:18.747Z error: [Crawler][88] Crawling job failed: Error: The bookmark either doesn't exist or is not a link
    (stack trace identical to the first occurrence)

2025-02-24T08:46:21.526Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:46:21.526Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:46:21.735Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:46:24.045Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:46:29.047Z info: [Crawler][92] Finished waiting for the page to load.
2025-02-24T08:47:21.525Z error: [Crawler][92] Crawling job failed: Error: Timed-out after 60 secs
Error: Timed-out after 60 secs
    at Timeout._onTimeout (/app/apps/workers/utils.ts:2:1025)
    at listOnTimeout (node:internal/timers:594:17)
    at process.processTimers (node:internal/timers:529:7)

2025-02-24T08:47:23.236Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:47:23.237Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:47:23.421Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:47:25.997Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:47:30.998Z info: [Crawler][92] Finished waiting for the page to load.
2025-02-24T08:48:23.235Z error: [Crawler][92] Crawling job failed: Error: Timed-out after 60 secs
    (stack trace identical to the first time-out)

2025-02-24T08:48:24.756Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:48:24.757Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:48:24.958Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:48:27.109Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:48:32.110Z info: [Crawler][92] Finished waiting for the page to load.
2025-02-24T08:49:24.755Z error: [Crawler][92] Crawling job failed: Error: Timed-out after 60 secs
    (stack trace identical to the first time-out)

2025-02-24T08:49:26.808Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:49:26.808Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:49:27.001Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:49:29.107Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:49:34.108Z info: [Crawler][92] Finished waiting for the page to load.
2025-02-24T08:50:26.807Z error: [Crawler][92] Crawling job failed: Error: Timed-out after 60 secs
    (stack trace identical to the first time-out)

2025-02-24T08:50:27.954Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:50:27.955Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:50:28.135Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:50:30.063Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:50:35.063Z info: [Crawler][92] Finished waiting for the page to load.
2025-02-24T08:51:27.955Z error: [Crawler][92] Crawling job failed: Error: Timed-out after 60 secs
    (stack trace identical to the first time-out)

2025-02-24T08:51:29.538Z info: [Crawler][92] Will crawl "https://blog.ddsrem.com/archives/alist-xiaoya" for link with id "dmu24f8wozgo01tv3qjfoq23"
2025-02-24T08:51:29.538Z info: [Crawler][92] Attempting to determine the content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya
2025-02-24T08:51:29.739Z info: [Crawler][92] Content-type for the url https://blog.ddsrem.com/archives/alist-xiaoya is "text/html;charset=utf-8"
2025-02-24T08:51:31.916Z info: [Crawler][92] Successfully navigated to "https://blog.ddsrem.com/archives/alist-xiaoya". Waiting for the page to load ...
2025-02-24T08:51:36.918Z info: [Crawler][92] Finished waiting for the page to load.
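The log above shows a fixed pattern: each crawl attempt is raced against a 60-second deadline, fails with "Timed-out after 60 secs" when the deadline wins, and is then re-queued for another attempt. A minimal TypeScript sketch of that retry-with-timeout pattern follows; all names here are illustrative assumptions, not Karakeep's actual worker code.

```typescript
// Race a unit of work against a fixed deadline; whichever settles first wins.
function withTimeout<T>(work: Promise<T>, secs: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`Timed-out after ${secs} secs`)),
      secs * 1000,
    );
  });
  // Clear the timer either way so a fast success doesn't leave it pending.
  return Promise.race([work, deadline]).finally(() => clearTimeout(timer));
}

// Retry the attempt up to maxAttempts times, as the queue runner does when
// a crawl job fails; rethrow the last error once the retries are exhausted.
async function runWithRetries<T>(
  attempt: () => Promise<T>,
  maxAttempts: number,
  timeoutSecs: number,
): Promise<T> {
  let lastError: unknown;
  for (let i = 1; i <= maxAttempts; i++) {
    try {
      return await withTimeout(attempt(), timeoutSecs);
    } catch (err) {
      lastError = err; // logged as "Crawling job failed: ...", then retried
    }
  }
  throw lastError;
}
```

Under this model the repeated "Will crawl ... / Timed-out after 60 secs" pairs in the log are simply successive iterations of the retry loop against a page that never finishes loading.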

Device Details

No response

Exact Hoarder Version

nightly 0.22.0

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem
kerem commented 2026-03-02 11:52:01 +03:00

@maidou-00 commented on GitHub (Feb 25, 2025):

Hello, I've re-deployed the container and everything works fine now.
