[GH-ISSUE #466] [Crawler] Failed to connect to the browser instance, will retry in 5 secs #300

Closed
opened 2026-03-02 11:48:36 +03:00 by kerem · 2 comments

Originally created by @JMartJonesy on GitHub (Oct 4, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/466

I set up hoarder on my Synology NAS but am seeing crawler issues whenever I bookmark anything. I've set the containers up as a Project in Container Manager on Synology, using the compose.yml file given in the Installation guide.

Not sure if there is some extra setup I need for getting this to work on Synology.

```
2024/10/03 18:56:37	stdout	2024-10-04T01:56:37.056Z info: [Crawler] Successfully resolved IP address, new address: <REDACTED>
2024/10/03 18:56:02	stdout	2024-10-04T01:56:02.019Z info: [Crawler] Connecting to existing browser instance: http://chrome:9222
2024/10/03 18:55:57	stdout	2024-10-04T01:55:57.018Z error: [Crawler] Failed to connect to the browser instance, will retry in 5 secs
2024/10/03 18:53:36	stdout	2024-10-04T01:53:36.893Z info: [Crawler] Successfully resolved IP address, new address: <REDACTED>
2024/10/03 18:53:01	stdout	2024-10-04T01:53:01.862Z info: [Crawler] Connecting to existing browser instance: http://chrome:9222
2024/10/03 18:52:56	stdout	2024-10-04T01:52:56.861Z error: [Crawler] Failed to connect to the browser instance, will retry in 5 secs
2024/10/03 18:52:01	stdout	    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2024/10/03 18:52:01	stdout	    at node:internal/deps/undici/undici:12500:13
2024/10/03 18:52:01	stdout	MeiliSearchCommunicationError: fetch failed
2024/10/03 18:52:01	stdout	2024-10-04T01:52:01.143Z error: [search][10] search job failed: MeiliSearchCommunicationError: fetch failed
2024/10/03 18:50:58	stdout	    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2024/10/03 18:50:58	stdout	    at node:internal/deps/undici/undici:12500:13
2024/10/03 18:50:58	stdout	MeiliSearchCommunicationError: fetch failed
```
kerem closed this issue 2026-03-02 11:48:36 +03:00

@JMartJonesy commented on GitHub (Oct 4, 2024):

I figured out that I needed to allow my Docker container network's bridge IP range through the Synology firewall. It seems to be working now.
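For anyone debugging the same symptom: both errors in the logs (the crawler failing to reach `http://chrome:9222` and the `MeiliSearchCommunicationError: fetch failed`) point to the main container being unable to open TCP connections to its sibling services. A minimal connectivity probe like the sketch below can confirm which service is blocked before touching firewall rules. The hostnames and ports here are assumptions based on the stock compose file (`chrome` on 9222; Meilisearch on its default port 7700) and may differ in your setup:

```python
import socket


def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Service names/ports assumed from the stock karakeep compose file.
    for host, port in [("chrome", 9222), ("meilisearch", 7700)]:
        status = "reachable" if can_connect(host, port) else "BLOCKED"
        print(f"{host}:{port} -> {status}")
```

Run it from inside the web container (e.g. via `docker exec`); if a service shows as blocked there but is reachable from the host, a firewall rule filtering the Docker bridge subnet is the likely culprit.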


@rightsaidfred99 commented on GitHub (Dec 30, 2024):

What ports did you open up? I'm having the same issue but can't get past:

```
[cause]: o [MeiliSearchCommunicationError]: fetch failed
    at node:internal/deps/undici/undici:13484:13
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5) {
  errno: undefined,
  code: undefined
```