[GH-ISSUE #60] Something went wrong #57

Closed
opened 2026-03-02 11:46:08 +03:00 by kerem · 8 comments

Originally created by @normal-painter on GitHub (Apr 1, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/60

Hello! Thank you for the amazing work — this looks like a fantastic application. I am trying to get it running on unRAID with Docker Compose, but when I try to do so with the following compose file

```yaml
version: "3.8"
services:
  web:
    image: ghcr.io/mohamedbassem/hoarder-web:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    ports:
      - 3000:3000
    env_file:
      - settings.env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      DATA_DIR: /data
  redis:
    image: redis:7.2-alpine
    restart: unless-stopped
    volumes:
      - redis:/data
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
  meilisearch:
    image: getmeili/meilisearch:v1.6
    restart: unless-stopped
    env_file:
      - settings.env
    volumes:
      - meilisearch:/meili_data
  workers:
    image: ghcr.io/mohamedbassem/hoarder-workers:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - data:/data
    env_file:
      - settings.env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      # OPENAI_API_KEY: ...
    depends_on:
      web:
        condition: service_started

volumes:
  redis:
  meilisearch:
  data:
```

and the following env file

```
DATA_DIR=/mnt/user/appdata/hoarder/data
NEXTAUTH_SECRET=ee2r…F
HOARDER_VERSION=release
MEILI_MASTER_KEY=20…RPn3s8t1
```

I get an error stating "Something went wrong."

I’m not sure if it’s related, but in the logs I’m seeing the following error:

```
SqliteError: FOREIGN KEY constraint failed
```

Any thoughts on what I am doing wrong?

Thank you in advance!

kerem 2026-03-02 11:46:08 +03:00
  • closed this issue
  • added the question label

@MohamedBassem commented on GitHub (Apr 1, 2024):

Hey, I think I see the problem. `DATA_DIR` should point to the path inside the container and, in this case, should remain `/data` as-is.

What you want to do is to replace the volumes in the docker compose with the host path. For example:

```yaml
  web:
    image: ghcr.io/mohamedbassem/hoarder-web:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - /mnt/user/appdata/hoarder/data:/data
    ports:
      - 3000:3000
    env_file:
      - .env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      DATA_DIR: /data
```

Notice how the volumes section now points to your host path.

You'll need to change this for all the containers that have `volumes`:

- web: `/mnt/user/appdata/hoarder/data`
- redis: `/mnt/user/appdata/hoarder/redis`
- meilisearch: `/mnt/user/appdata/hoarder/meili`
- workers: `/mnt/user/appdata/hoarder/data`

Notice that web and workers should share the same path.
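Applied to the full compose file above, the bind-mount changes would look something like this (only the `volumes` keys shown; the host paths are the examples from this thread):

```yaml
services:
  web:
    volumes:
      - /mnt/user/appdata/hoarder/data:/data        # host path : container path
  redis:
    volumes:
      - /mnt/user/appdata/hoarder/redis:/data
  meilisearch:
    volumes:
      - /mnt/user/appdata/hoarder/meili:/meili_data
  workers:
    volumes:
      - /mnt/user/appdata/hoarder/data:/data        # same host path as web
```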

<!-- gh-comment-id:2029009680 -->

@normal-painter commented on GitHub (Apr 1, 2024):

Thank you! I now no longer get the same error. However, when I add a website it does not create a preview or pull the description.

In the web logs I see several errors that look like the following:

```
[next-auth][error][JWT_SESSION_ERROR] https://next-auth.js.org/errors#jwt_session_error
message: 'decryption operation failed'
stack: 'o: decryption operation failed
    at /app/apps/web/.next/server/chunks/917.js:30:67340
    at t.default (/app/apps/web/.next/server/chunks/917.js:30:67367)
    at f (/app/apps/web/.next/server/chunks/917.js:30:11132)
    at async a (/app/apps/web/.next/server/chunks/917.js:30:6802)
    at async a (/app/apps/web/.next/server/chunks/917.js:30:33791)
    at async Object.p [as decode] (/app/apps/web/.next/server/chunks/917.js:25:17267)
    at async Object.n (/app/apps/web/.next/server/chunks/917.js:25:6863)
    at async m (/app/apps/web/.next/server/chunks/917.js:1:46021)
    at async o (/app/apps/web/.next/server/chunks/917.js:25:20596)
    at async y (/app/apps/web/.next/server/chunks/193.js:1:18812)'
```

Also, in the chrome container logs I see:

```
[0401/222954.560675:ERROR:bus.cc(407)] Failed to connect to the bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory
[0401/222954.565558:ERROR:bus.cc(407)] Failed to connect to the bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory
[0401/222954.565629:ERROR:bus.cc(407)] Failed to connect to the bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory
[0401/222954.567840:WARNING:dns_config_service_linux.cc(427)] Failed to read DnsConfig.
[0401/222954.626966:INFO:policy_logger.cc(145)] :components/policy/core/common/config_dir_policy_loader.cc(118) Skipping mandatory platform policies because no policy file was found at: /etc/chromium/policies/managed
[0401/222954.626990:INFO:policy_logger.cc(145)] :components/policy/core/common/config_dir_policy_loader.cc(118) Skipping recommended platform policies because no policy file was found at: /etc/chromium/policies/recommended
DevTools listening on ws://0.0.0.0:9222/devtools/browser/e4fad952-70c9-405e-937f-eeb8cbdadee3
[0401/222954.643008:WARNING:bluez_dbus_manager.cc(248)] Floss manager not present, cannot set Floss enable/disable.
[0401/222954.663000:WARNING:sandbox_linux.cc(418)] InitializeSandbox() called with multiple threads in process gpu-process.
[0401/222954.699866:WARNING:dns_config_service_linux.cc(427)] Failed to read DnsConfig.
```

Thank you again for taking time to help!!

<!-- gh-comment-id:2030699971 -->

@MohamedBassem commented on GitHub (Apr 1, 2024):

What about the logs from the workers container? It’s the one that talks to chrome to fetch the previews.

<!-- gh-comment-id:2030708040 -->

@normal-painter commented on GitHub (Apr 1, 2024):

Here’s the whole log:

```
Corepack is about to download https://registry.npmjs.org/pnpm/-/pnpm-8.15.6.tgz.
(node:34) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)

> @hoarder/workers@0.1.0 start:prod /app/apps/workers
> tsx index.ts

2024-04-01T22:29:58.692Z info: Connecting to existing browser instance: http://chrome:9222
2024-04-01T22:29:58.711Z info: Successfully resolved IP address, new address: http://172.21.0.2:9222/
2024-04-01T22:30:00.290Z info: Starting crawler worker...
2024-04-01T22:30:00.292Z info: Starting inference worker...
2024-04-01T22:30:00.292Z info: Starting search indexing worker...
2024-04-01T22:30:47.844Z debug: [inference][1] No inference client configured, nothing to do now
2024-04-01T22:30:47.847Z info: [inference][1] Completed successfully
2024-04-01T22:30:48.014Z info: [search][1] Completed successfully
```

<!-- gh-comment-id:2030728094 -->

@MohamedBassem commented on GitHub (Apr 1, 2024):

The logs don’t show anything being crawled. Can you go to the admin panel and press the “crawl all bookmarks” button? If the previews still don’t appear afterwards, please share the new workers logs.

<!-- gh-comment-id:2030806235 -->

@normal-painter commented on GitHub (Apr 2, 2024):

Each time I press “reindex all bookmarks” I get the following:

```
2024-04-02T00:36:55.698Z info: [search][21] Completed successfully
2024-04-02T00:36:55.704Z info: [search][22] Completed successfully
2024-04-02T00:36:55.714Z info: [search][23] Completed successfully
```

I currently have 3 sites bookmarked (for testing purposes): unraid.net, hoarder.app, and reddit.com.

<!-- gh-comment-id:2030856988 -->

@MohamedBassem commented on GitHub (Apr 2, 2024):

Hmm, did you try restarting the workers container? I've seen some instances of it getting stuck, and fixing that is on my to-do list.

<!-- gh-comment-id:2031612282 -->

@normal-painter commented on GitHub (Apr 4, 2024):

I had a chance to try reinstalling it again and it works! Thank you for all your help. The reason it wasn’t fetching the previews is that I was adding websites as “www.reddit.com” or “hoarder.app”, without a scheme. Once I started adding links with http(s):// it worked as expected. Thank you!!
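For anyone hitting the same snag, the gist is that bookmarks need a full URL including the scheme. A minimal sketch of the kind of normalization a client could apply before submitting a link (this `normalize_bookmark_url` helper is hypothetical, not part of Hoarder/Karakeep):

```python
from urllib.parse import urlparse

def normalize_bookmark_url(raw: str) -> str:
    """Prepend https:// when the user omits the scheme.

    Hypothetical helper for illustration only, not part of Hoarder itself.
    """
    raw = raw.strip()
    # "www.reddit.com" parses with an empty scheme, so default it to https
    if not urlparse(raw).scheme:
        return f"https://{raw}"
    return raw

print(normalize_bookmark_url("www.reddit.com"))      # https://www.reddit.com
print(normalize_bookmark_url("http://unraid.net"))   # http://unraid.net (unchanged)
```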

<!-- gh-comment-id:2037428810 -->