[GH-ISSUE #53] Crawling job failed: {"code":"SQLITE_ERROR"} #49

Closed
opened 2026-03-02 11:46:05 +03:00 by kerem · 3 comments

Originally created by @keithstric on GitHub (Mar 30, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/53

Hi, I just got this set up with Ollama for inference. I'm seeing the above error in the `workers` container when I click the "Recrawl All Links" button on the admin page. I also get this series of errors when adding a note:

```
2024-03-30T18:38:18.740Z error: [inference][3] inference job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:18.742Z error: Something went wrong when marking the tagging status: SqliteError: no such table: bookmarks
2024-03-30T18:38:18.760Z error: [search][5] openai job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:19.245Z error: [inference][3] inference job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:19.246Z error: Something went wrong when marking the tagging status: SqliteError: no such table: bookmarks
2024-03-30T18:38:19.851Z error: [search][5] openai job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:20.250Z error: [inference][3] inference job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:20.250Z error: Something went wrong when marking the tagging status: SqliteError: no such table: bookmarks
2024-03-30T18:38:21.956Z error: [search][5] openai job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:25.971Z error: [search][5] openai job failed: SqliteError: no such table: bookmarks
2024-03-30T18:38:33.996Z error: [search][5] openai job failed: SqliteError: no such table: bookmarks
```

Here is my docker-compose file:

```yaml
version: "3.8"
services:
  web:
    image: ghcr.io/mohamedbassem/hoarder-web:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - /docker-data/hoarder/data:/data
    ports:
      - 81:3000
    env_file:
      - stackstack.env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      DATA_DIR: /data
  redis:
    image: redis:7.2-alpine
    restart: unless-stopped
    volumes:
      - /docker-data/hoarder/redis:/data
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:100
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
  meilisearch:
    image: getmeili/meilisearch:v1.6
    restart: unless-stopped
    env_file:
      - stack.env
    volumes:
      - /docker-data/hoarder/meilisearch:/meili_data
  workers:
    image: ghcr.io/mohamedbassem/hoarder-workers:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      - /docker-data/hoarder/workers:/data
    env_file:
      - stack.env
    environment:
      REDIS_HOST: redis
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      DATA_DIR: /data
      # OPENAI_API_KEY: ...
    depends_on:
      web:
        condition: service_started

volumes:
  redis:
  meilisearch:
  data:
```
kerem 2026-03-02 11:46:05 +03:00
  • closed this issue
  • added the `question` label

@MohamedBassem commented on GitHub (Mar 30, 2024):

/docker-data/hoarder/workers:/data

The data directory of the workers and the web app must be the same. They are both working on the same data.
Change the volume of the `workers` service to `/docker-data/hoarder/data:/data` and it should work.
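
Applying that suggestion, the `workers` service in the compose file above would become something like the following (a sketch; paths taken from the original post, all other keys unchanged):

```yaml
  workers:
    image: ghcr.io/mohamedbassem/hoarder-workers:${HOARDER_VERSION:-release}
    restart: unless-stopped
    volumes:
      # Must bind-mount the SAME host directory as the web service,
      # since both containers read and write one SQLite database under /data.
      - /docker-data/hoarder/data:/data
```

With separate mounts, the workers container starts with an empty data directory, which is why its queries fail with `no such table: bookmarks`.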


@keithstric commented on GitHub (Mar 30, 2024):

Thank you! That fixed almost all the issues. Once I got that far, I was able to figure out the rest. I had downloaded the `mistral:7b` model for Ollama, but for `INFERENCE_TEXT_MODEL` I had defined `mistral` instead of `mistral:7b`. I guess I should've paid more attention to the compose file I found and to the documentation.

Thanks for the quick turnaround on this.
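
For reference, the Ollama-related settings in `stack.env` would look something like this (illustrative; `OLLAMA_BASE_URL` and its value are an assumption about where Ollama is reachable from the containers):

```env
# stack.env — Ollama inference settings (sketch)
# Assumption: Ollama listens on the Docker host's default port 11434.
OLLAMA_BASE_URL=http://host.docker.internal:11434
# Must match the exact model tag pulled in Ollama ("ollama pull mistral:7b"),
# not the bare name "mistral".
INFERENCE_TEXT_MODEL=mistral:7b
```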


@MohamedBassem commented on GitHub (Mar 30, 2024):

Glad you managed to get it working!
