[GH-ISSUE #346] meilisearch: search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists. #224

Open
opened 2026-03-02 11:47:46 +03:00 by kerem · 9 comments
Owner

Originally created by @8BitGitHub on GitHub (Aug 8, 2024).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/346

Everything seems to work. Added a few links, and the OpenAI tagging works (with OpenAI, that is, but not LM Studio, though that's a different topic).

Getting this error below though. Home shows all the saved items. Search says "No bookmarks".

This is a plain vanilla install using official Docker on Ubuntu server.

workers-1 | 2024-08-08T06:08:58.929Z error: [search][24] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T06:09:00.052Z info: [search][25] Attempting to index bookmark with id vrjuf9dl17f4gpto6phh5g2c ...
meilisearch-1 | [2024-08-08T06:09:00Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 137 "-" "node" 0.006329
meilisearch-1 | [2024-08-08T06:09:00Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/71 HTTP/1.1" 200 303 "-" "node" 0.000198
meilisearch-1 | [2024-08-08T06:09:00Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/71 HTTP/1.1" 200 489 "-" "node" 0.000190
workers-1 | 2024-08-08T06:09:00.125Z error: [search][25] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T06:09:02.962Z info: [search][24] Attempting to index bookmark with id vrjuf9dl17f4gpto6phh5g2c ...
meilisearch-1 | [2024-08-08T06:09:02Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 137 "-" "node" 0.008883
meilisearch-1 | [2024-08-08T06:09:02Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/72 HTTP/1.1" 200 303 "-" "node" 0.000289
meilisearch-1 | [2024-08-08T06:09:03Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/72 HTTP/1.1" 200 489 "-" "node" 0.000204
workers-1 | 2024-08-08T06:09:03.037Z error: [search][24] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T06:09:04.168Z info: [search][25] Attempting to index bookmark with id vrjuf9dl17f4gpto6phh5g2c ...
meilisearch-1 | [2024-08-08T06:09:04Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 136 "-" "node" 0.007771
meilisearch-1 | [2024-08-08T06:09:04Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/73 HTTP/1.1" 200 302 "-" "node" 0.000753
meilisearch-1 | [2024-08-08T06:09:04Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/73 HTTP/1.1" 200 488 "-" "node" 0.000314
workers-1 | 2024-08-08T06:09:04.246Z error: [search][25] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
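For anyone debugging this, the task ids visible in the logs above (e.g. `/tasks/71`) can be queried directly to get Meilisearch's own error detail for the failed task. A minimal sketch, assuming the default port 7700 and the `MEILI_MASTER_KEY` from your `.env`; adjust host, port, and key to your compose setup (note the query parameter is `statuses` in Meilisearch v1.x but was `status` in older v0.x releases):

```shell
# List recently failed Meilisearch tasks (shows the underlying error message).
curl -s "http://localhost:7700/tasks?statuses=failed" \
  -H "Authorization: Bearer $MEILI_MASTER_KEY"

# Or fetch a single task by the id seen in the worker logs, e.g. task 71.
curl -s "http://localhost:7700/tasks/71" \
  -H "Authorization: Bearer $MEILI_MASTER_KEY"
```

The task's `error` object typically carries more context than the one-line `MDB_KEYEXIST` string the worker logs surface.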


@MohamedBassem commented on GitHub (Aug 8, 2024):

Please check the recommendation here: https://github.com/hoarder-app/hoarder/issues/342


@StackShard commented on GitHub (Aug 8, 2024):

Thank you for posting this! Had the same problem, and this worked for me. Went into /var/lib/docker/....meili..../_data, deleted everything inside, and started up the containers again; things work now.
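The same wipe can be done without digging through `/var/lib/docker` by removing the named Docker volume. A hedged sketch; the volume name `hoarder_meilisearch` is taken from a later comment in this thread and may differ in your setup, so list volumes first and substitute the real name:

```shell
# Stop the stack so the Meilisearch volume is no longer in use.
docker compose down

# Find the actual Meilisearch volume name for your compose project.
docker volume ls | grep -i meili

# Remove the index data (safe to rebuild; bookmarks live elsewhere).
docker volume rm hoarder_meilisearch

# Start again, then trigger a re-index from the admin panel.
docker compose up -d
```

This avoids hand-deleting files under Docker's internal storage paths.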


@MohamedBassem commented on GitHub (Aug 8, 2024):

weird that this problem is becoming more common lately. Are you doing anything special during installation?


@StackShard commented on GitHub (Aug 8, 2024):

> weird that this problem is becoming more common lately. Are you doing anything special during installation?

Me personally, no. Deployed the latest Ubuntu 24.x Server, updated repos, and installed official Docker from ... Docker. Pulled the YAML, added the .env with the 4-5 variables as suggested (i.e. private keys), and ran docker compose up. That's it; nothing else done, no separation of data volumes, etc. Normally I'd use Debian, so I'm not sure if that's different than usual. I see the other fellow above is also using Ubuntu.


@StackShard commented on GitHub (Aug 8, 2024):

> weird that this problem is becoming more common lately. Are you doing anything special during installation?

May have spoken too soon. Rebooted and started it all up again, but got the same error. Had to clear out the directory again and re-crawl to get it to work:

meilisearch-1 | [2024-08-08T16:29:33Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 137 "-" "node" 0.006005
meilisearch-1 | [2024-08-08T16:29:33Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/25 HTTP/1.1" 200 303 "-" "node" 0.000293
meilisearch-1 | [2024-08-08T16:29:33Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/25 HTTP/1.1" 200 488 "-" "node" 0.000190
workers-1 | 2024-08-08T16:29:33.587Z error: [search][54] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T16:29:34.705Z info: [search][55] Attempting to index bookmark with id lhw0461d4a5aed8vl3l0iduw ...
meilisearch-1 | [2024-08-08T16:29:34Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 137 "-" "node" 0.006847
meilisearch-1 | [2024-08-08T16:29:34Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/26 HTTP/1.1" 200 303 "-" "node" 0.002073
meilisearch-1 | [2024-08-08T16:29:34Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/26 HTTP/1.1" 200 489 "-" "node" 0.000178
workers-1 | 2024-08-08T16:29:34.788Z error: [search][55] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.


I added one link. It worked. I deleted it and tried to add another, same error. This is odd.


meilisearch-1 | [2024-08-08T16:35:58Z INFO actix_web::middleware::logger] 172.18.0.6 "DELETE /indexes/bookmarks/documents/lhw0461d4a5aed8vl3l0iduw HTTP/1.1" 202 128 "-" "node" 0.004890
meilisearch-1 | [2024-08-08T16:35:58Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/5 HTTP/1.1" 200 310 "-" "node" 0.000270
web-1 | s [TRPCError]: Bookmark not found
web-1 | at /app/apps/web/.next/server/chunks/838.js:1:8025
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async t (/app/apps/web/.next/server/chunks/673.js:4823:8930)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async t (/app/apps/web/.next/server/chunks/673.js:4823:9896)
web-1 | at async /app/apps/web/.next/server/app/api/trpc/[trpc]/route.js:1:4281
web-1 | at async Promise.all (index 1) {
web-1 | code: 'NOT_FOUND',
web-1 | [cause]: undefined
web-1 | }
meilisearch-1 | [2024-08-08T16:35:58Z INFO actix_web::middleware::logger] 172.18.0.5 "POST /indexes/bookmarks/search HTTP/1.1" 200 141 "-" "node" 0.000631
meilisearch-1 | [2024-08-08T16:35:58Z INFO index_scheduler::batch] document addition done: DocumentAdditionResult { indexed_documents: 0, number_of_documents: 0 }
meilisearch-1 | [2024-08-08T16:35:58Z INFO index_scheduler] A batch of tasks was successfully completed.
meilisearch-1 | [2024-08-08T16:35:58Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/5 HTTP/1.1" 200 346 "-" "node" 0.000316
workers-1 | 2024-08-08T16:35:58.373Z info: [search][60] Completed successfully
web-1 | s [TRPCError]: Bookmark not found
web-1 | at /app/apps/web/.next/server/chunks/838.js:1:8025
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async t (/app/apps/web/.next/server/chunks/673.js:4823:8930)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async a (/app/apps/web/.next/server/chunks/673.js:4823:9557)
web-1 | at async t (/app/apps/web/.next/server/chunks/673.js:4823:9896)
web-1 | at async /app/apps/web/.next/server/app/api/trpc/[trpc]/route.js:1:4281
web-1 | at async Promise.all (index 0) {
web-1 | code: 'NOT_FOUND',
web-1 | [cause]: undefined
web-1 | }
workers-1 | 2024-08-08T16:36:13.121Z info: [Crawler][16] Will crawl "https://hoarder.app" for link with id "zp06oirtfjb7jykr8o8x7fad"
workers-1 | 2024-08-08T16:36:13.122Z info: [Crawler][16] Attempting to determine the content-type for the url https://hoarder.app
workers-1 | 2024-08-08T16:36:13.126Z info: [search][61] Attempting to index bookmark with id zp06oirtfjb7jykr8o8x7fad ...
meilisearch-1 | [2024-08-08T16:36:13Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 136 "-" "node" 0.006874
meilisearch-1 | [2024-08-08T16:36:13Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/6 HTTP/1.1" 200 302 "-" "node" 0.000242
meilisearch-1 | [2024-08-08T16:36:13Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/6 HTTP/1.1" 200 488 "-" "node" 0.000347
workers-1 | 2024-08-08T16:36:13.208Z error: [search][61] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T16:36:13.356Z info: [Crawler][16] Content-type for the url https://hoarder.app is "text/html; charset=utf-8"
chrome-1 | [0808/163613.400919:WARNING:runtime_features.cc(728)] AttributionReportingCrossAppWeb cannot be enabled in this configuration. Use --enable-features=ConversionMeasurement,AttributionReportingCrossAppWeb in addition.
workers-1 | 2024-08-08T16:36:13.939Z info: [Crawler][16] Successfully navigated to "https://hoarder.app". Waiting for the page to load ...
workers-1 | 2024-08-08T16:36:14.227Z info: [search][61] Attempting to index bookmark with id zp06oirtfjb7jykr8o8x7fad ...
meilisearch-1 | [2024-08-08T16:36:14Z INFO actix_web::middleware::logger] 172.18.0.6 "POST /indexes/bookmarks/documents?primaryKey=id HTTP/1.1" 202 136 "-" "node" 0.007690
meilisearch-1 | [2024-08-08T16:36:14Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/7 HTTP/1.1" 200 302 "-" "node" 0.000155
meilisearch-1 | [2024-08-08T16:36:14Z INFO actix_web::middleware::logger] 172.18.0.6 "GET /tasks/7 HTTP/1.1" 200 488 "-" "node" 0.000173
workers-1 | 2024-08-08T16:36:14.299Z error: [search][61] search job failed: Error: Search task failed: internal: MDB_KEYEXIST: Key/data pair already exists.
workers-1 | 2024-08-08T16:36:15.003Z info: [Crawler][16] Finished waiting for the page to load.
workers-1 | 2024-08-08T16:36:15.267Z info: [Crawler][16] Finished capturing page content and a screenshot. FullPageScreenshot: false
workers-1 | 2024-08-08T16:36:15.286Z info: [Crawler][16] Will attempt to extract metadata from page ...
workers-1 | 2024-08-08T16:36:16.185Z info: [Crawler][16] Will attempt to extract readable content ...
workers-1 | 2024-08-08T16:36:16.733Z info: [Crawler][16] Done extracting readable content.
workers-1 | 2024-08-08T16:36:16.766Z info: [search][61] Attempting to index bookmark with id zp06oirtfjb7jykr8o8x7fad ...
workers-1 | 2024-08-08T16:36:16.774Z info: [Crawler][16] Stored the screenshot as assetId: e6a60992-df37-4d2d-bb2a-279ef919c809


@MohamedBassem commented on GitHub (Aug 8, 2024):

smells like some bug


@MohamedBassem commented on GitHub (Aug 8, 2024):

btw, when you wipe Meilisearch, you don't need to re-crawl; you can just re-index from the admin panel. It's much faster.


@wafflecomposite commented on GitHub (Oct 14, 2024):

Ubuntu 24, had to deal with `MDB_KEYEXIST: Key/data pair already exists` pretty much immediately after installation, as soon as I added the first link and found that search was not working.
I don't recall doing anything special during installation; just installed it per the manual and added the OpenAI key too.
Wiping `/var/lib/docker/volumes/hoarder_meilisearch/` and then re-indexing in the admin panel helped; search is working fine so far.

UPD: happened again just 4 hours later, after little to no usage.
UPD2: two weeks in, it hasn't happened again.


@treardon17 commented on GitHub (Oct 30, 2024):

I also just had this problem immediately after installation. Didn't change the docker compose file very much; just changed the port and configured the variables in the `.env` file. Wiping out the `hoarder_meilisearch` volume, restarting everything, and doing a re-index resolved the issue for me.
