[GH-ISSUE #1666] [FR] Don't mark "No content found for link" as a failed inference job #1036

Closed
opened 2026-03-02 11:54:33 +03:00 by kerem · 0 comments
Owner

Originally created by @pdc1 on GitHub (Jun 23, 2025).
Original GitHub issue: https://github.com/karakeep-app/karakeep/issues/1666

Describe the feature you'd like

Currently my database has ~200 bookmarks with no content, generally due to crawler errors (cookies, captchas, etc.).

Each time I run the "Regenerate AI Summaries for Failed Tags Only" admin job, it has to reprocess all those failed tags. Granted, it is very fast, but having 200+ failures in the queue makes it seem like there is a problem that needs addressing.

This does not seem to be a problem for AI summaries, so using the same approach might be helpful.
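The change being requested could look something like the following. This is a minimal, hypothetical sketch (not Karakeep's actual worker code; `classifyInferenceOutcome` and the `Bookmark` shape are invented for illustration): when the crawler produced no content, the inference job reports a distinct "skipped" outcome instead of a failure, so it is neither counted in the failure total nor picked up again by the "failed only" admin job.

```typescript
// Hypothetical sketch of the requested behavior; names and types are
// assumptions, not Karakeep's real API.

type JobOutcome = "success" | "failure" | "skipped";

interface Bookmark {
  id: string;
  // null/empty when the crawler could not fetch content (cookies, captcha, etc.)
  content: string | null;
}

function classifyInferenceOutcome(
  bookmark: Bookmark,
  inferenceError?: Error,
): JobOutcome {
  if (!bookmark.content || bookmark.content.trim() === "") {
    // Nothing for the AI to summarize or tag: report "skipped" rather than
    // "failure" so the bookmark is excluded from failed-job counts and from
    // "Regenerate ... for Failed Tags Only" reruns.
    return "skipped";
  }
  // Content exists: a real inference error is a genuine failure.
  return inferenceError ? "failure" : "success";
}
```

A skipped bookmark would naturally become eligible again once a successful recrawl fills in its content, which also addresses the question of whether AI tasks should be redone when the crawler is redone.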

Describe the benefits this would bring to existing Karakeep users

This would result in a more realistic view of failed tasks, and avoid reprocessing 200+ bookmarks that will produce the same error every time.

I don't know if AI tasks are automatically redone when the crawler is redone, but that would be something to consider.

Can the goal of this request already be achieved via other means?

Outside of direct database manipulation, I don't think so.

Have you searched for an existing open/closed issue?

  • I have searched for existing issues and none cover my fundamental request

Additional context

No response
