[PR #2358] fix: Add JSON instruction to default text tagging prompt before content insertion #2083

Open
opened 2026-03-02 12:00:29 +03:00 by kerem · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/karakeep-app/karakeep/pull/2358
Author: @ElectricTea
Created: 1/6/2026
Status: 🔄 Open

Base: main ← Head: patch-3


📝 Commits (1)

  • dcd759d fix: Add JSON instruction to default text tagging prompt before content insertion

📊 Changes

1 file changed (+1 addition, -0 deletions)

View changed files

📝 packages/shared/prompts.ts (+1 -0)

📄 Description

Added the following prompt instruction

- You must respond in JSON with the key "tags" and the value is an array of string tags.

to the default text tagging instructions, before the content insertion. This change significantly improves the rate at which the response contains structured JSON when the prompt is truncated for exceeding the model's maximum token limit, since the instruction now appears before the portion that gets cut off.

I kept the original JSON instruction at the end of the prompt as well, because it reminds the LLM to use the JSON structure after the content insertion and causes no issues.
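The positioning argument above can be sketched as follows. This is a hypothetical illustration, not the actual code in `packages/shared/prompts.ts`: the function name `buildTagPrompt`, the surrounding prompt text, and the character-based truncation are all assumptions made for the example. It shows why an instruction placed before the inserted content survives tail truncation while one placed only after it does not.

```typescript
// Hypothetical sketch (not the real karakeep code): the JSON instruction is
// duplicated before AND after the content insertion, so even if the prompt's
// tail is truncated to fit a token limit, the instruction still reaches the LLM.
const JSON_INSTRUCTION =
  '- You must respond in JSON with the key "tags" and the value is an array of string tags.';

function buildTagPrompt(content: string, maxChars: number): string {
  const header = [
    "You are a bot that tags the text it is given.", // assumed preamble
    JSON_INSTRUCTION, // new: placed before the content, survives truncation
  ].join("\n");
  const footer = JSON_INSTRUCTION; // original: kept after the content as a reminder
  const prompt = `${header}\nCONTENT:\n${content}\n${footer}`;
  // Simplified stand-in for token-limit truncation: drop the tail if too long.
  return prompt.length > maxChars ? prompt.slice(0, maxChars) : prompt;
}
```

With a long `content` and a small limit, the footer instruction is cut off, but the header copy remains, which is the behavior the PR is after.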


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
