mirror of
https://github.com/AJaySi/ALwrity.git
synced 2026-04-25 00:45:54 +03:00
[GH-ISSUE #214] Integrate Google caching for large context and prompt chaining #151
Originally created by @AJaySi on GitHub (Aug 21, 2025).
Original GitHub issue: https://github.com/AJaySi/ALwrity/issues/214
Originally assigned to: @AJaySi on GitHub.
Content strategy and content calendar generation use 270+ datapoints and a large amount of context data. The problem is simple: resending the same data on every request, and the large prompts themselves, drive up cost. The number of prompts required to generate the strategy, the calendar, and insights from the database is also considerable.
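To see why resending the shared context dominates cost, here is a rough token-cost model. All token counts and prices below are illustrative assumptions for the sketch, not ALwrity's actual numbers or Gemini's actual rates; the cached-token discount is also an assumption.

```python
# Rough token-cost model for repeated strategy/calendar prompts.
# Every constant here is an illustrative assumption, not a real rate.

CONTEXT_TOKENS = 100_000       # shared context (the 270+ datapoints)
PROMPT_TOKENS = 1_000          # per-call instruction on top of the context
CALLS = 10                     # strategy + calendar + insight prompts

PRICE_PER_TOKEN = 1.0          # normal input price (arbitrary unit)
CACHED_PRICE_PER_TOKEN = 0.25  # assumed discounted rate for cached tokens


def cost_without_cache() -> float:
    # The full context is re-sent, and re-billed, on every call.
    return CALLS * (CONTEXT_TOKENS + PROMPT_TOKENS) * PRICE_PER_TOKEN


def cost_with_cache() -> float:
    # The context is paid for once at the full rate, then billed at the
    # discounted cached rate on each subsequent call.
    first = (CONTEXT_TOKENS + PROMPT_TOKENS) * PRICE_PER_TOKEN
    rest = (CALLS - 1) * (CONTEXT_TOKENS * CACHED_PRICE_PER_TOKEN
                          + PROMPT_TOKENS * PRICE_PER_TOKEN)
    return first + rest


savings = 1 - cost_with_cache() / cost_without_cache()
print(f"estimated savings: {savings:.0%}")  # roughly two thirds under these assumptions
```

Under these assumed numbers the saving lands in the ballpark of the estimate below; the real figure depends on the actual cached-token discount, cache storage fees, and how many prompts reuse the same context before the cache expires.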
Implementing Google's context caching to store this shared context could therefore cut the AI API cost by roughly 60%.
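A minimal sketch of how this could look with the `google-generativeai` Python SDK's explicit caching. The helper name, model choice, display name, and 1-hour TTL are illustrative assumptions, not code from this repo; note also that explicit caches require the context to exceed a minimum token count, so very small contexts are rejected by the API.

```python
def create_strategy_model(context_docs, api_key):
    """Build a Gemini model whose large shared context lives in an
    explicit server-side cache, so each follow-up prompt pays the
    discounted cached-token rate for it instead of the full rate.

    All parameter values here are illustrative, not ALwrity's.
    """
    # Imports are kept inside the helper so this sketch loads even
    # where the SDK is not installed.
    import datetime
    import google.generativeai as genai
    from google.generativeai import caching

    genai.configure(api_key=api_key)

    # Upload the shared context once. It is billed at the full input
    # rate here, then at the cached rate on later calls until the TTL
    # expires.
    cache = caching.CachedContent.create(
        model="models/gemini-1.5-flash-001",
        display_name="alwrity-strategy-context",
        contents=context_docs,
        ttl=datetime.timedelta(hours=1),
    )
    return genai.GenerativeModel.from_cached_content(cached_content=cache)


# Usage (requires an API key and a context above the minimum cache size):
# model = create_strategy_model(strategy_context_text, API_KEY)
# strategy = model.generate_content("Draft the content strategy.")
# calendar = model.generate_content("Now build the content calendar.")
```

Chained prompts (strategy, then calendar, then insights) can all reuse the same cached context, which is exactly the "prompt chaining" case this issue targets.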