mirror of
https://github.com/AJaySi/ALwrity.git
synced 2026-04-25 08:55:58 +03:00
[GH-ISSUE #117] Take advantage of increased LLM context window #92
Originally created by @AJaySi on GitHub (Sep 2, 2024).
Original GitHub issue: https://github.com/AJaySi/ALwrity/issues/117
Originally assigned to: @AJaySi on GitHub.
RAG is dying as context windows keep increasing exponentially.
With 1M-token context windows, who needs RAG? Caching course material and doing vector search seems like the way to go.
We will need to implement concepts from this guide: https://ai.google.dev/gemini-api/docs/long-context
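The trade-off above can be sketched with a small heuristic: estimate whether the whole corpus fits in the model's context window, and only fall back to retrieval when it does not. This is a minimal sketch, not ALwrity code — the 1M-token limit matches Gemini Flash models, and the 4-characters-per-token ratio is a rough English-text heuristic, not a real tokenizer.

```python
# Assumed constants for illustration; swap in real model limits and a real
# tokenizer (e.g. the Gemini count_tokens endpoint) before relying on this.
CONTEXT_WINDOW_TOKENS = 1_000_000
CHARS_PER_TOKEN = 4  # rough average for English text


def estimate_tokens(text: str) -> int:
    """Rough token estimate without calling a tokenizer API."""
    return len(text) // CHARS_PER_TOKEN


def fits_in_context(documents: list[str], reserve_for_output: int = 8_192) -> bool:
    """True if all documents, plus headroom for the reply, fit in one prompt.

    When this returns True, the whole corpus can be cached/stuffed into the
    prompt; when False, RAG-style retrieval is still needed.
    """
    total = sum(estimate_tokens(d) for d in documents)
    return total + reserve_for_output <= CONTEXT_WINDOW_TOKENS


# ~3M characters of course material ≈ 750k tokens, which fits in 1M.
docs = ["x" * 1_000_000] * 3
print(fits_in_context(docs))  # True
```

In practice the decision should use the provider's own token counter, since character-based estimates drift badly on code and non-English text.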
@AJaySi commented on GitHub (Sep 24, 2024):
https://ai.google.dev/gemini-api/docs/long-context
@AJaySi commented on GitHub (Sep 24, 2024):
https://github.com/google-gemini/cookbook/blob/main/examples/Apollo_11.ipynb
@AJaySi commented on GitHub (Mar 13, 2025):
This needs attention for long form content generation.
@AJaySi commented on GitHub (Sep 25, 2025):
We have shifted to Gemini 2.5 Flash, where this feature is inherently provided and needs no extra instrumentation on our part.
I still need to test, but quality suffers a lot with longer content. Needs more experimentation.
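One common mitigation for quality degrading on long outputs — an assumption here, not a description of ALwrity's current pipeline — is to generate long-form content section by section from an outline, passing a short tail of the previous section along for continuity, so each model call stays in the length range where quality holds up. A minimal sketch with a stand-in model call:

```python
from typing import Callable


def generate_long_form(outline: list[str], generate: Callable[[str], str]) -> str:
    """Generate each outline section with a separate model call and stitch
    the results, feeding the tail of the previous section in for continuity.

    `generate` is any prompt -> text function (e.g. a wrapper around the
    Gemini SDK); a stub is used below so the sketch runs offline.
    """
    sections: list[str] = []
    for heading in outline:
        context = sections[-1][-500:] if sections else "the start of the article"
        prompt = f"Write the section titled '{heading}', consistent with: {context}"
        sections.append(f"## {heading}\n\n{generate(prompt)}")
    return "\n\n".join(sections)


# Usage with a stand-in model call:
fake_llm = lambda prompt: f"[draft for: {prompt[:40]}...]"
article = generate_long_form(["Intro", "Methods"], fake_llm)
print(article)
```

Whether this beats a single long-context call for ALwrity's content is exactly what the experimentation mentioned above would need to settle.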