mirror of
https://github.com/AJaySi/ALwrity.git
synced 2026-04-25 00:45:54 +03:00
[GH-ISSUE #112] Adding a local or cloud Database for ALwrity #426
No description provided.
Originally created by @AJaySi on GitHub (Aug 21, 2024).
Original GitHub issue: https://github.com/AJaySi/ALwrity/issues/112
Originally assigned to: @AJaySi, @DikshaDisciplines, @Om-Singh1808 on GitHub.
We are against a local DB only because it is difficult for non-technical users to set up and maintain. But a DB would make ALwrity much more powerful and help us add more features.
A cloud DB that works through APIs with zero setup for end users makes sense. That said, it is yet another API, although we will keep it optional.
ALwrity should require as few API keys as possible, and most are optional, depending on which features the end user wants.
Prisma seems like a good option; we are also looking at other options with generous free plans.
A DB will let us snapshot performance over time and remember our actions. It will help us learn about a given website, embed that knowledge, and perform site-wide or bulk operations.
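To illustrate the "snapshot performance over time" idea, here is a minimal sketch using Python's built-in `sqlite3`. The table and column names (`perf_snapshots`, `metric`, `captured_at`) are hypothetical, not ALwrity's actual schema:

```python
import sqlite3

# Illustrative only: one way a DB could record site performance over time.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE perf_snapshots (
        site_url    TEXT NOT NULL,
        metric      TEXT NOT NULL,   -- e.g. 'organic_clicks'
        value       REAL NOT NULL,
        captured_at TEXT NOT NULL    -- ISO-8601 date of the snapshot
    )
""")
rows = [
    ("https://example.com", "organic_clicks", 120.0, "2024-08-01"),
    ("https://example.com", "organic_clicks", 185.0, "2024-09-01"),
]
conn.executemany("INSERT INTO perf_snapshots VALUES (?, ?, ?, ?)", rows)

# Trend query: compare one metric across snapshots for a single site.
trend = conn.execute(
    "SELECT captured_at, value FROM perf_snapshots "
    "WHERE site_url = ? AND metric = ? ORDER BY captured_at",
    ("https://example.com", "organic_clicks"),
).fetchall()
```

With history stored like this, bulk or site-wide operations can be driven by queries over past snapshots rather than one-off scrapes.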
@AJaySi commented on GitHub (Aug 15, 2025):
In the new SaaS version, we are using PostgreSQL as a local DB, set up and maintained on your laptop, with minimal or no setup required.
We have integrated data points from the database and web scraping, feeding them to AI to generate personalized recommendations and content.
@AJaySi commented on GitHub (Oct 23, 2025):
A basic SQLite implementation is working. We need to move to PostgreSQL; there is a separate ticket for that.
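One way to keep the SQLite-to-PostgreSQL move low-friction is to isolate connection creation behind a single factory that selects the backend from a connection URL. A hedged sketch, assuming a `DATABASE_URL` environment variable; the helper name and URL scheme handling are illustrative, not ALwrity's actual code:

```python
import os
import sqlite3

def get_connection(url=None):
    """Return a DB-API connection chosen by URL.

    Hypothetical helper: SQLite by default (zero setup), PostgreSQL
    when a postgresql:// URL is configured.
    """
    url = url or os.getenv("DATABASE_URL", "sqlite:///:memory:")
    if url.startswith("postgresql://"):
        import psycopg2  # imported only when PostgreSQL is configured
        return psycopg2.connect(url)
    # Strip the sqlite:/// prefix to get the file path (":memory:" works too).
    return sqlite3.connect(url.removeprefix("sqlite:///"))

conn = get_connection()
conn.execute("CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)")
conn.execute("INSERT INTO kv VALUES ('backend', 'sqlite')")
row = conn.execute("SELECT v FROM kv WHERE k = 'backend'").fetchone()
```

Because callers only ever see a DB-API connection, switching the deployment to PostgreSQL becomes a configuration change rather than a code change (SQL dialect differences still need auditing separately).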