mirror of
https://github.com/AJaySi/ALwrity.git
synced 2026-04-26 01:15:52 +03:00
[GH-ISSUE #134] Custom OpenAI Endpoint - LiteLLM integration #101
Originally created by @gabelul on GitHub (Oct 14, 2024).
Original GitHub issue: https://github.com/AJaySi/ALwrity/issues/134
Originally assigned to: @gabelul, @AJaySi on GitHub.
It would be great if we could use our own OpenAI-compatible API endpoint.
@AJaySi commented on GitHub (Oct 14, 2024):
@gabelul
It would be great if you could give an example of it.
We can then evaluate its pros and cons and the implementation details.
-Thanks.
@gabelul commented on GitHub (Oct 14, 2024):
Well, I'm using LiteLLM to rotate through different providers. It's similar to using OpenRouter; they expose an OpenAI-compatible API endpoint. We would only need to be able to specify the API endpoint / base URL and maybe the model, so it's not a significant change code-wise.
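To illustrate the idea: LiteLLM's `completion()` accepts an `api_base` override, which is all that's needed to point calls at a custom OpenAI-compatible endpoint. A minimal sketch follows; the model names, function names, and example URL are illustrative assumptions, not ALwrity's actual code.

```python
from typing import Optional


def build_litellm_kwargs(model: str, api_base: Optional[str], prompt: str) -> dict:
    """Assemble the keyword arguments for LiteLLM's completion() call.

    LiteLLM accepts an `api_base` override, which is how a custom
    OpenAI-compatible endpoint (e.g. a local LiteLLM proxy or an
    OpenRouter-style gateway) would be targeted.
    """
    kwargs = {
        # Provider prefix (e.g. "openai/") tells LiteLLM how to route the call.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    if api_base:  # only override when the user configured a custom endpoint
        kwargs["api_base"] = api_base
    return kwargs


def generate_text(prompt: str,
                  model: str = "openai/gpt-4o-mini",
                  api_base: Optional[str] = None) -> str:
    """Hypothetical wrapper: route a single prompt through LiteLLM."""
    import litellm  # pip install litellm
    response = litellm.completion(**build_litellm_kwargs(model, api_base, prompt))
    return response.choices[0].message.content
```

So the UI would only need two extra settings (base URL and model name), and everything else stays on the OpenAI-compatible request shape.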
@AJaySi commented on GitHub (Oct 14, 2024):
@gabelul
May I request your help with this issue (pretty please)? I can help you with the present code layout and other gory details.
I am aware of LiteLLM, and it would be a great feature addition.
Excuse: we are busy building a Next.js version of ALwrity for non-technical content creators (the Streamlit version is meant for developers).
Regards.
@gabelul commented on GitHub (Oct 14, 2024):
@AJaySi Sure, I can give it a try.
@AJaySi commented on GitHub (Oct 14, 2024):
Thank you so much @gabelul
Just to get you started:
1) We call the appropriate LLMs based on config values from the sidebar or from main_config in ~/AI-Writer/lib/workspace/alwrity_config
2) The above logic is then placed in this file/dir:
3) Navigate to the text_generation folder and have a look at main_text_generation.py
Changes to this file should suffice for LiteLLM; you may write a new module file for the LiteLLM functions there.
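A sketch of what such a new module next to main_text_generation.py might look like, wiring the config values from step 1 into a LiteLLM call. The config file schema, key names, and defaults below are assumptions for illustration only, not ALwrity's actual layout.

```python
# Hypothetical new module (e.g. litellm_text_gen.py) alongside
# main_text_generation.py. Config keys are illustrative assumptions.
import json


def load_llm_config(config_path: str) -> dict:
    """Read provider/endpoint settings from a workspace config JSON file.

    Returns a small dict with the three values the LiteLLM call needs.
    """
    with open(config_path, encoding="utf-8") as f:
        cfg = json.load(f)
    return {
        "provider": cfg.get("llm_provider", "openai"),
        "model": cfg.get("llm_model", "gpt-4o-mini"),
        # None means "use the provider's default endpoint".
        "api_base": cfg.get("llm_api_base"),
    }


def litellm_text_response(prompt: str, cfg: dict) -> str:
    """Generate text via LiteLLM using the loaded config."""
    import litellm  # pip install litellm
    kwargs = {
        "model": f"{cfg['provider']}/{cfg['model']}",
        "messages": [{"role": "user", "content": prompt}],
    }
    if cfg["api_base"]:  # custom OpenAI-compatible endpoint, if configured
        kwargs["api_base"] = cfg["api_base"]
    response = litellm.completion(**kwargs)
    return response.choices[0].message.content
```

The main text-generation entry point could then branch to this module when the user has configured a custom endpoint, leaving the existing provider paths untouched.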
Hope this gets you started; otherwise, please post here.
Regards & Thanks.