Mirror of https://github.com/AJaySi/ALwrity.git (synced 2026-04-25 17:05:56 +03:00)
[PR #288] [CLOSED] Ollama local AI model support #616
📋 Pull Request Information
Original PR: https://github.com/AJaySi/ALwrity/pull/288
Author: @Ratna-Babu
Created: 10/11/2025
Status: ❌ Closed
Base: main ← Head: feature/ollama-local-ai-model-support

📝 Commits (4)
- d0c3e87 ollama-local-ai-model-support
- 3224035 Merge branch 'main' into feature/ollama-local-ai-model-support
- 92bccf5 fix: Critical bug fixes for Ollama integration
- 5425a23 Merge branch 'feature/ollama-local-ai-model-support' of https://github.com/Ratna-Babu/ALwrity into feature/ollama-local-ai-model-support

📊 Changes
16 files changed (+3566 additions, -49 deletions)
📝 .github/CONTRIBUTING.md (+58 -0)
📝 backend/env_template.txt (+7 -0)
📝 backend/requirements.txt (+1 -0)
📝 backend/services/api_key_manager.py (+45 -0)
📝 backend/services/llm_providers/main_text_generation.py (+177 -49)
➕ backend/services/llm_providers/ollama_provider.py (+556 -0)
➕ backend/services/llm_providers/smart_model_selector.py (+406 -0)
➕ backend/test/README.md (+0 -0)
➕ backend/test/README_OLLAMA_TESTS.md (+252 -0)
➕ backend/test/run_ollama_tests.py (+401 -0)
➕ backend/test/run_ollama_tests_fixed.py (+213 -0)
➕ backend/test/test_api_key_manager_ollama.py (+93 -0)
➕ docs/OLLAMA_DOCUMENTATION_SUMMARY.md (+307 -0)
➕ docs/OLLAMA_INTEGRATION_GUIDE.md (+341 -0)
➕ docs/OLLAMA_QUICK_REFERENCE.md (+75 -0)
➕ docs/OLLAMA_TESTING_GUIDE.md (+634 -0)

📄 Description
Pull Request: Ollama Local AI Model Integration #287
📝 Description
This PR implements comprehensive Ollama integration for ALwrity, enabling users to run open-source AI models locally on their laptops. This feature provides free access to AI models like LLaMA, Gemma, Qwen, and DeepSeek while ensuring complete data privacy and significant cost savings for content creators and digital marketing professionals.
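The PR's actual provider lives in backend/services/llm_providers/ollama_provider.py (556 lines, not shown here). As a minimal, self-contained sketch of what talking to a local Ollama server involves, the example below calls Ollama's standard REST endpoint (`/api/generate` on the default port 11434) using only the standard library. The function names and the `llama3` model are illustrative, not the PR's code:

```python
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for a single JSON response instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a one-shot generation request to a locally running Ollama server."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_BASE_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled, e.g. `ollama pull llama3`
    print(generate("llama3", "Write a one-line tagline for a blog about local AI."))
```

Because the model runs on the user's own machine, no prompt data leaves localhost, which is the privacy property the PR description emphasizes.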
Key Features
✅ Free Local AI Models
✅ Cost Optimization
✅ Task-Specific Model Selection
✅ User Abstraction
✅ GPT_PROVIDER Integration
✅ Data Privacy
⏳ Future Enhancements
❌ Ollama Cloud Integration
❌ LiteLLM Integration
This implementation addresses ALwrity's core mission of providing affordable, privacy-focused AI solutions for content creators while maintaining professional-grade functionality.
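The diff adds a smart_model_selector.py whose contents are not included in this page. To illustrate what the "Task-Specific Model Selection" feature might look like, here is a hypothetical sketch; the task names and model assignments are assumptions for illustration, not the PR's actual mapping:

```python
# Hypothetical task -> local-model mapping; the PR's smart_model_selector.py
# may use different tasks, models, or selection logic.
TASK_MODEL_MAP = {
    "blog_writing": "llama3",
    "seo_analysis": "qwen2.5",
    "code_generation": "deepseek-coder",
    "summarization": "gemma2",
}

DEFAULT_MODEL = "llama3"  # assumed fallback when a task is unrecognized

def select_model(task: str) -> str:
    """Pick a local Ollama model suited to the task, falling back to a default."""
    return TASK_MODEL_MAP.get(task, DEFAULT_MODEL)
```

This keeps model choice abstracted away from the user (the "User Abstraction" feature above): callers name a task, not a model.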
🔄 Type of Change
🎯 Related Issues
Implements local AI model support as outlined in ALwrity's roadmap for:
🧪 Testing
Test Results:
🏷️ Component/Feature
Which component or feature is affected?
📋 Checklist
ALwrity-Specific Checklist
🔍 Code Quality
📚 Documentation
Documentation Added:
docs/OLLAMA_INTEGRATION_GUIDE.md - Comprehensive setup and usage guide

🚀 Deployment Notes
Ollama Integration Deployment:
OLLAMA_BASE_URL to override the default localhost endpoint

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
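Per the deployment note, OLLAMA_BASE_URL overrides the default localhost endpoint. A minimal sketch of how a backend might resolve it (the function name is illustrative, not the PR's code):

```python
import os

def ollama_base_url() -> str:
    """Resolve the Ollama endpoint, honoring the OLLAMA_BASE_URL override."""
    # Ollama serves on port 11434 by default; the env var lets users point
    # at a remote machine or a non-default port instead.
    return os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
```

This mirrors the backend/env_template.txt change in the diff, which adds Ollama-related environment variables.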