[PR #66] [CLOSED] feat: add Ollama support for local model usage #99

Closed
opened 2026-02-27 07:20:25 +03:00 by kerem · 0 comments

📋 Pull Request Information

Original PR: https://github.com/KeygraphHQ/shannon/pull/66
Author: @Mr-Neutr0n
Created: 2/6/2026
Status: Closed

Base: `main` ← Head: `feat/ollama-support`


📝 Commits (1)

  • b0db38c feat: add Ollama support for local model usage

📊 Changes

3 files changed (+71 additions, -4 deletions)

View changed files

📝 .env.example (+10 -0)
📝 README.md (+41 -4)
📝 src/ai/router-utils.ts (+20 -0)

📄 Description

Summary

  • Add Ollama as an experimental router provider option, enabling users to run Shannon with local models for privacy or cost savings
  • Add OLLAMA_BASE_URL configuration to .env.example
  • Add helper functions isOllamaMode() and getProviderName() to router-utils.ts
  • Document Ollama setup in README with Docker considerations (using host.docker.internal)
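The helpers named above might look roughly like the following sketch. The PR does not show the actual implementation, so the bodies, the default provider name, and the reliance on `OLLAMA_BASE_URL` are assumptions for illustration only.

```typescript
// Hypothetical sketch of the helpers this PR adds to src/ai/router-utils.ts.
// Assumption: Ollama mode is considered active whenever OLLAMA_BASE_URL is set.

export function isOllamaMode(): boolean {
  return Boolean(process.env.OLLAMA_BASE_URL);
}

export function getProviderName(): string {
  // "anthropic" as the non-Ollama default is an assumption, not from the PR.
  return isOllamaMode() ? "ollama" : "anthropic";
}
```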

Test plan

  • Configure Ollama with a 70B+ model (e.g., llama3.3:70b)
  • Set OLLAMA_BASE_URL and ROUTER_DEFAULT in .env
  • Run Shannon with the ROUTER=true flag
  • Verify requests are routed to Ollama endpoint
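The test plan above might translate into `.env` settings like the following. The variable names come from the PR summary; the values (port, model tag, host) are illustrative assumptions, so check `.env.example` for the canonical format.

```shell
# Hypothetical .env snippet for local Ollama testing (values are examples).
OLLAMA_BASE_URL=http://localhost:11434          # inside Docker, the README suggests host.docker.internal
ROUTER_DEFAULT=llama3.3:70b                     # assumed value format; a 70B+ model per the test plan
```

Note that `host.docker.internal` resolves to the host machine from inside a Docker container, which is why the README calls it out: an Ollama server running on the host is not reachable via `localhost` from within the container.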

Closes #61


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
