[PR #28] [CLOSED] feat: Add local Ollama AI support with browser automation #77

Closed
opened 2026-02-27 07:20:20 +03:00 by kerem · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/KeygraphHQ/shannon/pull/28
Author: @vito1317
Created: 12/17/2025
Status: Closed

Base: main ← Head: main


📝 Commits (2)

  • 58a7fdc Update package.json
  • 7fcfdc1 feat: Add local Ollama AI support with browser automation

📊 Changes

7 files changed (+1117 additions, -13 deletions)


📝 README.md (+40 -0)
📝 package-lock.json (+17 -0)
📝 package.json (+2 -1)
📝 shannon.mjs (+63 -12)
➕ src/ai/ai-provider.js (+125 -0)
➕ src/ai/ollama-executor.js (+863 -0)
📝 src/cli/ui.js (+7 -0)

📄 Description

Summary

This PR adds support for running Shannon with local Ollama models, providing a cost-free alternative to Claude while maintaining full functionality including browser automation.

Motivation

  • Enable users to run Shannon without Claude API costs
  • Support air-gapped or privacy-sensitive environments
  • Allow testing with various open-source LLM models

Changes

New Files

  • src/ai/ollama-executor.js - Ollama execution engine with tool calling and retry logic
  • src/ai/ai-provider.js - AI provider abstraction layer for dynamic switching

Modified Files

  • shannon.mjs - CLI flag parsing and provider integration
  • src/cli/ui.js - Help message updates
  • README.md - Ollama setup documentation
  • package.json - Added `ollama` dependency
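The provider abstraction in `src/ai/ai-provider.js` could look something like the sketch below. All names here are illustrative, not taken from the diff; the point is only to show how a single entry point can route between Claude and a local Ollama backend based on the CLI flags.

```javascript
// Hypothetical sketch of a provider abstraction layer; the real
// interface is defined in src/ai/ai-provider.js in the PR diff.
class AIProvider {
  constructor({ useOllama = false, model, host } = {}) {
    this.useOllama = useOllama;
    // Defaults mirror the CLI defaults described in this PR.
    this.model = model ?? (useOllama ? 'qwen3:32b' : 'claude');
    this.host = host ?? 'http://localhost:11434';
  }

  // Route a prompt to whichever backend was selected at startup.
  async execute(prompt) {
    return this.useOllama ? this.#runOllama(prompt) : this.#runClaude(prompt);
  }

  async #runOllama(prompt) {
    // Placeholder: would delegate to the Ollama executor
    // (src/ai/ollama-executor.js) against this.host.
    return `[ollama:${this.model}@${this.host}] ${prompt}`;
  }

  async #runClaude(prompt) {
    // Placeholder: would call the Claude API path as before.
    return `[claude] ${prompt}`;
  }
}
```

The value of this shape is that the rest of the pipeline calls `execute()` without knowing which backend is active.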

New CLI Options

| Option | Description |
|--------|-------------|
| `--ollama` | Enable Ollama mode |
| `--ollama-model <model>` | Specify model (default: `qwen3:32b`) |
| `--ollama-host <url>` | Server URL (default: `http://localhost:11434`) |
| `--ollama-baseurl <url>` | Alias for `--ollama-host` |
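The flag semantics above can be sketched as a small parser. This is a hypothetical helper for illustration only; `shannon.mjs` does its own argv handling in the diff.

```javascript
// Illustrative parser for the new Ollama flags (hypothetical helper).
function parseOllamaFlags(argv) {
  const opts = {
    ollama: false,
    model: 'qwen3:32b',              // default model
    host: 'http://localhost:11434',  // default server URL
  };
  for (let i = 0; i < argv.length; i++) {
    switch (argv[i]) {
      case '--ollama':
        opts.ollama = true;
        break;
      case '--ollama-model':
        opts.model = argv[++i];
        break;
      case '--ollama-host':
      case '--ollama-baseurl': // alias for --ollama-host
        opts.host = argv[++i];
        break;
    }
  }
  return opts;
}
```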

Browser Automation Tools (9 tools)

  • browser_navigate - Navigate to URL
  • browser_click - Click elements
  • browser_type - Type text
  • browser_screenshot - Capture screenshots
  • browser_get_content - Get page content
  • browser_evaluate - Execute JavaScript
  • browser_fill_form - Fill forms
  • browser_wait - Wait for elements
  • browser_close - Close browser
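For local models these tools are exposed through function-style tool schemas, and the executor maps the model's tool calls back to real browser actions. The sketch below shows what one such declaration and a minimal dispatcher might look like; the names and shapes are illustrative assumptions, and the real definitions live in `src/ai/ollama-executor.js`.

```javascript
// Illustrative OpenAI-style function schema for one of the nine tools.
const browserNavigateTool = {
  type: 'function',
  function: {
    name: 'browser_navigate',
    description: 'Navigate the controlled browser to a URL',
    parameters: {
      type: 'object',
      properties: {
        url: { type: 'string', description: 'Absolute URL to open' },
      },
      required: ['url'],
    },
  },
};

// Hypothetical dispatcher: maps a model tool call back to a browser action.
async function dispatchToolCall(call, browser) {
  const args = call.function.arguments;
  switch (call.function.name) {
    case 'browser_navigate':
      return browser.navigate(args.url);
    default:
      throw new Error(`unknown tool: ${call.function.name}`);
  }
}
```

The remaining tools (`browser_click`, `browser_type`, and so on) would each add a schema entry and a `case` in the dispatcher.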

Usage

```bash
# Start Ollama
ollama serve

# Pull a model
ollama pull qwen3:14b

# Run Shannon with Ollama
./shannon.mjs "https://target.com" "/path/to/repo" --ollama --ollama-model qwen3:14b
```

---

<sub>🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.</sub>