[PR #15] [MERGED] Added support for local LLM and added fetching of available models #16

Closed
opened 2026-03-04 02:09:38 +03:00 by kerem · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/zlwaterfield/scramble/pull/15
Author: @Dobidop
Created: 5/25/2025
Status: Merged
Merged: 5/25/2025
Merged by: @zlwaterfield

Base: main ← Head: main


📝 Commits (2)

  • d75b4cd Updated support for local LLM
  • fb936f2 Add model fetching functionality for supported providers

📊 Changes

3 files changed (+299 additions, -30 deletions)

View changed files

📝 src/background.js (+83 -6)
📝 src/options.html (+16 -3)
📝 src/options.js (+200 -21)

📄 Description

  • Add LM Studio provider with OpenAI-compatible API integration

  • Enable Ollama provider (previously commented out) with improved error handling

  • Add optional API key support for remote Ollama instances

  • Improve error messages with connection troubleshooting tips

  • Add "Fetch Models" button for OpenAI, LM Studio, Ollama, OpenRouter, and Groq

  • Implement provider-specific model list endpoints with proper authentication

  • Add model dropdown that auto-populates from API responses
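The model-fetching flow above can be sketched as follows. This is a minimal illustration, not the PR's actual code: the provider names, the `needsKey` flags, and the helper functions are assumptions, though the endpoint paths (`/v1/models` for the OpenAI-compatible providers, `/api/tags` for Ollama, LM Studio's default port 1234 and Ollama's default port 11434) match the providers' documented defaults.

```javascript
// Hypothetical sketch of provider-specific model-list requests.
// OpenAI-compatible providers expose GET /v1/models; Ollama uses /api/tags.
const MODEL_ENDPOINTS = {
  openai:     { url: "https://api.openai.com/v1/models",      needsKey: true },
  groq:       { url: "https://api.groq.com/openai/v1/models", needsKey: true },
  openrouter: { url: "https://openrouter.ai/api/v1/models",   needsKey: false },
  lmstudio:   { url: "http://localhost:1234/v1/models",       needsKey: false },
  ollama:     { url: "http://localhost:11434/api/tags",       needsKey: false },
};

// Build the fetch URL and options for a provider. The API key is optional
// for local providers (e.g. a remote Ollama instance behind an auth proxy).
function buildModelRequest(provider, apiKey) {
  const entry = MODEL_ENDPOINTS[provider];
  if (!entry) throw new Error(`Unknown provider: ${provider}`);
  const headers = {};
  if (apiKey) {
    headers["Authorization"] = `Bearer ${apiKey}`;
  } else if (entry.needsKey) {
    throw new Error(`${provider} requires an API key`);
  }
  return { url: entry.url, options: { method: "GET", headers } };
}

// Normalize the two response shapes into a flat list of model names:
// OpenAI-style APIs return { data: [{ id }] }, Ollama returns { models: [{ name }] }.
function extractModelNames(provider, json) {
  if (provider === "ollama") return (json.models || []).map((m) => m.name);
  return (json.data || []).map((m) => m.id);
}
```

A "Fetch Models" handler would then call `buildModelRequest`, `fetch` the URL, pass the JSON through `extractModelNames`, and populate the dropdown's `<option>` elements; a failed fetch against a localhost URL is the natural place to surface the connection-troubleshooting tips mentioned above (e.g. "is LM Studio's local server running?").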


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
