[GH-ISSUE #1] Add Local LLM support #1

Closed
opened 2026-03-04 02:09:30 +03:00 by kerem · 11 comments
Owner

Originally created by @NurvX on GitHub (Sep 18, 2024).
Original GitHub issue: https://github.com/zlwaterfield/scramble/issues/1

Would love to have local LLM support through LM Studio or Ollama.

kerem closed this issue 2026-03-04 02:09:30 +03:00

@praserocking commented on GitHub (Sep 18, 2024):

Please plan to include local LLM support via https://lmstudio.ai/; it should be easy. It exposes an OpenAI-compatible API locally for any model, which makes it an easy integration point. I am pulling this extension and rewriting the API section to use localhost for my own use, but proper support needs to come through the options.
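
A minimal sketch of what such a request might look like, not the extension's actual code: LM Studio serves an OpenAI-style `/v1/chat/completions` endpoint, and port 1234 is its default at the time of writing; the model name below is a placeholder.

```python
# Hypothetical sketch of a request to LM Studio's OpenAI-compatible local
# server. Only the request object is built here; nothing is sent.
import json
import urllib.request


def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-style chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("http://localhost:1234", "local-model", "Rewrite this sentence.")
print(req.full_url)  # http://localhost:1234/v1/chat/completions
```

Since the wire format is the same as OpenAI's, pointing the extension's existing request code at a different base URL is essentially the whole integration, which is what the comment above is doing by hand.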

@Richard-Woessner commented on GitHub (Sep 18, 2024):

I think having a local LLM mimic the OpenAI API is the move, but I don't have much browser experience. Is it typical for a browser extension to depend on a running service like a local LLM?

@Richard-Woessner commented on GitHub (Sep 18, 2024):

But this also seems perfect for the job

https://github.com/mlc-ai/web-llm

I might try to plug it in later

@len-foss commented on GitHub (Sep 18, 2024):

Should be fixed by #5

@zlwaterfield commented on GitHub (Oct 14, 2024):

Ollama support is in there now, but the requests are returning 403s because of https://github.com/ollama/ollama/issues/6021, so I'm looking into how to resolve it.

@tcsenpai commented on GitHub (Oct 26, 2024):

  1. Ollama needs the OLLAMA_ORIGINS environment variable set to something that allows extensions to connect (I made my own extension for summarizing too, and I haven't found any other workaround)
  2. Anyway, Ollama works fine by using a custom OpenAI endpoint (e.g. localhost:11434/v1/chat/completions) and a random API key

I tested this on Firefox (Zen).
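
The second workaround above can be sketched as follows; this is a hypothetical example, assuming Ollama's default port 11434 and a placeholder model name, and again only builds the request rather than sending it.

```python
# Sketch of workaround 2: call Ollama through its OpenAI-compatible
# endpoint with a placeholder API key (the value only needs to be present).
# Workaround 1, for reference, is an environment variable set before
# starting the server, along the lines of:
#   OLLAMA_ORIGINS="chrome-extension://*,moz-extension://*" ollama serve
import json
import urllib.request


def ollama_chat_request(prompt, model="llama3"):
    """Build a chat request against Ollama's OpenAI-compatible endpoint."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        "http://localhost:11434/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer not-a-real-key",  # any non-empty value
        },
        method="POST",
    )
```

The dummy Authorization header exists only because OpenAI-style clients insist on one; Ollama itself does not validate it.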

@FlashSkyNews commented on GitHub (Jan 10, 2025):

Hey! I came across this project a few months ago because I was interested in a locally hosted Grammarly alternative. Is there an update on this issue?

@theJayTea commented on GitHub (Jan 27, 2025):

It looks like this cool repo isn't being updated anymore. I've made something somewhat similar with support for multiple local LLMs, Gemini, etc., and it works system-wide, not just in the browser. Feel free to check it out!

https://github.com/theJayTea/WritingTools

@Dobidop commented on GitHub (May 25, 2025):

Added support for Ollama and LM Studio in #15.
It works fine for me locally, so far at least.

@zlwaterfield commented on GitHub (May 25, 2025):

Merged in @Dobidop's PR.

@zlwaterfield commented on GitHub (May 25, 2025):

This should be good now, thanks everyone.
