[GH-ISSUE #2] Support for LLMs with a local backend #2

Open
opened 2026-03-03 16:44:28 +03:00 by kerem · 1 comment

Originally created by @nucleartux on GitHub (Sep 18, 2024).
Original GitHub issue: https://github.com/nucleartux/ai-grammar/issues/2

Ollama, for example.


@nucleartux commented on GitHub (Sep 25, 2024):

I implemented support for Ollama, but it currently works only with the `llama3.1` model.
I'm considering adding support for choosing between multiple models, but I'm not sure about the UX yet.
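For context, a minimal sketch of what talking to a local Ollama backend with a configurable model could look like. This is an illustration, not the extension's actual code: the function name, prompt wording, and default model are assumptions; the endpoint and request shape follow Ollama's documented `/api/generate` REST API.

```javascript
// Hypothetical helper: build a request against a local Ollama server's
// /api/generate endpoint, with the model name left configurable so the
// user could pick between installed models.
function buildOllamaRequest(text, model = "llama3.1") {
  return {
    url: "http://localhost:11434/api/generate",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model, // any model pulled locally, e.g. "llama3.1" or "mistral"
        prompt: `Fix the grammar of the following text:\n${text}`,
        stream: false, // return a single JSON response instead of a stream
      }),
    },
  };
}

// Usage (requires a running local Ollama server):
// const { url, options } = buildOllamaRequest("She go to school.", "mistral");
// const res = await fetch(url, options);
// const { response } = await res.json();
```

Letting the model default to `llama3.1` keeps the current behavior, while a settings dropdown populated from Ollama's `/api/tags` endpoint (which lists locally installed models) would be one way to resolve the model-selection UX question.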

<!-- gh-comment-id:2373170632 -->