mirror of
https://github.com/jehna/humanify.git
synced 2026-04-27 09:35:58 +03:00
[GH-ISSUE #58] Improve model download speed, progress display, etc #31
Originally created by @0xdevalias on GitHub (Aug 26, 2024).
Original GitHub issue: https://github.com/jehna/humanify/issues/58
I was reading the `node-llama-cpp` docs, and they mention that the `ipull` package can be useful for improved model download speeds.

I can see that the current `download` command calls `downloadModel`:

github.com/jehna/humanify@14e4ae5365/src/commands/download.ts (L1-L6)

Which is defined here, and currently seems to just use `fetch`, as well as implementing its own download progress tracking in `showProgress`:

github.com/jehna/humanify@14e4ae5365/src/local-models.ts (L38-L65)
github.com/jehna/humanify@14e4ae5365/src/progress.ts (L4-L13)

I wonder if using iPull might make sense, both for increased download speed and for better download progress visibility.
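For reference, the fetch-plus-manual-progress approach described above typically looks something like the sketch below. All names here (`formatProgress`, `trackedDownload`) are illustrative, not the actual humanify implementation:

```typescript
// Minimal sketch of fetch-based downloading with hand-rolled progress
// reporting, similar in spirit to downloadModel/showProgress.
// Names are illustrative, not humanify's actual code.

/** Format a progress line like "42.0% (420/1000 bytes)". */
function formatProgress(downloaded: number, total: number): string {
  const pct = total > 0 ? ((downloaded / total) * 100).toFixed(1) : "?";
  return `${pct}% (${downloaded}/${total} bytes)`;
}

/** Drain the response body, reporting cumulative byte counts as we go. */
async function trackedDownload(
  url: string,
  onProgress: (line: string) => void
): Promise<Uint8Array> {
  const res = await fetch(url);
  if (!res.ok || !res.body) throw new Error(`Download failed: ${res.status}`);
  const total = Number(res.headers.get("content-length") ?? 0);
  const chunks: Uint8Array[] = [];
  let downloaded = 0;
  const reader = res.body.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    downloaded += value.length;
    onProgress(formatProgress(downloaded, total));
  }
  return Buffer.concat(chunks);
}
```

The main limitations of this approach (and presumably why iPull is attractive) are that it downloads over a single connection and cannot resume a partial download.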
@0xdevalias commented on GitHub (Aug 26, 2024):
Partially related context:
@neoOpus commented on GitHub (Sep 4, 2024):
I've been downloading the 8B model for almost an hour now, so any improvement in this regard would be great. Alternatively, allow users to download the models themselves and then place them manually (I don't know where the model is located, as I haven't analysed the source code yet). Having them shared with other software like Ollama (via a symbolic link) would also be great.
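The symlink idea could be sketched as a small helper like the one below. This is hypothetical: the helper name and the example paths are illustrative, and only the `~/.humanifyjs/models` directory comes from the thread itself:

```typescript
// Hypothetical sketch: share an already-downloaded GGUF model with
// humanify via a symbolic link instead of downloading it twice.
import fs from "node:fs";
import path from "node:path";

function linkSharedModel(existingModel: string, modelsDir: string): string {
  fs.mkdirSync(modelsDir, { recursive: true });
  const dest = path.join(modelsDir, path.basename(existingModel));
  // Replace any stale link, then point the models dir at the real file.
  fs.rmSync(dest, { force: true });
  fs.symlinkSync(path.resolve(existingModel), dest);
  return dest;
}

// Example (paths illustrative):
// linkSharedModel(
//   "/home/me/models/llama-3.1-8b.Q4_K_M.gguf",
//   "/home/me/.humanifyjs/models"
// );
```

Note that the target path is resolved to an absolute path before linking, so the symlink stays valid regardless of where it is read from.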
@0xdevalias commented on GitHub (Sep 12, 2024):
~/.humanifyjs/models

github.com/jehna/humanify@14e4ae5365/src/local-models.ts (L13-L25)

@neoOpus commented on GitHub (Sep 12, 2024):
Yeah, I figured that out and updated to Phi 3.5, but the PR shows an error (unrelated, I guess, as many dependabot PRs are rejected as well).
My mistake (I am extremely exhausted lately and cannot focus)... Still, I want to make some progress with this and create a workflow that would let me pursue some new avenues in the near future: learning from some extensions how they operate internally, altering some, and analysing others for malware.
This is the error from the PR. I thought it would be a drop-in change from 3.1 to 3.5, but I guess I have to learn more about the differences between the tokenization of the two.
@0xdevalias commented on GitHub (Sep 30, 2024):
Some more relevant links/functions/etc that could be used here:
@0xdevalias commented on GitHub (Oct 20, 2024):
FYI: It looks like @jehna opted for directly using `ipull` in this PR:

Rather than the `createModelDownloader`/`combineModelDownloaders` abstractions from `node-llama-cpp` mentioned above in https://github.com/jehna/humanify/issues/58#issuecomment-2381885999
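For comparison, direct `ipull` usage would look roughly like the sketch below. This is hedged: the exact option names (`url`, `directory`, `cliProgress`) are assumptions based on my reading of ipull's docs and may differ between versions; the deferred import keeps the sketch self-contained.

```typescript
// Hypothetical sketch of downloading a model with ipull directly.
// Option names (url/directory/cliProgress) are assumptions and may
// differ between ipull versions.
async function downloadWithIpull(url: string, directory: string): Promise<void> {
  // Deferred import so ipull is only needed when actually called.
  // @ts-ignore -- ipull's types may not be installed for this sketch
  const { downloadFile } = await import("ipull");
  const downloader = await downloadFile({
    url,
    directory,
    cliProgress: true, // built-in progress bar, replacing showProgress
  });
  await downloader.download();
}
```

Compared to the plain-`fetch` approach, ipull advertises parallel connections and resumable downloads, which would address both the speed and the progress-display parts of this issue.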