[PR #337] [CLOSED] Bump node-llama-cpp from 3.0.0-beta.44 to 3.6.0 #368

Opened 2026-03-03 13:54:15 +03:00 by kerem (Owner) · 0 comments

📋 Pull Request Information

Original PR: https://github.com/jehna/humanify/pull/337
Author: @dependabot[bot]
Created: 2/24/2025
Status: Closed

Base: main ← Head: dependabot/npm_and_yarn/node-llama-cpp-3.6.0


📝 Commits (1)

  • 0289430 Bump node-llama-cpp from 3.0.0-beta.44 to 3.6.0

📊 Changes

1 file changed (+501 additions, -261 deletions)


📝 package-lock.json (+501 -261)

📄 Description

Bumps node-llama-cpp from 3.0.0-beta.44 to 3.6.0.
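For context on the version range: under SemVer, 3.0.0-beta.44 is a prerelease that sorts before 3.0.0 itself, so this bump crosses from a prerelease line to a stable release. A minimal sketch of that precedence rule (illustrative only, not code from this PR; a full implementation would also compare dot-separated prerelease identifiers):

```javascript
// Minimal SemVer precedence check — only what is needed for this bump:
// a version with a prerelease tag sorts before the same core version
// without one, so 3.0.0-beta.44 < 3.0.0 <= 3.6.0.
function parse(v) {
  const [core, pre] = v.split("-"); // naive split; fine for "X.Y.Z[-pre]"
  return { nums: core.split(".").map(Number), pre: pre ?? null };
}

function lessThan(a, b) {
  const A = parse(a), B = parse(b);
  for (let i = 0; i < 3; i++) {
    if (A.nums[i] !== B.nums[i]) return A.nums[i] < B.nums[i];
  }
  // Equal cores: a prerelease sorts before the stable release.
  if (A.pre && !B.pre) return true;
  if (!A.pre && B.pre) return false;
  return false; // comparison of prerelease identifiers omitted in this sketch
}

console.log(lessThan("3.0.0-beta.44", "3.6.0")); // true
```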

Release notes

Sourced from node-llama-cpp's releases.

v3.6.0

DeepSeek R1 is here!

Read about the release in the blog post


3.6.0 (2025-02-21)

Features

  • DeepSeek R1 support (#428) (ca6b901)
  • chain of thought segmentation (#428) (ca6b901)
  • pass a model to resolveChatWrapper (#428) (ca6b901)
  • defineChatSessionFunction: improve params type (#428) (ca6b901)
  • Electron template: show chain of thought (#428) (ca6b901)
  • Electron template: add functions template (#428) (ca6b901)
  • Electron template: new icon for the CI build (#428) (ca6b901)
  • Electron template: update model message in a more stable manner (#428) (ca6b901)
  • Electron template: more convenient completion (#428) (ca6b901)

Bug Fixes

  • check path existence before reading its content (#428) (ca6b901)
  • partial tokens handling (#428) (ca6b901)
  • uncaught exception (#430) (599a161)
  • Electron template: non-latin text formatting (#430) (599a161)


Shipped with llama.cpp release b4753

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)

v3.5.0

3.5.0 (2025-01-31)

Features

  • shorter model URIs (#421) (73454d9)

Bug Fixes

  • add missing Jinja features for DeepSeek (#425) (6e4bf3d)

... (truncated)

Commits

  • 599a161 fix: uncaught exception (#430)
  • ca6b901 feat: DeepSeek R1 support (#428)
  • 63a1066 chore: update modules (#425)
  • 73454d9 feat: shorter model URIs (#421)
  • 6e4bf3d fix: adapt to llama.cpp breaking changes (#424)
  • 314d7e8 fix: metadata string encoding (#420)
  • 86e1bee fix: ranking empty inputs (#415)
  • d1b4416 fix: reranking probabilities (#412)
  • 5d07289 chore: update modules (#411)
  • 632a7bf feat: token prediction (speculative decoding) (#405)
  • Additional commits viewable in compare view

Dependabot compatibility score

You can trigger a rebase of this PR by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
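The ignore commands above can also be set persistently in repository configuration. A hypothetical .github/dependabot.yml fragment (this file is not part of the PR; directory and schedule are placeholders) equivalent to commenting @dependabot ignore this major version:

```yaml
# .github/dependabot.yml — illustrative sketch, not from this repository
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      # equivalent of "@dependabot ignore this major version" for this dependency
      - dependency-name: "node-llama-cpp"
        update-types: ["version-update:semver-major"]
```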

Note

Automatic rebases have been disabled on this pull request as it has been open for over 30 days.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
