[PR #382] [MERGED] Bump node-llama-cpp from 3.0.0-beta.44 to 3.7.0 #408

Closed
opened 2026-03-03 13:54:25 +03:00 by kerem · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/jehna/humanify/pull/382
Author: @dependabot[bot]
Created: 3/28/2025
Status: Merged
Merged: 3/28/2025
Merged by: @github-actions[bot]

Base: main ← Head: dependabot/npm_and_yarn/node-llama-cpp-3.7.0


📝 Commits (1)

  • 881e9fc Bump node-llama-cpp from 3.0.0-beta.44 to 3.7.0

📊 Changes

1 file changed (+507 additions, -267 deletions)

View changed files

📝 package-lock.json (+507 -267)

📄 Description

Bumps node-llama-cpp from 3.0.0-beta.44 to 3.7.0.

Release notes

Sourced from node-llama-cpp's releases.

v3.7.0

3.7.0 (2025-03-28)

Features

  • extract function calling syntax from a Jinja template (#444) (c070e81)
  • Full support for Qwen and QwQ via QwenChatWrapper (#444) (c070e81)
  • export a llama instance getter on a model instance (#444) (c070e81)
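The Qwen/QwQ support and the new getter land on the same node-llama-cpp 3.x chat API that this project consumes. Below is a minimal TypeScript sketch of that API as documented upstream (runnable in an ES module); the model path is a placeholder and the `model.llama` property name is inferred from the release note above, so treat both as assumptions rather than facts about this PR.

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Minimal 3.x usage sketch; the model path is a placeholder.
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "models/qwen2.5-7b-instruct.Q4_K_M.gguf"
});

// 3.7.0 adds a llama instance getter on the model (see the feature list above);
// the exact property name used here is an assumption.
console.log(model.llama === llama);

const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// 3.7.0 adds QwenChatWrapper for Qwen and QwQ models.
const answer = await session.prompt("Summarize this minified function.");
console.log(answer);
```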

Bug Fixes

  • better handling for function calling with empty parameters (#444) (c070e81)
  • reranking edge case crash (#444) (c070e81)
  • limit the context size by default in the node-typescript template (#444) (c070e81)
  • adapt to breaking llama.cpp changes (#444) (c070e81)
  • bump min nodejs version to 20 due to dependencies' requirements (#444) (c070e81)
  • defineChatSessionFunction type (#444) (c070e81)
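Two of these fixes (the `defineChatSessionFunction` type and empty-parameter handling) touch function calling, and the list also raises the minimum supported Node.js to 20, which downstream projects may need to account for. A hedged sketch of a parameterless chat-session function in the documented 3.x API follows; the function name and behavior are illustrative only and not taken from this PR.

```typescript
import {defineChatSessionFunction} from "node-llama-cpp";

// Illustrative chat-session function with no parameters, the case the
// "empty parameters" fix above concerns. Names and behavior are examples only.
const getCurrentTime = defineChatSessionFunction({
    description: "Returns the current time as an ISO 8601 string",
    handler() {
        return new Date().toISOString();
    }
});

// A session created as in the previous sketch could then expose it, e.g.:
// await session.prompt("What time is it?", {functions: {getCurrentTime}});
```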

Shipped with llama.cpp release b4980

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)

v3.6.0

DeepSeek R1 is here!

Read about the release in the blog post


3.6.0 (2025-02-21)

Features

  • DeepSeek R1 support (#428) (ca6b901) (documentation: DeepSeek R1)
  • chain of thought segmentation (#428) (ca6b901) (documentation: Stream Response Segments)
  • pass a model to resolveChatWrapper (#428) (ca6b901)
  • defineChatSessionFunction: improve params type (#428) (ca6b901)
  • Electron template: show chain of thought (#428) (ca6b901)
  • Electron template: add functions template (#428) (ca6b901)
  • Electron template: new icon for the CI build (#428) (ca6b901)
  • Electron template: update model message in a more stable manner (#428) (ca6b901)
  • Electron template: more convenient completion (#428) (ca6b901)

... (truncated)

Commits

  • c070e81 feat: extract function calling syntax from a Jinja template (#444)
  • ee94403 docs: fix electron command (#442)
  • 1d13c0e docs: fix cover images (#432)
  • 599a161 fix: uncaught exception (#430)
  • ca6b901 feat: DeepSeek R1 support (#428)
  • 63a1066 chore: update modules (#425)
  • 73454d9 feat: shorter model URIs (#421)
  • 6e4bf3d fix: adapt to llama.cpp breaking changes (#424)
  • 314d7e8 fix: metadata string encoding (#420)
  • 86e1bee fix: ranking empty inputs (#415)
  • Additional commits viewable in the compare view (v3.0.0-beta.44...v3.7.0)

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
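For reference, the per-PR ignore comments above can also be expressed declaratively in the repository's Dependabot configuration. Below is a hedged sketch of a `.github/dependabot.yml` that suppresses major-version PRs for this dependency; the schedule and directory values are assumptions, not taken from this repository.

```yaml
# Hypothetical equivalent of "@dependabot ignore this major version"
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "node-llama-cpp"
        update-types: ["version-update:semver-major"]
```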

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
