[PR #600] [CLOSED] Bump node-llama-cpp from 3.7.0 to 3.13.0 #606

opened 2026-03-03 13:55:19 +03:00 by kerem · 0 comments

📋 Pull Request Information

Original PR: https://github.com/jehna/humanify/pull/600
Author: @dependabot[bot]
Created: 9/10/2025
Status: Closed

Base: main ← Head: dependabot/npm_and_yarn/node-llama-cpp-3.13.0


📝 Commits (1)

  • 793048f Bump node-llama-cpp from 3.7.0 to 3.13.0

📊 Changes

1 file changed (+458 additions, -291 deletions)

📝 package-lock.json (+458 -291)

📄 Description

Bumps node-llama-cpp from 3.7.0 to 3.13.0.
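The PR's only change is the regenerated lockfile for the new version. For reference, a minimal sketch of performing the same bump by hand (assuming a standard npm project; the `sort -V` step is just a portable sanity check that the target version is not a downgrade):

```shell
# Manual equivalent of this Dependabot bump (run in the project root);
# rewrites package-lock.json much like this PR does:
#   npm install node-llama-cpp@3.13.0

# Confirm 3.13.0 sorts above 3.7.0 under version ordering
# (naive string comparison would wrongly rank 3.7.0 higher):
old="3.7.0"
new="3.13.0"
highest=$(printf '%s\n%s\n' "$old" "$new" | sort -V | tail -n1)
echo "$highest"   # prints 3.13.0
```

Note that `sort -V` is needed because lexicographic comparison misorders multi-digit minor versions like `13` vs `7`.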

Release notes

Sourced from node-llama-cpp's releases.

v3.13.0

3.13.0 (2025-09-09)

Features

  • Seed OSS support (#502) (eefe78c)

Bug Fixes

  • adapt to breaking llama.cpp changes (#501) (76b505e)
  • Vulkan: read external memory usage (#500) (d33cc31)

Shipped with llama.cpp release b6431

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)

v3.12.4

gpt-oss is here!

Read about the release in the blog post


3.12.4 (2025-08-28)

Bug Fixes

  • gpt-oss prompt preloading (#496) (db4a243)

Shipped with llama.cpp release b6301

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)

v3.12.3

gpt-oss is here!

Read about the release in the blog post


3.12.3 (2025-08-26)

... (truncated)

Commits

  • eefe78c feat: Seed OSS support (#502)
  • d33cc31 fix(Vulkan): read external memory usage (#500)
  • 76b505e fix: adapt to breaking llama.cpp changes (#501)
  • c5cd057 test: fix tests (#499)
  • 634fbac test: fix tests (#497)
  • db4a243 fix: gpt-oss prompt preloading (#496)
  • 6e59160 fix: split prebuilt CUDA binaries into 2 npm modules (#495)
  • b10999d fix: CUDA 13 support (#494)
  • 12749c0 fix(Vulkan): context creation edge cases (#492)
  • f849cd9 fix: completion config (#490)
  • Additional commits viewable in the v3.7.0...v3.13.0 compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
