[PR #726] Bump node-llama-cpp from 3.7.0 to 3.15.1 #725

opened 2026-03-03 13:55:51 +03:00 by kerem · 0 comments

📋 Pull Request Information

Original PR: https://github.com/jehna/humanify/pull/726
Author: @dependabot[bot]
Created: 1/27/2026
Status: 🔄 Open

Base: `main` ← Head: `dependabot/npm_and_yarn/node-llama-cpp-3.15.1`


📝 Commits (1)

  • b945d5b Bump node-llama-cpp from 3.7.0 to 3.15.1

📊 Changes

1 file changed (+459 additions, -291 deletions)


📝 package-lock.json (+459 -291)
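Only `package-lock.json` changes here, which indicates the version range declared in `package.json` already admits 3.15.1 (for example a caret range such as `^3.7.0` — an assumption, since the manifest is not part of this diff). Dependabot therefore only re-resolves the lockfile; the resulting entry for the bumped package would look roughly like this (integrity hash and transitive dependencies omitted):

```json
{
  "node_modules/node-llama-cpp": {
    "version": "3.15.1",
    "resolved": "https://registry.npmjs.org/node-llama-cpp/-/node-llama-cpp-3.15.1.tgz"
  }
}
```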

📄 Description

Bumps node-llama-cpp from 3.7.0 to 3.15.1.

Release notes

Sourced from node-llama-cpp's releases.

v3.15.1

3.15.1 (2026-01-26)

Bug Fixes

  • adapt to llama.cpp changes (#547) (4baa480)
  • duplicate backend library files (#541) (f5123bf)

Shipped with llama.cpp release b7836

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)

v3.15.0

3.15.0 (2026-01-10)

Features

  • LlamaCompletion: stopOnAbortSignal (#538) (734693d)
  • LlamaModel: useDirectIo (#538) (734693d)

Bug Fixes

  • support new CUDA 13.1 archs (#538) (734693d)
  • build the prebuilt binaries with CUDA 13.1 instead of 13.0 (#538) (734693d)

Shipped with llama.cpp release b7698

To use the latest llama.cpp release available, run npx -n node-llama-cpp source download --release latest. (learn more)
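The v3.15.0 additions (`stopOnAbortSignal` on `LlamaCompletion`, `useDirectIo` on `LlamaModel`) affect how generation reacts to an `AbortSignal`. Exercising the real API requires a local model file, so the sketch below only mocks the abort semantics the option name implies (a hypothetical `generate` helper, not library code): with `stopOnAbortSignal` enabled, the partial output is returned on abort instead of the promise rejecting.

```typescript
// Mock illustrating the assumed semantics of stopOnAbortSignal: when the
// signal is aborted, stop generating and resolve with the partial text
// instead of rejecting (the default behavior).
async function generate(
    tokens: string[],
    opts: { signal?: AbortSignal; stopOnAbortSignal?: boolean } = {}
): Promise<string> {
    let out = "";
    for (const tok of tokens) {
        if (opts.signal?.aborted) {
            if (opts.stopOnAbortSignal)
                return out; // graceful stop: return what was generated so far
            throw new Error("aborted"); // default: reject the promise
        }
        out += tok;
        await Promise.resolve(); // yield, as a streaming generator would
    }
    return out;
}
```

The same pattern applies in the real API: pass an `AbortController`'s `signal` in the generation options and set `stopOnAbortSignal: true` to keep the partial completion.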

v3.14.5

3.14.5 (2025-12-10)

Bug Fixes

  • OIDC package publish (#531) (3d3cb97)

... (truncated)

Commits

  • 4baa480 fix: adapt to llama.cpp changes (#547)
  • 1997b4e fix: bin test log level config (#546)
  • f5123bf fix: duplicate backend library files (#541)
  • 734693d feat(LlamaCompletion): stopOnAbortSignal (#538)
  • 7e467cc docs: fix cmake dependencies link (#534)
  • f78a78e test: exclude irrelevant tests (#533)
  • 3d3cb97 fix: OIDC package publish (#531)
  • 9a428e5 fix: create-node-llama-cpp module package release (#530)
  • 8741471 test: fix tests (#528)
  • 2852a23 build: fix release (#527)
  • Additional commits viewable in compare view (v3.7.0...v3.15.1)

Maintainer changes

This version was pushed to npm by GitHub Actions, a new releaser for node-llama-cpp since your current version.


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
