[GH-ISSUE #135] install problem #47

Open
opened 2026-03-03 13:52:33 +03:00 by kerem · 7 comments

Originally created by @Yusufkulcu on GitHub (Sep 29, 2024).
Original GitHub issue: https://github.com/jehna/humanify/issues/135

I bought a new virtual server and I'm trying to install it, but I get the errors in the images. What could be the problem?

I added the error log below

command used: `npm install -g humanifyjs`

![image](https://github.com/user-attachments/assets/ae2974bc-adb0-464d-a841-7cab379bddf7)
![image](https://github.com/user-attachments/assets/b77e5494-e736-47b0-86a7-4887698f2fa6)

[log.txt](https://github.com/user-attachments/files/17179957/log.txt)


@jehna commented on GitHub (Sep 29, 2024):

What's your `node --version`?


@Yusufkulcu commented on GitHub (Sep 29, 2024):

> What's your `node --version`

![image](https://github.com/user-attachments/assets/3447142a-3e2a-4e7e-9b99-58cd2687ff8c)


@Yusufkulcu commented on GitHub (Sep 29, 2024):

> What's your `node --version`

I have v20.10.0 installed on my computer. I installed the same version on the virtual server and tried again. First it gave a "Python is not installed" error, and then it gave the errors I added below.

operating system: Windows Server 2019
node version: v20.10.0
python version: Python 3.12.6

![image](https://github.com/user-attachments/assets/67ce9bf9-3f2b-48f6-8830-096f5900f343)


@Yusufkulcu commented on GitHub (Sep 29, 2024):

Do you have any solution suggestions?


@0xdevalias commented on GitHub (Sep 30, 2024):

Potentially a duplicate of / similar root cause to the following issues:

- https://github.com/jehna/humanify/issues/10
- https://github.com/jehna/humanify/issues/71
- https://github.com/jehna/humanify/issues/50

With some notes from there:

> If anyone still cannot make it work in Windows then my suggestion is to use WSL; it works with a couple of tweaks (I didn't test with GPU yet)
>
> _Originally posted by @neoOpus in https://github.com/jehna/humanify/issues/10#issuecomment-2359492811_

> There is a note about needing Node.js 20 in the readme, but it seems to be easy to miss. I wonder which place would be better. Maybe a preinstall hook that ensures the version?
>
> So, you advise using this in WSL instead of directly on Windows Node.js?
>
> _Originally posted by @neoOpus in https://github.com/jehna/humanify/issues/71#issuecomment-2328180259_

> Unfortunately I have no idea how Windows works for development; the last time I used Windows as my main dev machine was over 10 years ago 😅
>
> I'd guess both WSL and non-WSL should work fine, as long as you have a recent enough version. You can use e.g. [nvm](https://github.com/nvm-sh/nvm) to switch between versions on WSL
>
> _Originally posted by @jehna in https://github.com/jehna/humanify/issues/71#issuecomment-2329153373_
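The "preinstall hook that ensures the version" idea quoted above could look something like this in the project's `package.json` (an illustrative sketch only, not humanify's actual config):

```json
{
  "engines": {
    "node": ">=20"
  },
  "scripts": {
    "preinstall": "node -e \"if (parseInt(process.versions.node) < 20) { console.error('humanifyjs requires Node.js >= 20, found ' + process.version); process.exit(1); }\""
  }
}
```

Note that `engines` alone only produces a warning under npm's default settings (unless `engine-strict=true` is set); the `preinstall` script is what would actually hard-fail the install on an old Node version.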


log.txt

Skimming through the initial attached log, there seem to be a number of errors/warnings.

This part might not matter (not 100%, just guessing):

```
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\isolated-vm\src'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\isolated-vm\\src'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp\llama\llama.cpp'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp\\llama\\llama.cpp'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs'
npm warn cleanup     }
npm warn cleanup   ]
npm warn cleanup ]
```

This is probably a breaking error:

```
npm error code 1
npm error path C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp
npm error command failed
npm error command C:\Windows\system32\cmd.exe /d /s /c node ./dist/cli/cli.js postinstall
```

This is probably only needed because no prebuilt binary was found:

```
npm error ^[[?25h+ Downloading cmake
npm error × Failed to download cmake
```

Failing to load a prebuilt binary is probably based on a combination of Node version and system architecture. Then the 'build from source' might be failing because you're running it in `cmd.exe` rather than a shell like bash/similar (assuming it's written to be built on a *nix type system), or because you don't have the appropriate dev tools installed:

```
npm error Failed to load a prebuilt binary for platform "win" "x64", falling back to building from source. Error: Error: The specified module could not be found.
npm error \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node
npm error     at Module._extensions..node (node:internal/modules/cjs/loader:1586:18)
npm error     at Module.load (node:internal/modules/cjs/loader:1288:32)
npm error     at Module._load (node:internal/modules/cjs/loader:1104:12)
npm error     at Module.require (node:internal/modules/cjs/loader:1311:19)
npm error     at require (node:internal/modules/helpers:179:18)
npm error     at loadBindingModule (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:390:25)
npm error     at loadExistingLlamaBinary (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:276:37)
npm error     at async getLlamaForOptions (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:133:27)
npm error     at async Object.handler (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/cli/commands/OnPostInstallCommand.js:12:13) {
npm error   code: 'ERR_DLOPEN_FAILED'
npm error }
npm error ^[[?25l^[[?25h'xpm' is not recognized as an internal or external command,
npm error operable program or batch file.
npm error Failed to build llama.cpp with no GPU support. Error: SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1
npm error     at createError (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:34:20)
npm error     at ChildProcess.<anonymous> (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:47:24)
npm error     at ChildProcess.emit (node:events:519:28)
npm error     at cp.emit (C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\cross-spawn\lib\enoent.js:34:29)
npm error     at ChildProcess._handle.onexit (node:internal/child_process:294:12)
npm error SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1
npm error     at createError (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:34:20)
npm error     at ChildProcess.<anonymous> (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:47:24)
npm error     at ChildProcess.emit (node:events:519:28)
npm error     at cp.emit (C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\cross-spawn\lib\enoent.js:34:29)
npm error     at ChildProcess._handle.onexit (node:internal/child_process:294:12)
```

> 'xpm' is not recognized as an internal or external command, operable program or batch file.

- https://xpack.github.io/
  - > The xPack Reproducible Build Framework
    > Tools to manage, configure and build complex, package based, multi-target projects, in a reproducible way.
- https://xpack.github.io/xpm/install/


> Failed to load a prebuilt binary for platform "win" "x64", falling back to building from source. Error: Error: The specified module could not be found.
> npm error \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node

  • https://github.com/withcatai/node-llama-cpp#installation
    • This package comes with pre-built binaries for macOS, Linux and Windows.

      If binaries are not available for your platform, it'll fallback to download a release of llama.cpp and build it from source with cmake. To disable this behavior, set the environment variable NODE_LLAMA_CPP_SKIP_DOWNLOAD to true.

    • https://node-llama-cpp.withcat.ai/guide/building-from-source
      • INFO

        If cmake is not installed on your machine, node-llama-cpp will automatically download cmake to an internal directory and try to use it to build llama.cpp from source.

        If the build fails, make sure you have the required dependencies of cmake installed on your machine.

We can see the prebuilt dependencies in the `node-llama-cpp` `package.json` here:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/package.json#L212-L224

```json
  "optionalDependencies": {
    "@node-llama-cpp/linux-arm64": "0.1.0",
    "@node-llama-cpp/linux-armv7l": "0.1.0",
    "@node-llama-cpp/linux-x64": "0.1.0",
    "@node-llama-cpp/linux-x64-cuda": "0.1.0",
    "@node-llama-cpp/linux-x64-vulkan": "0.1.0",
    "@node-llama-cpp/mac-arm64-metal": "0.1.0",
    "@node-llama-cpp/mac-x64": "0.1.0",
    "@node-llama-cpp/win-arm64": "0.1.0",
    "@node-llama-cpp/win-x64": "0.1.0",
    "@node-llama-cpp/win-x64-cuda": "0.1.0",
    "@node-llama-cpp/win-x64-vulkan": "0.1.0"
  }
```
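For what it's worth, platform-specific optional dependency packages like these typically constrain themselves with `os`/`cpu` fields in their own `package.json`, so npm skips the ones that don't match the host. An illustrative fragment (not copied from the repo):

```json
{
  "name": "@node-llama-cpp/win-x64",
  "os": ["win32"],
  "cpu": ["x64"]
}
```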

The source for these is here:

- https://github.com/withcatai/node-llama-cpp/tree/master/packages/%40node-llama-cpp

Since there is a `@node-llama-cpp/win-x64`, the `Failed to load a prebuilt binary for platform "win" "x64"` part of the error is definitely… interesting.

@Yusufkulcu Does the machine you're installing this on have access to the internet? I'm guessing it tries to dynamically decide which optional binary package it will need based on the system and install that.
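As an illustrative simplification of that idea (not node-llama-cpp's actual code, and ignoring the GPU-variant suffixes like `-cuda`/`-vulkan`/`-metal`), the platform-to-package mapping is roughly:

```javascript
// Hypothetical helper, not part of node-llama-cpp's API: map Node's
// platform/arch identifiers to the optional prebuilt-binary package name
// that would be tried for this system.
function prebuiltPackageName(platform = process.platform, arch = process.arch) {
  const osName = { win32: "win", darwin: "mac", linux: "linux" }[platform];
  if (osName === undefined) return null; // no prebuilt binaries for this platform
  return `@node-llama-cpp/${osName}-${arch}`;
}

console.log(prebuiltPackageName("win32", "x64")); // "@node-llama-cpp/win-x64"
```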

We can see that when building the package, it runs `npm run addPostinstallScript`:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/.github/workflows/build.yml#L465-L493

Which we can see is defined in `package.json`, and basically tells the built package to trigger `node ./dist/cli/cli.js postinstall`:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/package.json#L51

```
"addPostinstallScript": "npm pkg set scripts.postinstall=\"node ./dist/cli/cli.js postinstall\"",
```

In `cli/cli.js` we can see that it registers the `OnPostInstallCommand`:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/cli/cli.ts#L39

Which is defined here, and calls `getLlamaForOptions`:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/cli/commands/OnPostInstallCommand.ts#L8-L30

Which is defined here, and seems to be the part of the package that gets the system's platform/architecture and tries to determine whether it can use prebuilt binaries:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/getLlama.ts#L291-L462

Within that, we can see there is a call to `loadExistingLlamaBinary`:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/getLlama.ts#L367-L387
- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/getLlama.ts#L464-L623

That calls `getPrebuiltBinaryPath`, which then calls `getPrebuiltBinariesPackageDirectoryForBuildOptions`, which is what handles attempting to import the various prebuilt binary packages:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/getLlama.ts#L547-L553
- `getPrebuiltBinaryPath`: https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/utils/compileLLamaCpp.ts#L286-L313
- `getPrebuiltBinariesPackageDirectoryForBuildOptions`: https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/utils/compileLLamaCpp.ts#L364-L426

```js
    } else if (buildOptions.platform === "win") {
        if (buildOptions.arch === "x64") {
            if (buildOptions.gpu === "cuda")
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64-cuda"));
            else if (buildOptions.gpu === "vulkan")
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64-vulkan"));
            else if (buildOptions.gpu === false)
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64"));
        } else if (buildOptions.arch === "arm64")
            // @ts-ignore
            return getBinariesPathFromModules(() => import("@node-llama-cpp/win-arm64"));
    }
```

This then calls `getBinariesPathFromModules`, which seems to do some work to find the `bins` dir:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/utils/compileLLamaCpp.ts#L365-L383

Jumping back out to `loadExistingLlamaBinary`, we can see the 'failed to load a prebuilt binary' message here within a `catch` block:

- https://github.com/withcatai/node-llama-cpp/blob/4ee10a90e954df26e9e7782d4e2f9e98daab24a5/src/bindings/getLlama.ts#L598-L611

The actual triggering error is appended to the end of that message, so the root issue appears to be this part:

```
Error: Error: The specified module could not be found.
npm error \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node
npm error     at Module._extensions..node (node:internal/modules/cjs/loader:1586:18)
npm error     at Module.load (node:internal/modules/cjs/loader:1288:32)
npm error     at Module._load (node:internal/modules/cjs/loader:1104:12)
npm error     at Module.require (node:internal/modules/cjs/loader:1311:19)
npm error     at require (node:internal/modules/helpers:179:18)
npm error     at loadBindingModule (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:390:25)
npm error     at loadExistingLlamaBinary (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:276:37)
npm error     at async getLlamaForOptions (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:133:27)
npm error     at async Object.handler (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/cli/commands/OnPostInstallCommand.js:12:13) {
npm error   code: 'ERR_DLOPEN_FAILED'
npm error }
```

@Yusufkulcu So I guess the first thing I would be doing is checking whether that file actually exists, and if there is anything that might be blocking it from being able to be loaded (permissions, antivirus, etc):

  • `C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node`

If that doesn't help, you may be able to get some more specific ideas/support on the GitHub Discussions page for `node-llama-cpp`:

- https://github.com/withcatai/node-llama-cpp/discussions/categories/general

I took the liberty of opening a discussion there based on this issue, to potentially expedite getting an official answer:

- https://github.com/withcatai/node-llama-cpp/discussions/353

@Yusufkulcu @jehna It seems that `humanify` is currently using `node-llama-cpp` `3.0.0-beta.40`:

https://github.com/jehna/humanify/blob/64a1b9511d2e42b705df0858674eab1f5e83dd0d/package.json#L57

Whereas it now seems to be up to `3.0.3`, so it's possible there have been some relevant bug fixes/improvements released since then:

- https://github.com/withcatai/node-llama-cpp/releases
- https://node-llama-cpp.withcat.ai/blog/v3


> I have v20.10.0 installed on my computer. I installed the same on the virtual server and tried again. First, it gave the error "Python is not installed" and then it gave the errors I added below.
>
> operating system: server 2019
>
> node version: v20.10.0
> python version: Python 3.12.6

@Yusufkulcu The error in your screenshot here is a different error from the one in your initial post. The error you're getting on the virtual server is related to there not being a prebuilt binary of `isolated-vm` for that server, which is basically the root cause of this issue:

- https://github.com/jehna/humanify/issues/71

But solving that won't solve the issues you're having on your main system install.

Author
Owner

@0xdevalias commented on GitHub (Sep 30, 2024):

This part might not matter (not 100%, just guessing):

```
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\isolated-vm\src'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\isolated-vm\\src'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp\llama\llama.cpp'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp\\llama\\llama.cpp'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs'
npm warn cleanup     }
npm warn cleanup   ]
npm warn cleanup ]
```

@Yusufkulcu Some further ideas/Google results based on `Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp\llama\llama.cpp'`:

- https://stackoverflow.com/questions/34600932/eperm-operation-not-permitted-on-windows-with-npm
  - Some suggestions involve running `npm cache clean --force`
  - Some suggest changing the `npm` install location prefix
  - Others suggest changing security permissions on folders
  - I don't know enough about node/npm on Windows to say what the best choice/solution here is; but in any case, it sounds like it's not `humanifyjs`-specific.
- https://learn.microsoft.com/en-us/answers/questions/1466658/erorr-using-npm-package-manager-eperm-operation-no
  - One suggestion here includes:
    > - Run the command in a CMD window as an administrator when trying to delete the folder.
    > - Try again with any third-party antivirus/security software disabled.
    > - Check file and folder permissions.
- https://github.com/npm/npm/issues/10826
  - One user here suggests `npm cache clean`
  - Another suggests disabling antivirus
  - Another suggests checking whether another application is using the project folder (like an IDE/editor)

<!-- gh-comment-id:2381858153 -->
Author
Owner

@0xdevalias commented on GitHub (Oct 15, 2024):

Some notes from upstream:


it would be helpful if you could check if `C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node` exists.

Originally posted by @giladgd in https://github.com/withcatai/node-llama-cpp/discussions/353#discussioncomment-10853697


Actually, this may indeed be the permission issue I mentioned.
The attached error shows that node-llama-cpp is installed under the Administrator user, which should never be done, and is very prone to permission issues.

Originally posted by @giladgd in https://github.com/withcatai/node-llama-cpp/discussions/353#discussioncomment-10853713


Full comment:

I ran some tests on the latest version (3.0.3), and it seems to work as expected on Windows.
I've tested it on both a Windows 10 machine and on a Windows 11 machine.

From the logs you attached, I think there's something wrong with the installation on the user machine: whether it's npm that hasn't installed the prebuilt binaries properly (maybe they're using some npm mirror that doesn't work as expected), maybe it's related to permissions on the machine itself (like running npm install in the folder as one user and trying to use it as another user that doesn't have proper access to the node_modules directory), or maybe it has something to do with some program on the machine that blocks access to the .node file.

From the linked issue (https://github.com/jehna/humanify/issues/135) I can see in the logs that it attempted to fall back to building from source, and since CMake is not installed on the user's machine, it tried to call `xpm` (https://www.npmjs.com/package/xpm) to install CMake locally inside node-llama-cpp, but it failed with this error:

```
npm error ^[[?25l^[[?25h'xpm' is not recognized as an internal or external command,
npm error operable program or batch file.
npm error Failed to build llama.cpp with no GPU support. Error: SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1
```

The `^[[?25l^[[?25h'xpm'` part seems to indicate one of the following:

  • There's an issue with the installation of nodejs/npm on the user machine.

    npm exec --yes is equivalent to npx -y, but npm exec --yes works better with other runtimes such as Bun, which is why I used it.

  • There's an issue with the escaping done by cross-spawn to run the npm exec --yes -- xpm@^0.16.3 command. I used cross-spawn because it solved many issues with process spawning on Windows, so I think this is less probable.

What I would suggest to try to fix this issue would be:

  • Uninstall all global npm modules, including npm, and reinstall Node.js, since it seems like there's an issue with its installation.
  • Try to use Bun (https://bun.sh), since it seems they've fixed many Windows compatibility issues.
  • Install CMake manually (https://cmake.org/download/), so node-llama-cpp can use it.

Before doing all of the above, it would be helpful if you could check if `C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node` exists.

Originally posted by @giladgd in https://github.com/withcatai/node-llama-cpp/discussions/353#discussioncomment-10853697

<!-- gh-comment-id:2412603357 -->