[GH-ISSUE #5702] [bug]: Cannot connect Hoppscotch Desktop App to self-hosted instance (Verification error: Invalid signature) #2216

Closed
opened 2026-03-16 23:36:08 +03:00 by kerem · 4 comments
Owner

Originally created by @Warathep187 on GitHub (Dec 17, 2025).
Original GitHub issue: https://github.com/hoppscotch/hoppscotch/issues/5702

Originally assigned to: @CuriousCorrelation on GitHub.

Is there an existing issue for this?

  • I have searched existing issues and this bug hasn't been reported yet

Platform

Desktop App

Browser

Chrome

Operating System

macOS

Bug Description

I’m experiencing an issue when using the Hoppscotch Desktop application to connect to a self-hosted Hoppscotch instance.

Previously, I was able to connect to the same self-hosted instance without any problems. However, recently when I try to add the instance again, it fails.

I have already tried clearing the cache using the Clear Cache button in the desktop app and also completely uninstalling and reinstalling the application, but the issue still persists.

Steps

  1. Open Hoppscotch Desktop app
  2. Click Add an Instance
  3. Enter the URL of my self-hosted Hoppscotch instance (port 3200)

An error message is shown:
Failed to download instance: Verification error: Invalid signature

![Image](https://github.com/user-attachments/assets/f5f32dad-d9c2-43e7-afc5-2d7ddad1edb2)

Deployment Type

Self-hosted (on-prem deployment)

Version

2025.11.2

kerem 2026-03-16 23:36:08 +03:00
Author
Owner

@CuriousCorrelation commented on GitHub (Dec 17, 2025):

Hi @Warathep187, we have seen something similar before: https://github.com/hoppscotch/hoppscotch/discussions/4885 - could you see if the discussion helps with your issue as well? If not, I'll be happy to look further into this one.


@Warathep187 commented on GitHub (Dec 17, 2025):

Thanks for the suggestion. I’ve checked #4885 and my setup is similar.

I’m also running Hoppscotch in a non–all-in-one deployment on Kubernetes in my homelab server. I followed the solution described in that issue link but it did not resolve the problem in my case.


@CuriousCorrelation commented on GitHub (Dec 17, 2025):

Hi @Warathep187, thanks for the context. If you have a similar setup, the "Invalid signature" error you're seeing is, I believe, caused by the webapp-server generating ephemeral signing keys on each restart or across replicas. When the desktop app caches a bundle signed with one key and the server then restarts (or a different replica responds) with a new key, signature verification fails.

While the idea behind that is security (preventing a version mismatch that could corrupt user data), we just merged a feature for this in #5644 that adds signing-key persistence. The changes haven't been released yet, but once they are, you'll have a few options to keep the keys consistent across pods:

For single replica with persistent storage:

```yaml
volumes:
  - webapp-data:/data/webapp-server
```

The server will auto-persist the signing key to `/data/webapp-server/signing.key`.
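Since this deployment is on Kubernetes rather than Compose, the single-replica option would translate to mounting a persistent volume at the same path. A hypothetical sketch (names such as `webapp-data` and `webapp-server-data` are placeholders, not documented values):

```yaml
# Hypothetical Kubernetes equivalent of the Compose volume above:
# persist /data/webapp-server so the signing key survives pod restarts.
containers:
  - name: webapp-server
    volumeMounts:
      - name: webapp-data
        mountPath: /data/webapp-server
volumes:
  - name: webapp-data
    persistentVolumeClaim:
      claimName: webapp-server-data  # placeholder claim name
```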

For multiple replicas (or if you can't mount a volume):

```yaml
environment:
  - WEBAPP_SERVER_SIGNING_SECRET=your-shared-secret-here
```

This will make sure all replicas derive the same key deterministically.

Alternative env vars (in priority order):

| Variable | Format |
|----------|--------|
| `WEBAPP_SERVER_SIGNING_KEY` | base64, 64 bytes (full private key) |
| `WEBAPP_SERVER_SIGNING_SEED` | base64, 32 bytes (derives key) |
| `WEBAPP_SERVER_SIGNING_SECRET` | any string (SHA-256 hashed to seed) |
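The derivation described in the table can be sketched concretely. The snippet below is a hypothetical illustration (in Python, not the server's actual code) of how a shared secret could deterministically yield a 32-byte seed, and how to generate a valid base64 value for `WEBAPP_SERVER_SIGNING_SEED`; the server's real derivation may differ in detail.

```python
import base64
import hashlib
import secrets

def seed_from_secret(secret: str) -> bytes:
    """Hypothetical sketch: hash any string secret with SHA-256 down to a
    32-byte seed, as the WEBAPP_SERVER_SIGNING_SECRET row describes."""
    return hashlib.sha256(secret.encode("utf-8")).digest()

def random_seed_b64() -> str:
    """Generate a fresh base64-encoded 32-byte value in the format the
    WEBAPP_SERVER_SIGNING_SEED row describes."""
    return base64.b64encode(secrets.token_bytes(32)).decode("ascii")

# Every replica configured with the same secret derives the same seed,
# which is what makes signature verification consistent across pods.
seed_a = seed_from_secret("your-shared-secret-here")
seed_b = seed_from_secret("your-shared-secret-here")
assert seed_a == seed_b and len(seed_a) == 32
```

The point of the SHA-256 step is that operators can pick any memorable string while the server still gets a fixed-length seed.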

Once the next release drops with #5644, updating should resolve this.


@Warathep187 commented on GitHub (Dec 17, 2025):

Okay, I understand the issue now. Thank you very much for the explanation. I’ll look forward to the new release 😀
