[GH-ISSUE #4257] [bug]: Version 2024.7.1 won't start on Kubernetes "address already in use" #1542

Closed
opened 2026-03-16 20:46:08 +03:00 by kerem · 9 comments
Owner

Originally created by @Jonathan-Diaz-Rosa on GitHub (Aug 13, 2024).
Original GitHub issue: https://github.com/hoppscotch/hoppscotch/issues/4257

Is there an existing issue for this?

  • I have searched the existing issues

Current behavior

When I try to deploy version 2024.7.1 on Kubernetes without any changes to the configuration, only one container starts and the others crash with this log:

{"level":"info","ts":1723541100.5956447,"msg":"using config from file","file":"/etc/caddy/selfhost-web.Caddyfile"}
2024-08-13T11:25:00.596697443+02:00 {"level":"info","ts":1723541100.5965958,"msg":"adapted config to JSON","adapter":"caddyfile"}
Error: loading initial config: loading new config: starting caddy administration endpoint: listen tcp 127.0.0.1:2019: bind: address already in use

Everything works perfectly with version 2024.7.0. What changed in 2024.7.1?

Steps to reproduce

Resources and the Kustomize file can be found here:
[gist files](https://gist.github.com/Jonathan-Diaz-Rosa/9351c0537238989e4592f8ba11743c3e)

Environment

Release

Version

Self-hosted

kerem 2026-03-16 20:46:08 +03:00
Author
Owner

@elixxx commented on GitHub (Aug 13, 2024):

It looks like https://github.com/hoppscotch/hoppscotch/pull/4233/files introduced some issues. I also hit a 404 error in the web container.

Author
Owner

@AndrewBastin commented on GitHub (Aug 13, 2024):

@Jonathan-Diaz-Rosa @elixxx, thanks for the logs and the config info. We are looking into this and will bring up a patch as soon as possible; in the meantime, please stay on 2024.7.0 if possible.

Out of curiosity, have you folks considered moving to the all-in-one containers we have now? Are there any blockers to migrating to that?

See: https://docs.hoppscotch.io/documentation/self-host/community-edition/install-and-build#using-the-aio-container

Author
Owner

@Jonathan-Diaz-Rosa commented on GitHub (Aug 13, 2024):

> @Jonathan-Diaz-Rosa @elixxx thanks for the logs and the config info. We are looking into this and will look to bringing up a patch as soon as possible, in the meantime please stay on 2024.7.0 if possible.
>
> Out of curiosity, have you folks considered moving to the all-in-one containers we have now ? Are there any blockers towards migrating to that ?
>
> See: https://docs.hoppscotch.io/documentation/self-host/community-edition/install-and-build#using-the-aio-container

Yes, I can stay on 2024.7.0, of course.

No, there is nothing blocking a move to AIO. We just want to keep control over each component for the moment, but we will definitely move to AIO in a few weeks or months.

Author
Owner

@elixxx commented on GitHub (Aug 13, 2024):

Thanks for your answer. I will move to the AIO container in our next update cycle!
I think the reason we used and also built the container ourselves was that we had issues with setting the right hostname at runtime.

Author
Owner

@Brainpitcher commented on GitHub (Aug 14, 2024):

> @Jonathan-Diaz-Rosa @elixxx thanks for the logs and the config info. We are looking into this and will look to bringing up a patch as soon as possible, in the meantime please stay on 2024.7.0 if possible.
>
> Out of curiosity, have you folks considered moving to the all-in-one containers we have now ? Are there any blockers towards migrating to that ?
>
> See: https://docs.hoppscotch.io/documentation/self-host/community-edition/install-and-build#using-the-aio-container

We use the all-in-one container in our k8s deployment and are having difficulties with the update to 7.1 (7.0 works perfectly).
The container crashes with:

```
npm notice
npm notice New patch version of npm available! 10.8.1 -> 10.8.2
npm notice Changelog: https://github.com/npm/cli/releases/tag/v10.8.2
npm notice To update run: npm install -g npm@10.8.2
npm notice
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.507825,"msg":"using provided configuration","config_file":"/etc/caddy/aio-subpath-access.Caddyfile","config_adapter":"caddyfile"}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5110373,"logger":"admin","msg":"admin endpoint started","address":"localhost:2019","enforce_origin":false,"origins":["//[::1]:2019","//127.0.0.1:2019","//localhost:2019"]}
App/Admin Dashboard Caddy | {"level":"warn","ts":1723464428.5112789,"logger":"http.auto_https","msg":"server is listening only on the HTTP port, so no automatic HTTPS will be applied to this server","server_name":"srv3","http_port":80}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5113385,"logger":"tls.cache.maintenance","msg":"started background certificate maintenance","cache":"0xc0006a1700"}
App/Admin Dashboard Caddy | {"level":"warn","ts":1723464428.5118368,"logger":"tls","msg":"unable to get instance ID; storage clean stamps will be incomplete","error":"open /home/hoppuser/.local/share/caddy/instance.uuid: no such file or directory"}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5118783,"logger":"http.log","msg":"server running","name":"srv0","protocols":["h1","h2","h3"]}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5119252,"logger":"http.log","msg":"server running","name":"srv1","protocols":["h1","h2","h3"]}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5119631,"logger":"http.log","msg":"server running","name":"srv2","protocols":["h1","h2","h3"]}
App/Admin Dashboard Caddy | {"level":"info","ts":1723464428.5120103,"logger":"tls.cache.maintenance","msg":"stopped background certificate maintenance","cache":"0xc0006a1700"}
App/Admin Dashboard Caddy | Error: loading initial config: loading new config: http app module: start: listening on :80: listen tcp :80: bind: permission denied
Exiting process because Caddy Server exited with code 1
```

Author
Owner

@AndrewBastin commented on GitHub (Aug 15, 2024):

@Brainpitcher There is another similar ticket open for the situation with port 80, #4264, which we are also looking at.

~~We are having a tough time reproducing this on our machines, though. Can you provide a bit more detail about your environment, e.g. whether you are using Docker or another OCI runtime (like Podman and friends), and anything about the host environment, CPU architecture, etc.?~~

UPDATE: I can repro in my local minikube setup with the configs given above (thanks, btw!). I have a patch running locally that should work (it just disables the Caddy admin endpoint, which is not used anyway), but that doesn't really explain why the port conflict occurs, since these are containers and should have independent port spaces, so I am not able to pinpoint the root cause (would love pointers as to possibilities).

Will have the patch live as part of 2024.7.2, hopefully early next week, to close this issue.

Also, I have deemed this a separate issue from #4264, as I was able to reproduce that one only in Podman, and it does not seem to share the same underlying cause.
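(For reference: disabling the admin API corresponds to Caddy's global `admin off` option. A minimal sketch, where the site block below is purely illustrative and not the actual contents of the shipped `selfhost-web.Caddyfile`:

```caddyfile
{
	# Disable the admin API entirely, so Caddy never binds 127.0.0.1:2019.
	admin off
}

# Hypothetical site block for illustration only.
:3000 {
	root * /site/selfhost-web
	file_server
}
```

One possible explanation for the conflict: containers in the same Kubernetes pod share a single network namespace, so if more than one Caddy-based container is scheduled into one pod, each Caddy process tries to bind the same 127.0.0.1:2019 admin address.)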

Author
Owner

@AndrewBastin commented on GitHub (Aug 19, 2024):

@Brainpitcher @elixxx @Jonathan-Diaz-Rosa #4279 should fix this issue. If you are able to help verify that the fix solves your problem, I would appreciate it. The fix will go live as soon as the PR is merged (with the release of 2024.7.2).

Author
Owner

@Brainpitcher commented on GitHub (Aug 21, 2024):

> @Brainpitcher @elixxx @Jonathan-Diaz-Rosa #4279 should ideally fix this issue. If you are able to help verify whether the fix solves your issue, would appreciate it. The fix will go live as soon as the PR is merged (with the release of 2024.7.2)

Yep, it should work. BTW, I managed to deploy 2024.7.1 in my cluster; it took a single security-context change:

```yaml
securityContext:
  runAsUser: 0
  privileged: true
```

but that is not great for security reasons.
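(A narrower workaround than `privileged: true`, sketched here and not verified against this particular deployment: grant only the capability needed to bind port 80, or lower the unprivileged-port threshold with a pod-level sysctl. Note that whether an added capability actually takes effect for a non-root user depends on the runtime's handling of ambient capabilities, so the sysctl route may be the more reliable of the two.

```yaml
# Option 1 (container-level): add only NET_BIND_SERVICE instead of full privilege.
securityContext:
  capabilities:
    add: ["NET_BIND_SERVICE"]

# Option 2 (pod-level): allow unprivileged processes to bind any port.
# net.ipv4.ip_unprivileged_port_start is a "safe" sysctl in Kubernetes >= 1.22.
securityContext:
  sysctls:
    - name: net.ipv4.ip_unprivileged_port_start
      value: "0"
```

Either way, the container no longer needs to run as root or privileged just to bind port 80.)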

Author
Owner

@AndrewBastin commented on GitHub (Aug 24, 2024):

@Brainpitcher @elixxx @Jonathan-Diaz-Rosa 2024.7.2 was released today; please do let me know if the issue still persists.
