mirror of
https://github.com/dani-garcia/vaultwarden.git
synced 2026-04-26 01:35:54 +03:00
[GH-ISSUE #227] Random 502 errors using https behind nginx-proxy #112
Originally created by @rix1337 on GitHub (Oct 20, 2018).
Original GitHub issue: https://github.com/dani-garcia/vaultwarden/issues/227
I have set up bitwarden_rs (docker!) successfully using http. For security reasons I have then switched over to https supplying a cert/key.
This works locally if I request https://docker_ip:80. My nginx block looks like this:
With the switch to rocket https I can no longer use the web vault. Every time I load the page a random set of resources will not load with a 502 error.
Any help would be highly appreciated. I do not want to run my password manager on insecure http.
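The reporter's actual nginx block was not captured in this mirror. Purely as an illustrative sketch, a typical nginx server block proxying to an HTTPS-enabled bitwarden_rs upstream might look like the following (hostnames, certificate paths, and the `docker_ip` upstream are placeholders, not the reporter's real values):

```nginx
# Hypothetical sketch only -- not the reporter's actual configuration.
server {
    listen 443 ssl http2;
    server_name vault.example.com;

    ssl_certificate     /etc/nginx/certs/vault.example.com.crt;
    ssl_certificate_key /etc/nginx/certs/vault.example.com.key;

    location / {
        # The upstream itself speaks TLS once rocket https is enabled,
        # hence https:// here rather than http://.
        proxy_pass https://docker_ip:80;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Note the double TLS termination this implies: the browser-to-nginx leg and the nginx-to-upstream leg are encrypted separately, which is the setup the 502s appear under.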
@mprasil commented on GitHub (Oct 20, 2018):
Hi, do you see any errors in the docker logs? Did you try to terminate ssl on nginx instead of bitwarden_rs?
@rix1337 commented on GitHub (Oct 20, 2018):
I am terminating bitwarden_rs over https through nginx.
This works unless I follow https://github.com/dani-garcia/bitwarden_rs#enabling-https to secure my local traffic between nginx and bitwarden_rs.
To clarify:
https://docker_ip:80 works fine. Is there any nginx magic required to fix the 502s?
@rix1337 commented on GitHub (Oct 20, 2018):
This I found in the nginx error.log:
[crit] 388#388: *6 SSL_do_handshake() failed (SSL: error:14007072:SSL routines:CONNECT_CR_CERT:bad message type) while SSL handshaking to upstream, client: Y.Y.Y.Y, server: XXX, request: "GET /images/logo-dark@2x.png HTTP/2.0", upstream: "https://docker_ip:80/images/logo-dark@2x.png", host: "XXX", referrer: "https://XXX/"

@mprasil commented on GitHub (Oct 20, 2018):
That's weird. What kind of networking do you use with your docker container? Is nginx also in container?
@rix1337 commented on GitHub (Oct 20, 2018):
Is that setup working for you? Https-enabled bitwarden_rs behind nginx?
Both nginx and bitwarden_rs run in separate docker containers with only the relevant ports (443 for nginx, 80 for bitwarden_rs) passed through. This works for ten other services also proxied by nginx. 😢
@mprasil commented on GitHub (Oct 20, 2018):
I actually don't have https enabled directly on bitwarden_rs; instead I terminate SSL on the proxy. (I'm not using nginx, but the configuration is similar.) I'd say most people don't enable https on the bitwarden_rs side while also doing SSL termination on a proxy on the same server, but this should work.
The error message kinda sounds like you have some intermittent routing issue between nginx and bitwarden_rs. Maybe try changing the bitwarden port to something else (ROCKET_PORT) and see if that still does the same.

@mprasil commented on GitHub (Oct 20, 2018):
Also try to add both containers into some shared network and proxy the connection over that network to avoid any NAT or docker port forwarding.
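The shared-network suggestion above can be sketched as a docker-compose fragment (service and network names are made up for illustration; the image tag reflects what was in use at the time):

```yaml
# Hypothetical sketch: both containers join one user-defined network,
# so nginx reaches the upstream by service name with no NAT or
# published-port forwarding in between.
services:
  nginx:
    image: nginx
    ports:
      - "443:443"
    networks:
      - bw-net
  bitwarden:
    image: mprasil/bitwarden   # image name circa 2018; illustrative only
    networks:
      - bw-net
networks:
  bw-net:
```

With this layout nginx would proxy to https://bitwarden:80 over bw-net rather than to a port published on the host.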
@rix1337 commented on GitHub (Oct 21, 2018):
Thank you for the replies. It seems Rocket is not correctly serving all files through https. I have tried changing the port, as you described.
I have now opened an issue at the Rocket repo.
@rix1337 commented on GitHub (Oct 21, 2018):
Are you serving the files that break with 502's differently than the others?
@mprasil commented on GitHub (Oct 21, 2018):
No, the Vault files are all served the same way, so this shouldn't be a factor. Did you try changing the port?
I'm still not convinced it's issue in Rocket or bitwarden_rs, considering there are no issues when accessing the site directly.
Check if your browser gets all the responses encrypted, but I assume it does.
Also did you try to create network between the containers? What is your current networking setup? Are you using macvlan by any chance?
My money is on some routing issue between the containers as the most likely root cause.
@rix1337 commented on GitHub (Oct 21, 2018):
I tried changing the port, but not the network.
I'll investigate further.
@rix1337 commented on GitHub (Oct 21, 2018):
I have now tried a few things: sharing the same vlan with nginx and bitwarden_rs, and even sharing the bridge net of nginx with bitwarden using --network container:nginx. Even in this last scenario, proxying https://127.0.0.1:80 fails as described above.

Doesn't that look more like an issue with Rocket's TLS implementation? Networking can almost certainly be ruled out.
@mprasil commented on GitHub (Oct 21, 2018):
Thanks for testing that. That's weird indeed. It's strange that it works in browser though.
If you're free for more tests, maybe you could increase the number of worker threads? Or limit the number of connections from nginx side to less than 10.
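The worker-thread experiment suggested above maps onto Rocket's environment overrides, which bitwarden_rs picked up from its environment at the time. A sketch of the relevant container environment (paths illustrative; ROCKET_TLS shown in the format the bitwarden_rs README of that era used):

```
# Hypothetical env-file fragment for the bitwarden_rs container.
# ROCKET_WORKERS raises Rocket's worker thread pool;
# ROCKET_TLS enables https directly on Rocket.
ROCKET_WORKERS=40
ROCKET_TLS={certs="/ssl/bitwarden.crt",key="/ssl/bitwarden.key"}
```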
@rix1337 commented on GitHub (Oct 22, 2018):
Hm, even with 40 worker threads the issue persists. I'll try to get more info over at the Rocket repo.
@mprasil commented on GitHub (Oct 22, 2018):
Thanks for the update. For anyone that wants to follow the Rocket issue, here's the link.
@rix1337 commented on GitHub (Nov 2, 2018):
Could you take a look at the latest reply over there?
@dani-garcia commented on GitHub (Nov 2, 2018):
Well coincidentally, I spent some time yesterday creating a branch using rocket 0.4!
I pushed it now, and it's available here:
https://github.com/dani-garcia/bitwarden_rs/tree/rocket-0.4
It's only lightly tested on the web vault, but it should work.
@mprasil commented on GitHub (Nov 3, 2018):
I've built the mprasil/bitwarden:rocket-0.4-beta image if anyone wants to test it.

@rix1337 commented on GitHub (Nov 22, 2018):
The latest release has this issue fixed for me. Thanks for the kind support!
@mprasil commented on GitHub (Nov 22, 2018):
You mean latest or the rocket-0.4 branch?
@rix1337 commented on GitHub (Nov 23, 2018):
The latest release, in fact. No idea why it suddenly works 😀