[GH-ISSUE #57] Cannot connect to ipsec-vpn-server #51

Closed
opened 2026-03-02 07:11:19 +03:00 by kerem · 14 comments
Owner

Originally created by @Roming22 on GitHub (Feb 19, 2018).
Original GitHub issue: https://github.com/hwdsl2/docker-ipsec-vpn-server/issues/57

Hi,

I'm deploying the container on a CoreOS 1632.3.0 server, and the container seems fine.

I've tried connecting from iOS, and from macOS. On macOS I tried using both the local IP of the CoreOS server and the public IP of the server (the Mac is on the same network).

The error is: "The L2TP-VPN server did not respond. Try reconnecting. If the problem continues, verify your settings and contact your Administrator."

This is the status of the container:

core@coreos ~/vpn $ docker restart ipsec-vpn-server
ipsec-vpn-server

core@coreos ~/vpn $ docker ps | grep ipsec-vpn-server
9a510d3ce2d4        hwdsl2/ipsec-vpn-server   "/opt/src/run.sh"        8 minutes ago       Up 7 minutes        0.0.0.0:500->500/udp, 0.0.0.0:4500->4500/udp   ipsec-vpn-server

core@coreos ~/vpn $ docker logs ipsec-vpn-server

Trying to auto discover IP of this server...

================================================

IPsec VPN server is now ready for use!

Connect to your new VPN with these details:

Server IP: *edited*
IPsec PSK: *edited*
Username: *edited*
Password: *edited*

Write these down. You'll need them to connect!

Important notes:   https://git.io/vpnnotes2
Setup VPN clients: https://git.io/vpnclients

================================================

Redirecting to: /etc/init.d/ipsec start
Starting pluto IKE daemon for IPsec: Initializing NSS database

.
xl2tpd[1]: setsockopt recvref[30]: Protocol not available
xl2tpd[1]: L2TP kernel support not detected (try modprobing l2tp_ppp and pppol2tp)
xl2tpd[1]: xl2tpd version xl2tpd-1.3.8 started on 9a510d3ce2d4 PID:1
xl2tpd[1]: Written by Mark Spencer, Copyright (C) 1998, Adtran, Inc.
xl2tpd[1]: Forked by Scott Balmos and David Stipp, (C) 2001
xl2tpd[1]: Inherited by Jeff McAdams, (C) 2002
xl2tpd[1]: Forked again by Xelerance (www.xelerance.com) (C) 2006-2016
xl2tpd[1]: Listening on IP address 0.0.0.0, port 1701
xl2tpd[1]: death_handler: Fatal signal 15 received

Trying to auto discover IP of this server...

================================================

IPsec VPN server is now ready for use!

Connect to your new VPN with these details:

Server IP: *edited*
IPsec PSK: *edited*
Username: *edited*
Password: *edited*

Write these down. You'll need them to connect!

Important notes:   https://git.io/vpnnotes2
Setup VPN clients: https://git.io/vpnclients

================================================

Redirecting to: /etc/init.d/ipsec start
Starting pluto IKE daemon for IPsec: .
xl2tpd[1]: setsockopt recvref[30]: Protocol not available
xl2tpd[1]: L2TP kernel support not detected (try modprobing l2tp_ppp and pppol2tp)
xl2tpd[1]: xl2tpd version xl2tpd-1.3.8 started on 9a510d3ce2d4 PID:1
xl2tpd[1]: Written by Mark Spencer, Copyright (C) 1998, Adtran, Inc.
xl2tpd[1]: Forked by Scott Balmos and David Stipp, (C) 2001
xl2tpd[1]: Inherited by Jeff McAdams, (C) 2002
xl2tpd[1]: Forked again by Xelerance (www.xelerance.com) (C) 2006-2016
xl2tpd[1]: Listening on IP address 0.0.0.0, port 1701


core@coreos ~/vpn $ docker exec -it ipsec-vpn-server netstat -anput
Active Internet connections (servers and established)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name    
udp        0      0 127.0.0.1:4500          0.0.0.0:*                           463/pluto           
udp        0      0 172.17.0.2:4500         0.0.0.0:*                           463/pluto           
udp        0      0 127.0.0.1:500           0.0.0.0:*                           463/pluto           
udp        0      0 172.17.0.2:500          0.0.0.0:*                           463/pluto           
udp        0      0 0.0.0.0:1701            0.0.0.0:*                           1/xl2tpd            

I do not think there is a firewall issue, as I'm able to transmit data between the Mac and the server using netcat. I'm not sure how to troubleshoot further.
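One caveat on the netcat test: netcat uses TCP by default, while IKE and L2TP run over UDP (ports 500, 4500, and 1701), so a successful TCP transfer does not rule out a UDP-only firewall rule. A minimal UDP probe can be sketched with bash's /dev/udp pseudo-device (the loopback address below is a placeholder; since UDP is connectionless, the write succeeds whether or not anything is listening, so delivery still has to be confirmed on the server side):

```shell
#!/usr/bin/env bash
# Send one datagram to UDP port 500. Replace 127.0.0.1 with the server's IP.
# To confirm the packet actually arrives, run 'tcpdump -ni any udp port 500'
# on the server while sending.
if echo probe > /dev/udp/127.0.0.1/500; then
  echo "datagram sent"
fi
```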

Thank you for your help

kerem closed this issue 2026-03-02 07:11:19 +03:00

@hwdsl2 commented on GitHub (Feb 20, 2018):

Note: Please first set up your own VPN server (https://github.com/hwdsl2/setup-ipsec-vpn).


@Roming22 Hello! To troubleshoot further, please enable Libreswan logs by following the instructions [1], then try connecting to the VPN. After that, check the logs with:

docker exec -it ipsec-vpn-server grep pluto /var/log/auth.log

If you don't see your connection attempt appearing in the logs, then it is a firewall issue.

[1] https://github.com/hwdsl2/docker-ipsec-vpn-server#enable-libreswan-logs


@Roming22 commented on GitHub (Feb 20, 2018):

The output from the logs is:

Feb 20 23:01:23 fa81a3c532e2 pluto[1843]: packet from ****:47077: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW

Any idea what that could be? A Google search turned up mostly empty.

I've configured the VPN with the following values, on the latest macOS and iOS:

  • VPN Type: L2TP Over IPSec
  • Server Address: mysubdomain.ddns.net
  • Account name: $VPN_USER
  • Password: $VPN_PASSWORD
  • Shared secret: $VPN_IPSEC_PSK

@cirience-zz commented on GitHub (Feb 22, 2018):

I have exactly the same problem. I don't even have auth.log


@hwdsl2 commented on GitHub (Feb 22, 2018):

@Roming22 Can you please share more log lines? In particular, all the lines
output when Libreswan starts.

@SHA-256 Please follow the instructions in my earlier comment in this issue
to enable Libreswan logs, then post your logs.


@cirience-zz commented on GitHub (Feb 22, 2018):

Thank you for your answer! That's my log:

Feb 22 19:01:55 63e180cacf32 pluto[2747]: packet from xxx.xxx.xxx.xxx:500: ignoring unknown Vendor ID payload [01528bbbc00696121849ab9a1c5b2a5100000001]
Feb 22 19:01:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: responding to Main Mode from unknown peer xxx.xxx.xxx.xxx on port 500
Feb 22 19:01:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: Oakley Transform [AES_CBC (256), HMAC_SHA1, DH20] refused
Feb 22 19:01:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: Oakley Transform [AES_CBC (128), HMAC_SHA1, DH19] refused
Feb 22 19:01:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: STATE_MAIN_R1: sent MR1, expecting MI2
Feb 22 19:01:56 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: retransmitting in response to duplicate packet; already STATE_MAIN_R1
Feb 22 19:01:57 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: retransmitting in response to duplicate packet; already STATE_MAIN_R1
Feb 22 19:02:00 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: retransmitting in response to duplicate packet; already STATE_MAIN_R1
Feb 22 19:02:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: deleting incomplete state after 60.000 seconds
Feb 22 19:02:55 63e180cacf32 pluto[2747]: "l2tp-psk"[1] xxx.xxx.xxx.xxx #1: deleting state (STATE_MAIN_R1)
Feb 22 19:02:55 63e180cacf32 pluto[2747]: deleting connection "l2tp-psk"[1] xxx.xxx.xxx.xxx instance with peer xxx.xxx.xxx.xxx {isakmp=#0/ipsec=#0}

@hwdsl2 commented on GitHub (Feb 22, 2018):

@SHA-256 The ‘retransmit’ lines in your logs indicate that this is caused
by network issues between your VPN client and server, and not a problem
with the VPN server itself.


@cirience-zz commented on GitHub (Feb 22, 2018):

@hwdsl2 Oh, I'm so stupid. Thanks for your help and effort!


@Roming22 commented on GitHub (Feb 22, 2018):

@hwdsl2 Here's the full log, sorry for not including it earlier.

Feb 22 22:32:48 ef1b007d1a6c ipsec__plutorun: Starting Pluto
Feb 22 22:32:49 ef1b007d1a6c pluto[1295]: NSS DB directory: sql:/etc/ipsec.d
Feb 22 22:32:49 ef1b007d1a6c pluto[1295]: Initializing NSS
Feb 22 22:32:49 ef1b007d1a6c pluto[1295]: Opening NSS database "sql:/etc/ipsec.d" read-only
Feb 22 22:33:38 dbd8f2131341 ipsec__plutorun: Starting Pluto
Feb 22 22:33:39 dbd8f2131341 pluto[481]: NSS DB directory: sql:/etc/ipsec.d
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Initializing NSS
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Opening NSS database "sql:/etc/ipsec.d" read-only
Feb 22 22:33:39 dbd8f2131341 pluto[481]: NSS initialized
Feb 22 22:33:39 dbd8f2131341 pluto[481]: NSS crypto library initialized
Feb 22 22:33:39 dbd8f2131341 pluto[481]: FIPS HMAC integrity support [disabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: libcap-ng support [enabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Linux audit support [disabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Starting Pluto (Libreswan Version 3.23 XFRM(netkey) KLIPS FORK PTHREAD_SETSCHEDPRIO NSS LABELED_IPSEC LIBCAP_NG XAUTH_PAM NETWORKMANAGER CURL(non-NSS)) pid:481
Feb 22 22:33:39 dbd8f2131341 pluto[481]: core dump dir: /run/pluto
Feb 22 22:33:39 dbd8f2131341 pluto[481]: secrets file: /etc/ipsec.secrets
Feb 22 22:33:39 dbd8f2131341 pluto[481]: leak-detective disabled
Feb 22 22:33:39 dbd8f2131341 pluto[481]: NSS crypto [enabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: XAUTH PAM support [enabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: NAT-Traversal support  [enabled]
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Initializing libevent in pthreads mode: headers: 2.0.21-stable (2001500); library: 2.0.21-stable (2001500)
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Encryption algorithms:
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CCM_16          IKEv1:     ESP     IKEv2:     ESP     FIPS  {256,192,*128}  (aes_ccm aes_ccm_c)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CCM_12          IKEv1:     ESP     IKEv2:     ESP     FIPS  {256,192,*128}  (aes_ccm_b)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CCM_8           IKEv1:     ESP     IKEv2:     ESP     FIPS  {256,192,*128}  (aes_ccm_a)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   3DES_CBC            IKEv1: IKE ESP     IKEv2: IKE ESP     FIPS  [*192]  (3des)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   CAMELLIA_CTR        IKEv1:     ESP     IKEv2:     ESP           {256,192,*128}
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   CAMELLIA_CBC        IKEv1: IKE ESP     IKEv2: IKE ESP           {256,192,*128}  (camellia)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_GCM_16          IKEv1:     ESP     IKEv2: IKE ESP     FIPS  {256,192,*128}  (aes_gcm aes_gcm_c)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_GCM_12          IKEv1:     ESP     IKEv2: IKE ESP     FIPS  {256,192,*128}  (aes_gcm_b)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_GCM_8           IKEv1:     ESP     IKEv2: IKE ESP     FIPS  {256,192,*128}  (aes_gcm_a)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CTR             IKEv1: IKE ESP     IKEv2: IKE ESP     FIPS  {256,192,*128}  (aesctr)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CBC             IKEv1: IKE ESP     IKEv2: IKE ESP     FIPS  {256,192,*128}  (aes)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   SERPENT_CBC         IKEv1: IKE ESP     IKEv2: IKE ESP           {256,192,*128}  (serpent)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   TWOFISH_CBC         IKEv1: IKE ESP     IKEv2: IKE ESP           {256,192,*128}  (twofish)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   TWOFISH_SSH         IKEv1: IKE         IKEv2: IKE ESP           {256,192,*128}  (twofish_cbc_ssh)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   CAST_CBC            IKEv1:     ESP     IKEv2:     ESP           {*128}  (cast)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   NULL_AUTH_AES_GMAC  IKEv1:     ESP     IKEv2:     ESP           {256,192,*128}  (aes_gmac)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   NULL                IKEv1:     ESP     IKEv2:     ESP           []
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Hash algorithms:
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MD5                 IKEv1: IKE         IKEv2:                 
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   SHA1                IKEv1: IKE         IKEv2:             FIPS  (sha)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   SHA2_256            IKEv1: IKE         IKEv2:             FIPS  (sha2 sha256)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   SHA2_384            IKEv1: IKE         IKEv2:             FIPS  (sha384)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   SHA2_512            IKEv1: IKE         IKEv2:             FIPS  (sha512)
Feb 22 22:33:39 dbd8f2131341 pluto[481]: PRF algorithms:
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_MD5            IKEv1: IKE         IKEv2: IKE               (md5)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA1           IKEv1: IKE         IKEv2: IKE         FIPS  (sha sha1)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_256       IKEv1: IKE         IKEv2: IKE         FIPS  (sha2 sha256 sha2_256)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_384       IKEv1: IKE         IKEv2: IKE         FIPS  (sha384 sha2_384)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_512       IKEv1: IKE         IKEv2: IKE         FIPS  (sha512 sha2_512)
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Integrity algorithms:
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_MD5_96         IKEv1: IKE ESP AH  IKEv2: IKE ESP AH        (md5 hmac_md5)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA1_96        IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (sha sha1 sha1_96 hmac_sha1)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_512_256   IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (sha512 sha2_512 hmac_sha2_512)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_384_192   IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (sha384 sha2_384 hmac_sha2_384)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   HMAC_SHA2_256_128   IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (sha2 sha256 sha2_256 hmac_sha2_256)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_XCBC_96         IKEv1:     ESP AH  IKEv2:     ESP AH  FIPS  (aes_xcbc)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   AES_CMAC_96         IKEv1:     ESP AH  IKEv2:     ESP AH  FIPS  (aes_cmac)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   NONE                IKEv1:     ESP     IKEv2:     ESP     FIPS  (null)
Feb 22 22:33:39 dbd8f2131341 pluto[481]: DH algorithms:
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP1024            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH        (dh2)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP1536            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH        (dh5)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP2048            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (dh14)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP3072            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (dh15)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP4096            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (dh16)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP6144            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (dh17)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   MODP8192            IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS  (dh18)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   DH19                IKEv1: IKE         IKEv2: IKE ESP AH  FIPS  (ecp_256)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   DH20                IKEv1: IKE         IKEv2: IKE ESP AH  FIPS  (ecp_384)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   DH21                IKEv1: IKE         IKEv2: IKE ESP AH  FIPS  (ecp_521)
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   DH23                IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS
Feb 22 22:33:39 dbd8f2131341 pluto[481]:   DH24                IKEv1: IKE ESP AH  IKEv2: IKE ESP AH  FIPS
Feb 22 22:33:39 dbd8f2131341 pluto[481]: starting up 4 crypto helpers
Feb 22 22:33:39 dbd8f2131341 pluto[481]: started thread for crypto helper 0
Feb 22 22:33:39 dbd8f2131341 pluto[481]: seccomp security for crypto helper not supported
Feb 22 22:33:39 dbd8f2131341 pluto[481]: started thread for crypto helper 1
Feb 22 22:33:39 dbd8f2131341 pluto[481]: started thread for crypto helper 2
Feb 22 22:33:39 dbd8f2131341 pluto[481]: seccomp security for crypto helper not supported
Feb 22 22:33:39 dbd8f2131341 pluto[481]: started thread for crypto helper 3
Feb 22 22:33:39 dbd8f2131341 pluto[481]: Using Linux XFRM/NETKEY IPsec interface code on 4.14.19-coreos
Feb 22 22:33:39 dbd8f2131341 pluto[481]: seccomp security for crypto helper not supported
Feb 22 22:33:39 dbd8f2131341 pluto[481]: seccomp security for crypto helper not supported
Feb 22 22:33:39 dbd8f2131341 pluto[481]: | selinux support is NOT enabled.
Feb 22 22:33:39 dbd8f2131341 pluto[481]: seccomp security not supported
Feb 22 22:33:40 dbd8f2131341 pluto[481]: Failed to add connection "l2tp-psk", esp="3des-sha1,3des-sha2,aes-sha1,aes-sha2,aes256-sha2_512" is invalid: ESP integrity algorithm 'sha2_512' is not supported, enc_alg="aes"(256), auth_alg="sha2_512", modp=""
Feb 22 22:33:40 dbd8f2131341 pluto[481]: Failed to add connection "xauth-psk", esp="3des-sha1,3des-sha2,aes-sha1,aes-sha2,aes256-sha2_512" is invalid: ESP integrity algorithm 'sha2_512' is not supported, enc_alg="aes"(256), auth_alg="sha2_512", modp=""
Feb 22 22:33:40 dbd8f2131341 pluto[481]: listening for IKE messages
Feb 22 22:33:40 dbd8f2131341 pluto[481]: adding interface eth0/eth0 172.17.0.2:500
Feb 22 22:33:40 dbd8f2131341 pluto[481]: adding interface eth0/eth0 172.17.0.2:4500
Feb 22 22:33:40 dbd8f2131341 pluto[481]: adding interface lo/lo 127.0.0.1:500
Feb 22 22:33:40 dbd8f2131341 pluto[481]: adding interface lo/lo 127.0.0.1:4500
Feb 22 22:33:40 dbd8f2131341 pluto[481]: | setup callback for interface lo:4500 fd 19
Feb 22 22:33:40 dbd8f2131341 pluto[481]: | setup callback for interface lo:500 fd 18
Feb 22 22:33:40 dbd8f2131341 pluto[481]: | setup callback for interface eth0:4500 fd 17
Feb 22 22:33:40 dbd8f2131341 pluto[481]: | setup callback for interface eth0:500 fd 16
Feb 22 22:33:40 dbd8f2131341 pluto[481]: loading secrets from "/etc/ipsec.secrets"
Feb 22 22:35:32 dbd8f2131341 pluto[481]: packet from 146.115.170.44:500: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW
Feb 22 22:35:35 dbd8f2131341 pluto[481]: packet from 192.168.2.14:500: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW
Feb 22 22:35:38 dbd8f2131341 pluto[481]: packet from 192.168.2.14:500: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW
Feb 22 22:35:41 dbd8f2131341 pluto[481]: packet from 192.168.2.14:500: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW
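The two "Failed to add connection" lines in this log stand out: if the "l2tp-psk" connection never loads, pluto has no authorized connection for incoming Main Mode packets, which would explain the earlier PSK+IKEV1_ALLOW message. A quick way to pull such lines out of a saved log (the heredoc below stands in for /var/log/auth.log, quoting two lines from above):

```shell
# Filter likely root-cause lines from a saved copy of the log. Only the
# "Failed to add connection" line matches; ordinary startup lines do not.
grep -E 'Failed to add connection|not supported' <<'EOF'
Feb 22 22:33:40 dbd8f2131341 pluto[481]: Failed to add connection "l2tp-psk", esp="3des-sha1,3des-sha2,aes-sha1,aes-sha2,aes256-sha2_512" is invalid: ESP integrity algorithm 'sha2_512' is not supported, enc_alg="aes"(256), auth_alg="sha2_512", modp=""
Feb 22 22:33:40 dbd8f2131341 pluto[481]: listening for IKE messages
EOF
```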
pluto[481]: packet from 192.168.2.14:500: initial Main Mode message received on 172.17.0.2:500 but no connection has been authorized with policy PSK+IKEV1_ALLOW ```
Author
Owner

@hugo187 commented on GitHub (Feb 24, 2018):

I am having the same issue. It seems CoreOS doesn't provide userspace support for IPsec.
See https://github.com/coreos/bugs/issues/558
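
[Editor's note] The container log in the issue description also warns: `L2TP kernel support not detected (try modprobing l2tp_ppp and pppol2tp)`. A hedged sketch of a check one could run on the CoreOS host (not inside the container) to see whether those modules are loaded; the module names come straight from that log line, nothing else is assumed:

```shell
# Check whether the L2TP kernel modules named in the xl2tpd warning are
# loaded on the host. /proc/modules lists currently loaded modules.
if grep -qE '^(l2tp_ppp|pppol2tp)\b' /proc/modules; then
  echo "l2tp kernel modules loaded"
else
  echo "l2tp kernel modules not loaded; try: sudo modprobe l2tp_ppp; sudo modprobe pppol2tp"
fi
```

If `modprobe` itself fails, the host kernel likely doesn't ship L2TP support at all, which would match the observation above.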


@a1liz commented on GitHub (Feb 27, 2018):

I'm hitting the same issue. I had been using this service for several weeks, but it suddenly stopped working.
Here's my log:

Feb 27 02:13:49 7e3f8955bace pluto[2269]: deleting connection "xauth-psk"[1] 111.73.177.102 instance with peer 111.73.177.102 {isakmp=#0/ipsec=#0}
Feb 27 02:16:11 7e3f8955bace pluto[2269]: "l2tp-psk"[1] 111.73.177.102 #2: responding to Main Mode from unknown peer 111.73.177.102 on port 2062
Feb 27 02:16:11 7e3f8955bace pluto[2269]: "l2tp-psk"[1] 111.73.177.102 #2: STATE_MAIN_R1: sent MR1, expecting MI2
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: not enough room in input packet for ISAKMP Message (remain=0, sd->size=28)
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: Received packet with mangled IKE header - dropped
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: not enough room in input packet for ISAKMP Message (remain=0, sd->size=28)
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: Received packet with mangled IKE header - dropped
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: not enough room in input packet for ISAKMP Message (remain=0, sd->size=28)
Feb 27 02:16:11 7e3f8955bace pluto[2269]: packet from 111.73.177.102:2062: Received packet with mangled IKE header - dropped
Feb 27 02:16:15 7e3f8955bace pluto[2269]: "l2tp-psk"[1] 111.73.177.102 #2: retransmitting in response to duplicate packet; already STATE_MAIN_R1
Feb 27 02:16:18 7e3f8955bace pluto[2269]: "l2tp-psk"[1] 111.73.177.102 #2: retransmitting in response to duplicate packet; already STATE_MAIN_R1
Feb 27 02:16:21 7e3f8955bace pluto[2269]: "l2tp-psk"[1] 111.73.177.102 #2: retransmitting in response to duplicate packet; already STATE_MAIN_R1

@Roming22 commented on GitHub (Feb 27, 2018):

@hugo187 Did you try running it within the toolbox? I wonder if one could run Docker inside the toolbox (which seems to support userspace IPsec) and run ipsec-vpn-server on that. Let me know if you manage to get it working on CoreOS, as I'd prefer to keep running that distro.

@hwdsl2 Following hugo187's comment, I've switched to Ubuntu Server 17.10, and it works fine on my iPhone using the IPsec VPN configuration.


@hwdsl2 commented on GitHub (May 5, 2018):

@Roming22 Hello! Your IPsec logs indicate that the VPN cipher `aes256-sha2_512` is not supported under CoreOS. Edit `/etc/ipsec.conf` and remove the `,aes256-sha2_512` part from the `ike=` and `phase2alg=` lines, then run `service ipsec restart`.
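
[Editor's note] The edit above can be scripted; a minimal sketch, assuming the container is named `ipsec-vpn-server` as in the issue (the sample `ike=`/`phase2alg=` values below are illustrative only, so verify against your actual `/etc/ipsec.conf` before running anything in place):

```shell
# Demonstrate the edit on sample lines first: strip ",aes256-sha2_512"
# from the ike= and phase2alg= values.
printf 'ike=aes256-sha2,aes-sha1,aes256-sha2_512\nphase2alg=aes-sha1,aes-sha2,aes256-sha2_512\n' \
  | sed 's/,aes256-sha2_512//g'

# Against the real container, the same edit would be roughly:
#   docker exec ipsec-vpn-server sed -i 's/,aes256-sha2_512//g' /etc/ipsec.conf
#   docker exec ipsec-vpn-server service ipsec restart
```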


@tswsxk commented on GitHub (Apr 10, 2019):

Same issue here, and it occurred suddenly.
Here is my log:

Apr 10 15:04:34 dm pluto[25112]: packet from 114.214.246.164:500: ignoring unknown Vendor ID payload [01528bbbc00696121849ab9a1c5b2a5100000001]
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: responding to Main Mode from unknown peer 114.214.246.164 on port 500
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: Oakley Transform [AES_CBC (256), HMAC_SHA1, DH20] refused
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: Oakley Transform [AES_CBC (128), HMAC_SHA1, DH19] refused
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: STATE_MAIN_R1: sent MR1, expecting MI2
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: STATE_MAIN_R2: sent MR2, expecting MI3
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: Peer ID is ID_IPV4_ADDR: '114.214.246.164'
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: STATE_MAIN_R3: sent MR3, ISAKMP SA established {auth=PRESHARED_KEY cipher=aes_256 integ=sha group=MODP2048}
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: Configured DPD (RFC 3706) support not enabled because remote peer did not advertise DPD support
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: the peer proposed: 202.38.75.5/32:17/1701 -> 114.214.246.164/32:17/0
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: NAT-Traversal: received 2 NAT-OA. Using first, ignoring others
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: responding to Quick Mode proposal {msgid:01000000}
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8:     us: 202.38.75.5:17/1701
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8:   them: 114.214.246.164:17/1701
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: STATE_QUICK_R1: sent QR1, inbound IPsec SA installed, expecting QI2 transport mode {ESP/NAT=>0x83e86832 <0x4c36aeda xfrm=AES_CBC_256-HMAC_SHA1_96 NATOA=114.214.246.164 NATD=114.214.246.164:4500 DPD=active}
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: Configured DPD (RFC 3706) support not enabled because remote peer did not advertise DPD support
Apr 10 15:04:34 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: STATE_QUICK_R2: IPsec SA established transport mode {ESP/NAT=>0x83e86832 <0x4c36aeda xfrm=AES_CBC_256-HMAC_SHA1_96 NATOA=114.214.246.164 NATD=114.214.246.164:4500 DPD=active}
Apr 10 15:05:09 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #7: received Delete SA(0x83e86832) payload: deleting IPSEC State #8
Apr 10 15:05:09 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: deleting other state #8 (STATE_QUICK_R2) and sending notification
Apr 10 15:05:09 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164 #8: ESP traffic information: in=696B out=0B
Apr 10 15:05:09 dm pluto[25112]: "l2tp-psk"[4] 114.214.246.164: deleting connection "l2tp-psk"[4] 114.214.246.164 instance with peer 114.214.246.164 {isakmp=#0/ipsec=#0}
Apr 10 15:05:09 dm pluto[25112]: packet from 114.214.246.164:4500: received and ignored empty informational notification payload

@parsalotfy commented on GitHub (Sep 24, 2019):

@tswsxk did you finally come up with a solution for this problem?
I have the same problem and couldn't find a way to fix it :(
I've googled a lot.
