[GH-ISSUE #847] [INTEGRATION] Creating a DB with k8s setup fails #303

Closed
opened 2026-02-27 08:16:32 +03:00 by kerem · 26 comments

Originally created by @zelogik on GitHub (Feb 23, 2024).
Original GitHub issue: https://github.com/lldap/lldap/issues/847

Describe the bug
On k8s, deployed with the help of the Evantage-WS/lldap-kubernetes repo.
On the first start, I got this error:

2024-02-23T08:53:29.537552051+00:00  ERROR       ┕━ 🚨 [error]:  | error: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`
Error: while creating the admin user

Caused by:
    Error setting up admin login/account: Error creating admin user: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`: Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email: error returned from database: (code: 2067) UNIQUE constraint failed: users.email

To Reproduce
The error occurs whether or not LLDAP_LDAP_USER_DN or LLDAP_LDAP_USER_EMAIL are set.
The filesystem has been tested with both Longhorn and hostPath.

Expected behavior
Creation of the first admin account.

Logs

k logs -n private pod/lldap-7df9848684-prmzv 
> Setup permissions..
> Starting lldap..

Loading configuration from /data/lldap_config.toml
Configuration: Configuration {
    ldap_host: "0.0.0.0",
    ldap_port: 3890,
    http_host: "0.0.0.0",
    http_port: 17170,
    jwt_secret: ***SECRET***,
    ldap_base_dn: "dc=XXXXXX,dc=XXXXXX",
    ldap_user_dn: UserId(
        CaseInsensitiveString(
            "admium",
        ),
    ),
    ldap_user_email: "XXXXXX@XXXXXX",
    ldap_user_pass: ***SECRET***,
    force_ldap_user_pass_reset: false,
    force_update_private_key: false,
    database_url: sqlite:///data/users.db?mode=rwc,
    ignored_user_attributes: [],
    ignored_group_attributes: [],
    verbose: true,
    key_file: "server_key",
    key_seed: Some(
        ***SECRET***,
    ),
    smtp_options: MailOptions {
        enable_password_reset: false,
        from: None,
        reply_to: None,
        server: "localhost",
        port: 587,
        user: "",
        password: ***SECRET***,
        smtp_encryption: Tls,
        tls_required: None,
    },
    ldaps_options: LdapsOptions {
        enabled: false,
        port: 6360,
        cert_file: "cert.pem",
        key_file: "key.pem",
    },
    http_url: Url {
        scheme: "http",
        cannot_be_a_base: false,
        username: "",
        password: None,
        host: Some(
            Domain(
                "localhost",
            ),
        ),
        port: None,
        path: "/",
        query: None,
        fragment: None,
    },
    server_setup: None,
}
WARNING: A key_seed was given, we will ignore the server_key and generate one from the seed!
2024-02-23T08:53:29.337024987+00:00  INFO     set_up_server [ 4.97ms | 32.07% / 100.00% ]
2024-02-23T08:53:29.337038123+00:00  INFO     ┝━ i [info]: Starting LLDAP version 0.5.1-alpha
2024-02-23T08:53:29.339072505+00:00  DEBUG    ┝━ get_schema_version [ 238µs | 4.78% ]
2024-02-23T08:53:29.434694648+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: Some(SchemaVersion(9))
2024-02-23T08:53:29.438421469+00:00  DEBUG    ┝━ list_groups [ 713µs | 14.35% ] filters: Some(DisplayName(GroupName("lldap_admin")))
2024-02-23T08:53:29.442364722+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(1), display_name: GroupName("lldap_admin"), creation_date: 2024-02-23T08:53:27.010666311, uuid: Uuid("7fbe6c01-3b12-3e13-8a34-a34197156d91"), users: [], attributes: [] }]
2024-02-23T08:53:29.442387661+00:00  DEBUG    ┝━ list_groups [ 564µs | 11.36% ] filters: Some(DisplayName(GroupName("lldap_password_manager")))
2024-02-23T08:53:29.443976533+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(2), display_name: GroupName("lldap_password_manager"), creation_date: 2024-02-23T08:53:27.031676113, uuid: Uuid("63b051ca-88a4-3a23-8b05-9cb4b38f8886"), users: [], attributes: [] }]
2024-02-23T08:53:29.444017482+00:00  DEBUG    ┝━ list_groups [ 536µs | 10.79% ] filters: Some(DisplayName(GroupName("lldap_strict_readonly")))
2024-02-23T08:53:29.530572723+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(3), display_name: GroupName("lldap_strict_readonly"), creation_date: 2024-02-23T08:53:27.035768268, uuid: Uuid("386b3399-f04f-3a25-b21a-2cbf4e78582e"), users: [], attributes: [] }]
2024-02-23T08:53:29.530610180+00:00  DEBUG    ┝━ list_users [ 786µs | 15.83% ] filters: Some(MemberOf(GroupName("lldap_admin"))) | _get_groups: false
2024-02-23T08:53:29.534475869+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: []
2024-02-23T08:53:29.534483993+00:00  WARN     ┝━ 🚧 [warn]: Could not find an admin user, trying to create the user "admin" with the config-provided password
2024-02-23T08:53:29.534499515+00:00  DEBUG    ┕━ create_user [ 538µs | 10.82% ] request: CreateUserRequest { user_id: UserId(CaseInsensitiveString("XXXXXX")), email: Email("XXXXXX@XXXXXX.XXXXXX"), display_name: Some("Administrator"), first_name: None, last_name: None, avatar: None, attributes: [] } | user_id: "XXXXXX"
2024-02-23T08:53:29.537552051+00:00  ERROR       ┕━ 🚨 [error]:  | error: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`
Error: while creating the admin user

Caused by:
    Error setting up admin login/account: Error creating admin user: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`: Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email: error returned from database: (code: 2067) UNIQUE constraint failed: users.email

@nitnelave commented on GitHub (Feb 23, 2024):

It looks like your db already contains a user with no email address. If you don't have anything important in there, can you delete the DB? And make sure to grab the logs when you restart LLDAP; the first-run logs might be able to tell us how we got there.

@zelogik commented on GitHub (Feb 23, 2024):

I have removed the volume, deleted the manifest and re-applied it (I even changed the name of my PVC).
Same problem.

WARNING: A key_seed was given, we will ignore the server_key and generate one from the seed!
2024-02-23T09:33:13.432173919+00:00  INFO     set_up_server [ 3.61ms | 25.21% / 100.00% ]
2024-02-23T09:33:13.432189868+00:00  INFO     ┝━ i [info]: Starting LLDAP version 0.5.1-alpha
2024-02-23T09:33:13.434667152+00:00  DEBUG    ┝━ get_schema_version [ 129µs | 3.57% ]
2024-02-23T09:33:13.437503615+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: Some(SchemaVersion(9))
2024-02-23T09:33:13.438719793+00:00  DEBUG    ┝━ list_groups [ 478µs | 13.26% ] filters: Some(DisplayName(GroupName("lldap_admin")))
2024-02-23T09:33:13.441709985+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(1), display_name: GroupName("lldap_admin"), creation_date: 2024-02-23T09:33:10.090674621, uuid: Uuid("42eb52ba-9235-342a-919f-396fd35c48ba"), users: [], attributes: [] }]
2024-02-23T09:33:13.441726709+00:00  DEBUG    ┝━ list_groups [ 698µs | 19.34% ] filters: Some(DisplayName(GroupName("lldap_password_manager")))
2024-02-23T09:33:13.442830278+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(2), display_name: GroupName("lldap_password_manager"), creation_date: 2024-02-23T09:33:10.102386335, uuid: Uuid("61c78dee-15f0-323f-bc6e-0c85ef3bfceb"), users: [], attributes: [] }]
2024-02-23T09:33:13.442843433+00:00  DEBUG    ┝━ list_groups [ 581µs | 16.10% ] filters: Some(DisplayName(GroupName("lldap_strict_readonly")))
2024-02-23T09:33:13.443837732+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(3), display_name: GroupName("lldap_strict_readonly"), creation_date: 2024-02-23T09:33:10.113176064, uuid: Uuid("8c32feee-5efa-3111-b297-3ef4de7075e2"), users: [], attributes: [] }]
2024-02-23T09:33:13.443858025+00:00  DEBUG    ┝━ list_users [ 404µs | 11.20% ] filters: Some(MemberOf(GroupName("lldap_admin"))) | _get_groups: false
2024-02-23T09:33:13.447519759+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: []
2024-02-23T09:33:13.447525576+00:00  WARN     ┝━ 🚧 [warn]: Could not find an admin user, trying to create the user "admin" with the config-provided password
2024-02-23T09:33:13.447537419+00:00  DEBUG    ┕━ create_user [ 408µs | 11.32% ] request: CreateUserRequest { user_id: UserId(CaseInsensitiveString("XXXXXX")), email: Email("XXXXXX@XXXXXX.XXXXXX"), display_name: Some("Administrator"), first_name: None, last_name: None, avatar: None, attributes: [] } | user_id: "XXXXXX"
2024-02-23T09:33:13.530890639+00:00  ERROR       ┕━ 🚨 [error]:  | error: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`
Error: while creating the admin user

Caused by:
    Error setting up admin login/account: Error creating admin user: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`: Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email: error returned from database: (code: 2067) UNIQUE constraint failed: users.email

@nitnelave commented on GitHub (Feb 23, 2024):

Either your database still exists, or it's not the very first run of LLDAP (do you have something that auto-restarts it?).

You can see that because it's reading the current DB schema version and getting version 9 (instead of no version for an empty DB).

@nitnelave commented on GitHub (Feb 23, 2024):

Where is the file "/data/users.db" from, and can you delete it?

@zelogik commented on GitHub (Feb 23, 2024):

---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  labels:
    app: lldap
  name: lldap-data-pvc
  namespace: private
spec:
  storageClassName: longhorn
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 100Mi


---
... Deployment ....
          volumeMounts:
            - mountPath: /data
              name: lldap-data
      restartPolicy: Always
      volumes:
        - name: lldap-data
          persistentVolumeClaim:
            claimName: lldap-data-pvc

And I have deleted the PVC ...

I understand what you mean... but I don't know where /data/users.db comes from, since I create a fresh PV...

ingress.networking.k8s.io "grafana-private-ingress" deleted
secret "lldap-credentials" deleted
persistentvolumeclaim "lldap-data-pvc" deleted
deployment.apps "lldap" deleted
service "lldap-service" deleted

Last edit: k get persistentvolume -A returns no lldap-data-pvc.

@zelogik commented on GitHub (Feb 23, 2024):

For the test I use lldap/lldap:latest. I have checked the Dockerfile and entry-points.sh, and if I'm right they normally just fix permissions.

@nitnelave commented on GitHub (Feb 23, 2024):

Sorry, I don't know enough about Kubernetes to help you... You can try changing the database path (change users.db to something else)
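
For reference, a minimal sketch of overriding the database path via the container env in the Deployment (assuming the LLDAP_DATABASE_URL variable used elsewhere in this thread; the filename users-dev.db is only an example):

env:
  - name: LLDAP_DATABASE_URL
    value: sqlite:///data/users-dev.db?mode=rwc  # point lldap at a fresh SQLite file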

@zelogik commented on GitHub (Feb 23, 2024):

I have set LLDAP_DATABASE_URL, so the config now shows database_url: sqlite:///data/users-dev.db?mode=rwc,

And I have exactly the same problem...

But thanks for your help, and for your software, which "looks" really good and light (compared to FreeIPA/OpenLDAP...)

@martadinata666 commented on GitHub (Feb 23, 2024):

I think this is some incompatibility between the storage type and SQLite. Something like NFS can't be used with an SQLite database. For reference, https://github.com/Evantage-WS/lldap-kubernetes/blob/main/lldap-persistentvolumeclaim.yaml uses local-path instead of Longhorn, which is networked storage, I guess?
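
For comparison, a node-local PVC in the style of that repo might look like this (a sketch; local-path assumes the Rancher local-path provisioner is installed in the cluster):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: lldap-data-pvc
spec:
  storageClassName: local-path  # node-local storage instead of networked Longhorn
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 100Mi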

@zelogik commented on GitHub (Feb 23, 2024):

Yes, I was thinking about that, but Longhorn is not NFS, and I haven't seen a "bug" with SQLite on Longhorn storage.
I need to recheck with hostPath to verify whether it's Longhorn/SQLite or lldap...

@zelogik commented on GitHub (Feb 23, 2024):

I have replaced the Longhorn storage with hostPath storage (more or less the same as with Docker).

I made sure that the directory has no config.toml or users.db (users-dev.db in my case).

Then applied the manifest:

volumes:
        # - name: lldap-data
        #   persistentVolumeClaim:
        #     claimName: lldap-pvc
        - name: lldap-data
          hostPath:
            path: /tmp/data/
            type: Directory

A simple ls /tmp/data on the k8s node where lldap is running shows that
users-dev.db is created... but same error. I'm pulling my hair out...

me("Administrator"), first_name: None, last_name: None, avatar: None, attributes: [] } | user_id: "admium"
2024-02-23T11:11:59.296988424+00:00  ERROR       ┕━ 🚨 [error]:  | error: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`
Error: while creating the admin user

Caused by:
    Error setting up admin login/account: Error creating admin user: Database error: `Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email`: Execution Error: error returned from database: (code: 2067) UNIQUE constraint failed: users.email: error returned from database: (code: 2067) UNIQUE constraint failed: users.email

@zelogik commented on GitHub (Feb 23, 2024):

Got news!
Tested:

  • lldap/lldap:2023-11-05-alpine : same error
  • lldap/lldap:v0.4.3-alpine : [info]: Starting the API/web server on port 17170 "Working"
  • lldap/lldap:v0.5.0-alpine : Not working ... same error 2067

So it seems like there is a "feature"/bug introduced between 0.4.3 and 0.5.0.

Edit: upgraded 0.4.3 to 2023-02-08-alpine and got:

Note: If you just migrated from <=v0.4 to >=v0.5, the previous version did not support key_seed, so it was falling back onto a key file. Remove the seed from the configuration.

@nitnelave commented on GitHub (Feb 26, 2024):

Sorry to push back again on this issue, but as long as we don't understand what's going on with your setup, we can't debug the issue in LLDAP. In particular, as I mentioned earlier, these logs cannot be the logs for a first start of LLDAP with an empty database. We should at the very least see DB migration messages, and user/group creations for the built-in users (admin, admin groups and so on).

@zelogik commented on GitHub (Feb 26, 2024):

Yes, I understand the problem, and I'm trying to get the same output as with:

docker compose up

lldap-1  | 2024-02-26T10:01:45.686631809+00:00  INFO     ┝━ i [info]: Starting LLDAP version 0.5.1-alpha
lldap-1  | 2024-02-26T10:01:45.710152111+00:00  INFO     ┝━ i [info]: Upgrading DB schema from version 1
lldap-1  | 2024-02-26T10:01:45.710154055+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 2
lldap-1  | 2024-02-26T10:01:45.716325170+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 3
lldap-1  | 2024-02-26T10:01:45.724658684+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 4
lldap-1  | 2024-02-26T10:01:45.729460821+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 5
lldap-1  | 2024-02-26T10:01:45.736737778+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 6
lldap-1  | 2024-02-26T10:01:45.741667446+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 7
lldap-1  | 2024-02-26T10:01:45.746038259+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 8
lldap-1  | 2024-02-26T10:01:45.749844935+00:00  INFO     ┝━ i [info]: Upgrading DB schema to version 9
lldap-1  | 2024-02-26T10:01:45.770137424+00:00  WARN     ┝━ 🚧 [warn]: Could not find lldap_admin group, trying to create it
lldap-1  | 2024-02-26T10:01:45.775474173+00:00  WARN     ┝━ 🚧 [warn]: Could not find lldap_password_manager group, trying to create it
lldap-1  | 2024-02-26T10:01:45.779813065+00:00  WARN     ┝━ 🚧 [warn]: Could not find lldap_strict_readonly group, trying to create it

But you can note that the k8s deployment doesn't work with lldap versions after v0.4.3.

@nitnelave commented on GitHub (Feb 26, 2024):

Alright, until proven otherwise, I'll assume the fault is in the k8s setup rather than in LLDAP itself, so downgrading from bug to integration + documentation.

@onedr0p commented on GitHub (Mar 15, 2024):

@zelogik maybe you need to set up a startup probe, in case Kubernetes is killing the pod before the DB migrations have completed? If that's the case, the pod would restart, which might lead to the issue you are seeing since the migration hasn't completely finished.
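
As a minimal sketch of such a probe for the container spec, assuming the healthcheck subcommand from the image's HEALTHCHECK line quoted later in this thread (timings are arbitrary placeholders):

startupProbe:
  exec:
    command: ["/app/lldap", "healthcheck", "--config-file", "/data/lldap_config.toml"]
  periodSeconds: 5
  failureThreshold: 30  # allow a couple of minutes for first-start DB migrations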

@zelogik commented on GitHub (Mar 18, 2024):

@onedr0p, thanks for the suggestion. I have already done that, and even tested with an initContainer, with the same problem.
One test I haven't done (since the lldap docker image had a build error 2 weeks ago) was to remove this line from the Dockerfile:
HEALTHCHECK CMD ["/app/lldap", "healthcheck", "--config-file", "/data/lldap_config.toml"]

I don't know whether that could be the problem with k8s (i.e. a race condition with the CMD [run...]).
Regards

@nitnelave commented on GitHub (Mar 18, 2024):

The healthcheck shouldn't affect anything: it's essentially sending a ping on the HTTP/LDAP(S) interfaces to see if they're up. It wouldn't set up the DB, for instance. The interfaces only start listening after everything else is set up, including the DB.

@zelogik commented on GitHub (Mar 18, 2024):

..ok :/
I'm looking everywhere I can for a race condition,
and I don't know how the HEALTHCHECK cmd is handled in k8s.

@zelogik commented on GitHub (Mar 18, 2024):

my bad....

2024-03-18T10:18:45.765034085+00:00  DEBUG    ┝━ list_groups [ 166µs | 0.30% ] filters: Some(DisplayName(GroupName("lldap_admin")))
2024-03-18T10:18:45.765367794+00:00  DEBUG    │  ┕━ 🐛 [debug]:  | return: [Group { id: GroupId(1), display_name: GroupName("lldap_admin"), creation_date: 2024-03-18T10:18:45.656059292, uuid: Uuid("2febe2ab-f390-34b6-a4dc-60e8ada49718"), users: [], attributes: [] }]
2024-03-18T10:18:45.765372387+00:00  DEBUG    ┝━ add_user_to_group [ 75.9µs | 0.14% ] user_id: "admium"
2024-03-18T10:18:45.777226702+00:00  INFO     ┝━ i [info]: Starting the LDAP server on port 3890
2024-03-18T10:18:45.777271814+00:00  DEBUG    ┝━ get_jwt_blacklist [ 27.8µs | 0.05% ]
2024-03-18T10:18:45.777366770+00:00  INFO     ┕━ i [info]: Starting the API/web server on port 17170
2024-03-18T10:18:45.777488572+00:00  INFO     i [info]: starting 1 workers
2024-03-18T10:18:45.777494767+00:00  INFO     i [info]: Actix runtime found; starting in Actix runtime
2024-03-18T10:18:45.778031628+00:00  INFO     i [info]: DB Cleanup Cron started

It seems like it's working now.
I have changed two things: the latest version 2024-03-07-alpine vs the "old" 2024-02-08-alpine,
and I have increased the CPU limit from 100m to 4000m and the memory from 50M to 500M...

I'll check whether the resource limit was the problem and then close the issue.
Sorry for that...

@zelogik commented on GitHub (Mar 18, 2024):

So the memory limit was the problem: with a 50M limit the app "crashes" at init without logging anything "useful". With 100M of RAM it seems to work well.

And when running:

NAME                     CPU(cores)   MEMORY(bytes)   
lldap-78ccb659c5-mg9bc   1m           3Mi

@nitnelave
I think I can close the issue?

@nitnelave commented on GitHub (Mar 18, 2024):

Yes, that sounds like the culprit. We need more RAM (by design) when setting/checking a password ("hashing" the password is intentionally resource intensive).

We can close this.

Maybe you want to add to the LLDAP K8s docs a note about the minimum resources required?
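
As a sketch for such a note, using only the figures reported in this thread (a 50M limit OOMs at first start, 100M works, and steady-state usage is a few MiB), a resources block might look like:

resources:
  requests:
    cpu: 100m
    memory: 10Mi
  limits:
    cpu: 400m
    memory: 100Mi  # 50Mi is not enough for the first start (admin password hashing)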

@zelogik commented on GitHub (Mar 18, 2024):

@nitnelave: It's not really docs, more a recent, working and sane k8s manifest compared to the good base from Evantage-WS/lldap-kubernetes.

I don't know whether we want to create new k8s-specific documentation or update Evantage-WS/lldap-kubernetes.

A simple working k8s manifest as an example (requires ingress-nginx + Longhorn):

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: lldap-private-ingress
  annotations:
    nginx.ingress.kubernetes.io/use-regex: "true"
    nginx.ingress.kubernetes.io/rewrite-target: /$2
    nginx.ingress.kubernetes.io/proxy-body-size: 10m

spec:
  ingressClassName: private
  rules:
  - host: private.example.com
    http:
      paths:
        - pathType: ImplementationSpecific
          path: /ldap(/|$)(.*)
          backend:
            service:
              name: lldap-service
              port:
                name: http

---
# EXAMPLE ONLY: DO NOT PUT SECRETS IN PLAIN TEXT!!
# prefer kustomize | sops | .env | vault | etc. for production use

apiVersion: v1
kind: Secret
metadata:
  name: lldap-credentials
type: Opaque
stringData:
  LLDAP_UID: "1000"
  LLDAP_GID: "1000"
  LLDAP_TZ: Europe/Paris
  LLDAP_JWT_SECRET: # view lldap documentation "generate_secrets.sh"
  LLDAP_LDAP_BASE_DN: dc=example,dc=com
  LLDAP_LDAP_USER_PASS: ImaBadPassword
  LLDAP_KEY_SEED: # view lldap documentation "generate_secrets.sh"
  LLDAP_LDAP_USER_DN: admin
  LLDAP_LDAP_USER_EMAIL: admin@example.com
  LLDAP_DATABASE_URL: sqlite:///data/users.db?mode=rwc
  LLDAP_HTTP_URL: "https://example.com/ldap/"

---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  labels:
    app: lldap
  name: lldap-conf-pvc
spec:
  storageClassName: longhorn
  accessModes:
    - ReadWriteOnce
  resources:
    requests:
      storage: 20Mi

---
apiVersion: apps/v1
kind: Deployment
metadata:
  annotations:
    lldap: https://github.com/nitnelave/lldap
  labels:
    app: lldap
  name: lldap
spec:
  replicas: 1
  selector:
    matchLabels:
      app: lldap
  strategy:
    # type: Recreate
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1
      maxUnavailable: 0
  template:
    metadata:
      annotations:
        lldap: https://github.com/nitnelave/lldap
        # k8s: https://github.com/Evantage-WS/lldap-kubernetes
      labels:
        app: lldap
    spec:
      containers:
        - name: lldap
          env:
            - name: UID
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_UID
            - name: GID
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_GID
            - name: TZ
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_TZ
            - name: LLDAP_JWT_SECRET
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_JWT_SECRET
            - name: LLDAP_HTTP_URL
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_HTTP_URL
            - name: LLDAP_LDAP_BASE_DN
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_LDAP_BASE_DN
            - name: LLDAP_KEY_SEED
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_KEY_SEED
            - name: LLDAP_LDAP_USER_DN
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_LDAP_USER_DN
            - name: LLDAP_LDAP_USER_PASS
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_LDAP_USER_PASS
            - name: LLDAP_LDAP_USER_EMAIL
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_LDAP_USER_EMAIL
            - name: LLDAP_DATABASE_URL
              valueFrom:
                secretKeyRef:
                  name: lldap-credentials
                  key: LLDAP_DATABASE_URL
            - name: LLDAP_VERBOSE
              value: "true"

          image: lldap/lldap:2024-03-07-alpine
          imagePullPolicy: IfNotPresent
          resources:
            limits:
              cpu: 400m
              memory: 100Mi # Don't go lower; with 50Mi the lldap init phase runs out of memory
            requests:
              cpu: 100m
              memory: 10Mi
          ports:
            - containerPort: 3890
            - containerPort: 6360
            - containerPort: 17170
          volumeMounts:
            - mountPath: /data
              name: lldap-conf
      restartPolicy: Always
      terminationGracePeriodSeconds: 120
      volumes:
        - name: lldap-conf
          persistentVolumeClaim:
            claimName: lldap-conf-pvc


---
apiVersion: v1
kind: Service
metadata:
  annotations:
    lldap: https://github.com/nitnelave/lldap
    # k8s: https://github.com/Evantage-WS/lldap-kubernetes
  labels:
    app: lldap-service
  name: lldap-service
  namespace: private
spec:
  ports:
    - name: ldap
      port: 389
      targetPort: 3890
    - name: ldaps
      port: 636
      targetPort: 6360
    - name: http
      port: 1717
      targetPort: 17170
  selector:
    app: lldap
    

@zelogik commented on GitHub (Mar 18, 2024):

> Yes, that sounds like the culprit. We need more RAM (by design) when setting/checking a password ("hashing" the password is intentionally resource intensive).

The problem is not really the "design" but the logging: even verbose mode didn't say anything except the strange "[debug]: | return: Some(SchemaVersion(9))" on the first run.

> We can close this.

Done (too early?)

> Maybe you want to add to the LLDAP K8s docs a note about the minimum resources required?

It's not really k8s-specific in the end; any production server using docker/k8s/a distro with allocation limits (cpu/ram/...) could have that problem, no?

And @nitnelave thanks for the work on lldap!

@martadinata666 commented on GitHub (Mar 18, 2024):

It's just hard to say. As the container is terminated directly, even if there were logs about the OOM, they wouldn't show. Any program with allocation limits will act the same. Take nodejs, known as a memory hog: as the program reaches the allocation limit, the container is terminated without any hint of OOM; it's just dead.

edit: on docker this is usually indicated by exit code (137), which could be OOM or some other issue. Still unclear; essentially it's just "container died with a non-zero exit".

@nitnelave commented on GitHub (Mar 18, 2024):

You should probably get some logs about the OOM from k8s, no? Maybe it should be more visible.
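
For reference, when a container is killed for exceeding its memory limit, Kubernetes records it in the pod status (visible with kubectl describe pod or kubectl get pod -o yaml); a sketch of the relevant fragment:

status:
  containerStatuses:
    - name: lldap
      lastState:
        terminated:
          exitCode: 137
          reason: OOMKilled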
