[GH-ISSUE #860] Gitlab Alerts #602

Open
opened 2026-02-25 23:42:59 +03:00 by kerem · 2 comments

Originally created by @mtesch-um on GitHub (Jul 14, 2023).
Original GitHub issue: https://github.com/healthchecks/healthchecks/issues/860

[Draft - still not sure what the requirements should be here... but wanted a place to gather and share... maybe this just turns into documentation about how to set this up to save others from working through it or having a suboptimal setup (or me from having a suboptimal setup!).]

It would be nice to have an integration with GitLab Alerts. It can (sort of) be done manually right now with webhooks, but it's not obvious how to do it, and it may not quite have full feature support(?)

The Alert webhook interface documentation: https://docs.gitlab.com/ee/operations/incident_management/integrations.html#http-endpoints

To set up a GitLab webhook integration in `https://healthchecks.io/integrations/<uuid>/edit/`:

- Create a Webhook Alert under `https://gitlab.com/<path-to-project>/-/settings/operations`. Take note of the "Webhook URL" and "Authorization key".
- `Execute when a check goes down`: select `POST`, url=`https://gitlab.com/<path-to-project>/alerts/notify/<alert-slug>/<some-numbers>.json` (the "Webhook URL" from above)

  Request Body=

  ```
  {"start_time": "$NOW", "status": "down", "name": "$NAME"}
  ```

  Request Headers=

  ```
  Authorization: Bearer <"Authorization key" from alert setup>
  Content-Type: application/json
  ```

- `Execute when a check goes up`: select `POST`, url=`https://gitlab.com/<path-to-project>/alerts/notify/<alert-slug>/<some-numbers>.json` (the "Webhook URL" from above)

  Request Body=

  ```
  {"end_time": "$NOW", "status": "up", "name": "$NAME"}
  ```

  Request Headers=

  ```
  Authorization: Bearer <"Authorization key" from alert setup>
  Content-Type: application/json
  ```
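For reference, the shape of the two request bodies above can be sketched in Python (a hypothetical helper, not part of Healthchecks; in the real integration Healthchecks expands `$NOW` and `$NAME` before sending, and as I understand it GitLab treats a payload containing `end_time` as resolving the alert):

```python
import json
from datetime import datetime, timezone


def gitlab_alert_payload(check_name: str, resolved: bool) -> dict:
    """Build a JSON body like the webhook bodies above.

    `start_time` opens an alert; `end_time` (with no new start_time)
    signals that the alert should be resolved.
    """
    now = datetime.now(timezone.utc).isoformat()
    payload = {"name": check_name, "status": "up" if resolved else "down"}
    if resolved:
        payload["end_time"] = now
    else:
        payload["start_time"] = now
    return payload


down = gitlab_alert_payload("db-backup", resolved=False)
up = gitlab_alert_payload("db-backup", resolved=True)
print(json.dumps(down))
print(json.dumps(up))
```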

One thing that appears to be missing (I haven't figured it out yet, anyway) is a per-failure "fingerprint", which I think would allow each Healthchecks failure to map 1:1 to an Alert and Incident in GitLab.


@cuu508 commented on GitHub (Jul 14, 2023):

PagerDuty webhook payloads have an `incident_key` field, which I think is similar to the fingerprint; it is used for grouping notifications about the "same thing" together. In the PagerDuty integration we use the check's code as the incident key.
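If this integration adopted the same convention, the webhook body could carry the check's code as a GitLab `fingerprint` (GitLab's HTTP alert endpoint accepts a `fingerprint` field for grouping alerts). A minimal sketch; the helper name and the UUID below are made up:

```python
import json


def payload_with_fingerprint(check_code: str, check_name: str, now: str) -> dict:
    # Group all notifications for the same check under one GitLab alert,
    # analogous to PagerDuty's incident_key.
    return {
        "start_time": now,
        "status": "down",
        "name": check_name,
        "fingerprint": check_code,  # the check's UUID, as in the PagerDuty integration
    }


body = payload_with_fingerprint(
    "0c8983c9-9d73-4b3a-b4b6-dcbecb0ee630",  # hypothetical check code
    "db-backup",
    "2023-07-14T12:00:00Z",
)
print(json.dumps(body))
```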


@mtesch-um commented on GitHub (Jul 14, 2023):

👍 I'll let it run as is for a few days and see how it works in relation to the Alerts/Incident management built into gitlab.

I suspect we might want separate Alerts for separate failures, even for the same check code. The fingerprint could perhaps be the last good `rid` before the failure, some event identifier for the webhook-down event, or the cron schedule time that triggered the last failure (even if it's an UP event)?
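One way to get separate Alerts per outage, along the lines suggested above, would be to derive the fingerprint from the check code plus the timestamp of the down event that opened the outage; the matching "up" request would then have to reuse the same down timestamp so the correct alert gets resolved rather than a new one opened. A sketch (the helper and scheme are hypothetical, not something Healthchecks does today):

```python
import hashlib


def per_failure_fingerprint(check_code: str, down_event_time: str) -> str:
    """Distinct fingerprint per outage: check code plus the timestamp of
    the down event. Both the "down" and the later "up" notification for
    the same outage must use the same down_event_time."""
    raw = f"{check_code}:{down_event_time}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]


fp1 = per_failure_fingerprint("0c8983c9", "2023-07-14T12:00:00Z")
fp2 = per_failure_fingerprint("0c8983c9", "2023-07-15T03:00:00Z")
print(fp1 != fp2)  # different outages produce different fingerprints
```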
