[GH-ISSUE #111] No robots.txt #68

Closed
opened 2026-02-27 10:15:35 +03:00 by kerem · 1 comment

Originally created by @FliegendeWurst on GitHub (Feb 3, 2025).
Original GitHub issue: https://github.com/matze/wastebin/issues/111

It would be good to provide a `/robots.txt` for (search engine) crawlers. Currently `https://bin.bloerg.net/robots.txt` returns a 404.

At least `/burn/:id` URLs should be blocked, I think.
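For concreteness, a minimal sketch of what such a file could look like (which paths to block is of course up for discussion):

```text
User-agent: *
Disallow: /burn/
```

And assuming the axum-based stack wastebin uses, a hypothetical route for serving it might look roughly like this; the handler name, bind address, and robots.txt body are illustrative, not a proposal of the final implementation:

```rust
use axum::{routing::get, Router};

// Illustrative handler: returns a static robots.txt that keeps crawlers
// away from burn-after-reading URLs.
async fn robots_txt() -> &'static str {
    "User-agent: *\nDisallow: /burn/\n"
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/robots.txt", get(robots_txt));
    // Bind address is illustrative only.
    let listener = tokio::net::TcpListener::bind("127.0.0.1:8088").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```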

kerem closed this issue 2026-02-27 10:15:35 +03:00

@matze commented on GitHub (Feb 3, 2025):

Makes sense, but I have no idea whether this is actually a breaking change for some installations (e.g. someone has set up a specific rule in NGINX for that URL). But I guess the proxy itself would handle that route before it proxies to the application.
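For reference, a proxy rule of that kind would indeed keep working, since an exact-match location is resolved at the proxy before the request ever reaches the application. A hypothetical NGINX snippet (paths are illustrative):

```nginx
# Illustrative proxy-level override: serves /var/www/wastebin/robots.txt
# directly, so whatever the application returns for /robots.txt is never hit.
location = /robots.txt {
    root /var/www/wastebin;
}
```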
