[GH-ISSUE #222] Multiple DNS records for domain #190

Closed
opened 2026-02-26 05:33:04 +03:00 by kerem · 9 comments
Owner

Originally created by @SamuelePilleri on GitHub (Jan 29, 2020).
Original GitHub issue: https://github.com/ngosang/trackerslist/issues/222

A single DNS name can point to multiple IPs (usually for load-balancing and redundancy purposes). This is not handled by the script behind the bot.

![image](https://user-images.githubusercontent.com/13276940/73388664-f2a9c880-42d2-11ea-8852-bef160731588.png)
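For context, a name used for DNS round-robin load balancing resolves to several addresses at once, which is exactly what the script would need to handle. A minimal sketch using only Python's standard library (the `resolve_all` helper is hypothetical, not part of the bot):

```python
import socket

def resolve_all(host: str) -> set[str]:
    """Return every IPv4 address a hostname resolves to.

    A domain behind DNS round-robin load balancing will
    return more than one address here.
    """
    infos = socket.getaddrinfo(host, None, family=socket.AF_INET)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # sockaddr[0] is the IP address string.
    return {info[4][0] for info in infos}

# "localhost" resolves to the loopback address; a load-balanced
# tracker domain would yield several IPs instead.
print(resolve_all("localhost"))
```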

kerem closed this issue 2026-02-26 05:33:04 +03:00

@Fluffkin commented on GitHub (Feb 1, 2020):

Thanks for the info. Probably better to avoid the IP lists then.

@SamuelePilleri commented on GitHub (Feb 1, 2020):

I think the IP lists are useful; many countries censor torrent websites and trackers at the DNS level.

The point of my issue was discussing whether *all* IPs for each domain should be added (though a few are better than nothing) and possibly improving the "find duplicates" function.

@Fluffkin commented on GitHub (Feb 1, 2020):

There's no realistic solution though. If you include 2 IPs for the same tracker, then your torrent client is effectively spamming that tracker with duplicate requests. If you include only 1 IP, you're evading that tracker's load balancing. That wouldn't matter so much if we all got a random IP for that tracker, but the lists will only ever give 1 IP for it.

Either way, using IP lists isn't "being a nice tracker user". Sure, maintain the IP lists for people who have no other choice, but the pragmatic solution is to use URLs if you can.

@SamuelePilleri commented on GitHub (Feb 1, 2020):

Good point.

On the "duplicates" side, do you think two domain names should be considered equal if they resolve to the same *set* of IPs? In the picture, is there any advantage in keeping both of the highlighted entries? The only one I can think of is one being censored and the other not, but at that point one should maybe use the IP list instead.
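The "same set of IPs" test proposed here is a pure grouping problem once the resolved sets are known. A sketch under that assumption (names and example data are hypothetical; the caller supplies the resolved IP sets):

```python
from collections import defaultdict

def group_by_ip_set(resolved: dict[str, frozenset[str]]) -> list[list[str]]:
    """Group tracker URLs that resolve to the exact same set of IPs.

    `resolved` maps each tracker URL to its resolved IP set.
    Returns only the groups with more than one URL, i.e. the
    candidate duplicates.
    """
    groups: defaultdict[frozenset[str], list[str]] = defaultdict(list)
    for url, ips in resolved.items():
        groups[ips].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

resolved = {
    "udp://tracker-a.example:1337/announce": frozenset({"10.0.0.1", "10.0.0.2"}),
    "udp://tracker-b.example:1337/announce": frozenset({"10.0.0.1", "10.0.0.2"}),
    "udp://tracker-c.example:6969/announce": frozenset({"10.0.0.9"}),
}
# tracker-a and tracker-b resolve identically, so they form one group.
print(group_by_ip_set(resolved))
```

Using `frozenset` as the dictionary key makes the "same set" comparison order-insensitive for free.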

@Fluffkin commented on GitHub (Feb 1, 2020):

As a tracker user, it's pointless having two URLs that resolve to the same tracker. They would probably be ranked next to each other in reliability/speed, so if I decide to add the top 3 URLs and 2 of those resolve to the same tracker, I've only actually added 2 trackers.

Trackers don't care that much if you spam them; they'll just occasionally ban your IP for a while when you exceed a certain request rate. But as a user that's not ideal, especially if you keep triggering a temporary block because you didn't realise two of your tracker entries resolve to the same tracker.

From the tracker server's side, it's equally pointless having a user with 2 URLs for the same thing: you're just being asked the same query twice. The only reasoning behind it is probably redundancy; if one is censored or its DNS entry expires, the other will still work.

Since the whole point of these bot-generated lists is to check speed and reliability, you can safely blacklist the duplicate-resolving URLs from testing until the original URL stops working, then try the duplicate instead. The speed results should be identical after the initial DNS lookup. It would be handy if there were a list of duplicate-resolving URLs, not intended for adding to your client, just as information.

The IP lists are problematic. They seem like a great idea to remove DNS lookups, but for a tracker that resolves to, say, 4 IPs, the speed/latency test is no longer valid if it was run on the URL rather than a specific IP, because testing the URL means the requests were load-balanced across the 4 IPs. Any individual IP may not be as fast or reliable, especially if people start ignoring the load balancing and referring to a single IP only.

Users (and tracker lists like this one) are part of the problem. Say that for a tracker URL resolving to multiple IPs you decided to speed test the IPs instead of the URL, and your results suggest IP-3 is the fastest. You put it in the list and publish it. Hours later it's getting far more requests from multiple users than it was when you were testing it, and it's no longer the fastest, most reliable IP.

I guess my solution would be to remove trackers that resolve to multiple IPs from the IP list entirely. (Leave a single entry for them in the URL lists, because that's how they are supposed to work.) Alternatively, instead of a static IP tracker text list, have a URL that returns the list but selects a single random IP, for each tracker that resolves to multiple IPs, on every query. It's not perfect, but it's a better solution than listing a single fixed IP.
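The "random single IP per query" idea amounts to re-rolling one choice per multi-IP tracker every time the list is served. A rough sketch of what such a backend might render (function name and data are hypothetical, not an existing endpoint):

```python
import random

def render_ip_list(trackers: dict[str, list[str]], rng: random.Random) -> list[str]:
    """Build one IP-list response, picking a single random IP for
    every tracker that resolves to multiple addresses.

    `trackers` maps a tracker name to its resolved IPs. Each incoming
    query would call this again, so the load spreads across the IPs
    over many requests even though each response lists only one.
    """
    return [rng.choice(ips) for ips in trackers.values()]

trackers = {
    "tracker-a.example": ["10.0.0.1", "10.0.0.2", "10.0.0.3"],
    "tracker-b.example": ["10.0.1.1"],
}
# Two trackers in, two lines out; tracker-a's line varies per call.
print(render_ip_list(trackers, random.Random()))
```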

@SamuelePilleri commented on GitHub (Feb 1, 2020):

> my solution would be to remove trackers that resolve to multiple IPs from the IP list entirely. (Leave a single entry for them in the URL lists, because that's how they are supposed to work.)

That would be an appropriate solution (for the URL list).

> or [...] have a URL that returns the list, but selects a random single resolving IP for each tracker that resolves to multiple IPs each time it gets a query

That would be best tracker-wise, as you're effectively load-balancing the trackers, but it requires changing the scope of this repository and writing some backend, hosted somewhere, to do that for you.

Not sure which is better.

@Fluffkin commented on GitHub (Feb 1, 2020):

The ideal solution is for more people to turn on DHT/PEX.

@ngosang commented on GitHub (Feb 2, 2020):

The bot is already removing duplicate trackers; the process is more complex than comparing IP/port.
I added an extra check to remove duplicates from the IP lists.
DISCLAIMER: I advise against using the IP lists unless there is no other solution.

@ngosang commented on GitHub (Feb 3, 2020):

Fixed #178 too. The IP lists should be clean now.
