Mirror of https://github.com/ngosang/trackerslist.git
synced 2026-04-25 17:35:57 +03:00
[GH-ISSUE #222] Multiple DNS records for a domain #190
Originally created by @SamuelePilleri on GitHub (Jan 29, 2020).
Original GitHub issue: https://github.com/ngosang/trackerslist/issues/222
A single DNS name can point to multiple IPs (usually for load-balancing and redundancy purposes). This is not handled by the script behind the bot.
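The multi-record behaviour can be observed with a short Python sketch (the helper name is hypothetical; `localhost` is used so the snippet runs offline, but a load-balanced tracker domain would typically return several addresses):

```python
import socket

def resolve_all_ips(hostname: str) -> set[str]:
    """Return every IPv4 address a DNS name currently resolves to.

    A name behind round-robin DNS yields several addresses here,
    which is exactly the case the bot's script was not handling.
    """
    try:
        infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    except socket.gaierror:
        return set()  # name does not resolve at all
    # getaddrinfo returns one tuple per (family, type, proto) combination;
    # the IP address is the first element of the sockaddr tuple
    return {sockaddr[0] for *_, sockaddr in infos}

# 'localhost' gives a single address; a load-balanced tracker domain
# would usually populate this set with two or more entries
print(resolve_all_ips("localhost"))
```

Note that consecutive lookups of a round-robin name may return the addresses in a different order, or even different subsets, which is what makes naive "resolve once, keep one IP" deduplication unreliable.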
@Fluffkin commented on GitHub (Feb 1, 2020):
Thanks for the info. Probably better to avoid the IP lists then.
@SamuelePilleri commented on GitHub (Feb 1, 2020):
I think the IP lists are useful; many countries censor torrent websites and trackers at the DNS level.
The point of my issue was to discuss whether all IPs for each domain should be added (though a few are better than nothing) and possibly to improve the "find duplicates" function.
@Fluffkin commented on GitHub (Feb 1, 2020):
There's no realistic solution, though. If you include two IPs for the same tracker, your torrent client is effectively spamming that tracker twice with requests. If you include only one IP, you're bypassing that tracker's load balancing. That wouldn't matter so much if everyone got a random IP for the tracker, but the lists only ever give one IP per tracker.
Either way, using IP lists isn't "being a nice tracker user". Sure, maintain the IP lists for people who have no other choice, but the pragmatic solution is to use URLs if you can.
@SamuelePilleri commented on GitHub (Feb 1, 2020):
Good point.
On the "duplicates" side, do you think two domain names should be considered equal if they resolve to the same set of IPs? In the picture, is there any advantage in keeping both of the highlighted entries? The only case I can think of is one being censored and the other not, but at that point one should perhaps use the IP list instead.
@Fluffkin commented on GitHub (Feb 1, 2020):
As a tracker user, it's pointless having two URLs that resolve to the same tracker. They would probably be ranked next to each other in reliability/speed, so if I decide to add the top three URLs and two of those resolve to the same tracker, I've only actually added two trackers.
Trackers don't care that much if you spam them; they'll just occasionally ban your IP for a while when you exceed a certain request rate. But as a user that's not ideal, especially if you keep triggering a temporary block because you didn't realise two of your tracker entries resolve to the same tracker.
From the tracker server's side, it's equally pointless to have a user with two URLs for the same thing: it's just being asked the same query twice. The only reasoning behind it is probably redundancy. If one is censored or its DNS entry expires, the other will still work.
Since the whole point of these bot-generated lists is to check speed and reliability, you can safely blacklist the duplicate-resolving URLs from testing until the original URL stops working, then try the duplicate instead. The speed results should be identical after the initial DNS lookup. It would be handy to have a list of duplicate-resolving URLs, not intended for adding to clients, just as information.
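The set-of-IPs equality test discussed above can be sketched in a few lines of Python. This is illustrative only (the function names are hypothetical, and @ngosang notes below that the bot's real deduplication is more involved than comparing IP/port):

```python
import socket
from collections import defaultdict

def resolve_ips(hostname: str) -> frozenset[str]:
    """Resolve a hostname to the frozen set of its current IPv4 addresses."""
    try:
        infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    except socket.gaierror:
        return frozenset()  # name does not resolve
    return frozenset(sockaddr[0] for *_, sockaddr in infos)

def group_duplicates(hostnames: list[str]) -> list[list[str]]:
    """Group hostnames that resolve to the exact same set of IPs."""
    groups: dict[frozenset[str], list[str]] = defaultdict(list)
    for name in hostnames:
        ips = resolve_ips(name)
        if ips:  # skip unresolvable names
            groups[ips].append(name)
    # only sets shared by more than one hostname count as duplicates
    return [names for names in groups.values() if len(names) > 1]
```

For example, `group_duplicates(["localhost", "127.0.0.1"])` groups both entries together, since the name and the literal address resolve to the same IP set. Note the comparison is a snapshot: round-robin DNS answers can vary between lookups, so a production dedupe pass would need to tolerate partial overlap.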
The IP lists are problematic. They seem like a great idea to remove DNS lookups, but for a tracker that resolves to, say, four IPs, the speed/latency test is no longer valid if it was run on the URL rather than the IP, because testing the URL means testing load-balanced across the four IPs. Any individual IP may not be as fast or reliable, especially if people start ignoring the load balancing and hitting a single IP only.
Users (and tracker lists like this one) are part of the problem. Say that, for a tracker URL that resolves to multiple IPs, you decided to speed-test the IPs instead of the URL, and your results suggest IP-3 is the fastest. You put it in the list and publish it. Hours later it's getting far more requests than it was when you were testing it, and it's no longer the fastest or most reliable IP.
I guess my solution would be to remove trackers that resolve to multiple IPs from the IP list entirely (leaving a single entry for them in the URL lists, because that's how they are supposed to work). That, or instead of a static IP tracker text list, have a URL that returns the list but picks a single random resolving IP for each multi-IP tracker on every query. It's not perfect, but it's a better solution than listing a single fixed IP.
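The "random single IP per query" idea can be sketched as a URL rewriter (hypothetical helper, not part of the repository; `urlsplit` parses the `udp://` tracker scheme fine because it handles any `//` netloc generically):

```python
import random
import socket
from urllib.parse import urlsplit, urlunsplit

def pick_random_ip(tracker_url: str) -> str:
    """Rewrite a tracker URL to use one randomly chosen resolved IPv4 address.

    Served fresh on every request, this would roughly preserve the DNS
    round-robin that a static single-IP list defeats.
    """
    parts = urlsplit(tracker_url)
    if not parts.hostname:
        return tracker_url
    try:
        infos = socket.getaddrinfo(parts.hostname, None, family=socket.AF_INET)
    except socket.gaierror:
        return tracker_url  # unresolvable: leave the URL untouched
    ips = sorted({sockaddr[0] for *_, sockaddr in infos})
    chosen = random.choice(ips)
    netloc = f"{chosen}:{parts.port}" if parts.port else chosen
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
```

For instance, `pick_random_ip("udp://localhost:6969/announce")` yields `udp://127.0.0.1:6969/announce`; a multi-IP tracker domain would get a different address on different calls. This only helps for trackers reached by IP; HTTP(S) trackers behind virtual hosting or SNI may reject requests addressed by bare IP, which is one more reason the URL lists are preferable.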
@SamuelePilleri commented on GitHub (Feb 1, 2020):
That would be an appropriate solution (for the URL list).
It would be the best option tracker-wise, since you're effectively load-balancing the trackers yourself, but it requires changing the scope of this repository and writing some backend, hosted somewhere, to do it for you.
Not sure which is better.
@Fluffkin commented on GitHub (Feb 1, 2020):
The ideal solution is for more people to turn on DHT/PEX.
@ngosang commented on GitHub (Feb 2, 2020):
The bot is already removing duplicate trackers. The process is more complex than comparing IP/port.
I added an extra check to remove duplicates from IP lists.
DISCLAIMER: I advise against using the IP lists unless there is no other solution.
@ngosang commented on GitHub (Feb 3, 2020):
Fixed #178 too. The IP lists should be clean now.