Jackett: [Enhancement] Pagination - Jackett search results vs. tracker(s) search results – Discrepancies & limitations

Created on 21 Jan 2017 · 9 comments · Source: Jackett/Jackett

I apologize in advance for the relatively long description ahead :)

All trackers have a certain number of torrents displayed per page when browsing or searching for torrents on the tracker’s website. Most trackers allow users to change this number from their profile settings, with some allowing up to 100 torrents per page, or more (e.g. 255 in HD-Torrents).

When configuring a tracker in Jackett, the user should ideally set this number to the maximum limit allowed by the tracker _since Jackett only displays the search results that appear on the first page of the tracker’s website._

This limitation is perhaps the only disadvantage of searching in Jackett vs. trackers’ websites.

It’s usually not a big problem when searching a single configured tracker, or when using very specific keywords. However, it becomes a problem when searching across all trackers and/or with less specific keywords. It’s even more of an issue with trackers that only allow a small number of results per page (say, 25) without letting the user change this number. On the tracker’s website, the user would simply browse through multiple pages of search results; unfortunately, this is not possible in Jackett.

I have come across instances where such omissions/discrepancies have left out some crucial search results in Jackett that I managed to find when searching the tracker directly.

I believe Jackett shows the search results using the default order displayed by the trackers, so if I sort the search results by size after they’re displayed in Jackett, that does not mean that I will actually see the absolute largest torrents first. It just means that what I will see first are the largest torrents that were displayed on the first page of the tracker’s search results. Any torrents that appear by default on the 2nd page of the tracker(s), and beyond, are not shown in Jackett at all.

Can something be done about this issue so that Jackett users can see the exact same results when searching in Jackett vs. the tracker, regardless of whether the search is done in one or all configured trackers, and irrespective of the torrents per page limit imposed by each tracker?

I understand that Jackett should not overload the trackers, but is there a workaround? Something like showing the results in tabs or pages that are loaded gradually in Jackett, but still allow users to view and sort across ALL search results?

Example: If I’m searching 8 trackers individually (using their websites) for ‘The Walking Dead S06’, and the combined number of results I found across all 8 trackers is, say, 470 torrents, I should see 470 torrents in Jackett as well when I search all 8 (together), and I should be able to sort through all 470 results, regardless of the torrents per page limitations.

Thank you.



Labels: Core · Enhancement · Needs C#

All 9 comments

You're absolutely right.
Jackett should definitely support pagination; currently only a few trackers support it.
In a simple implementation we would just honor the "limit" Newznab argument and continue crawling more pages until we reach the limit.

Additionally, supporting the "offset" argument shouldn't be too hard either. That way clients (including Jackett's manual search) could implement a "load more results" feature without having to get all results in one go.

There's one minor issue: the Newznab API doesn't support "pages", only "total" (the number of results). Not all trackers return the exact number of results, and some don't even return the number of pages; in that case we can't even guess the total number of results.
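
A minimal sketch of what honoring "limit"/"offset" by crawling successive tracker pages could look like. The IPagedIndexer, FetchPageAsync, and ReleaseInfo names below are placeholders for illustration, not Jackett's actual indexer interface:

```csharp
// Illustrative sketch only: crawl tracker pages until the Newznab "limit"
// (plus "offset") is satisfied or the tracker stops returning results.
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public class ReleaseInfo { /* placeholder for Jackett's release model */ }

public interface IPagedIndexer
{
    // Assumed per-page fetch; the page numbering and signature are made up.
    Task<IReadOnlyList<ReleaseInfo>> FetchPageAsync(string query, int page);
}

public static class PaginationSketch
{
    public static async Task<List<ReleaseInfo>> SearchAsync(
        IPagedIndexer indexer, string query, int limit, int offset)
    {
        var results = new List<ReleaseInfo>();
        var page = 0;

        // Keep fetching pages until we have enough rows to cover offset + limit,
        // or the tracker returns an empty page (no more results).
        while (results.Count < offset + limit)
        {
            var pageResults = await indexer.FetchPageAsync(query, page++);
            if (pageResults.Count == 0)
                break;
            results.AddRange(pageResults);
        }

        // Apply offset/limit the way a Newznab client would expect.
        return results.Skip(offset).Take(limit).ToList();
    }
}
```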

@kaso17 Thanks for responding. I'd like a couple of points clarified, since I don't understand all the technicalities of how things work:

  1. So most trackers do not support pagination even though they all display results on multiple pages. Could you elaborate on this point, please?

  2. Why is it essential to know the number of results (or the number of pages on which search results appear)? Can't results simply keep loading until everything from all trackers has been displayed in Jackett?

1.) That's the support status in Jackett (there are only <5 trackers where you can configure the number of pages to parse).

2.) It's not essential, just a limitation of the Newznab format. We'll probably have to extend it a little to handle all tracker-specific cases.

@kaso17 Is this related to #250?

Definitely.

A reminder to myself:
We have to consider trackers which return unrelated results too.
These return a fixed number of torrents per page, but we might filter some of them out.
In that case, if we get a request with offset != 0, we can't easily tell on which page we have to continue.
We would have to keep state (which page/row to continue from).
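
A rough sketch of that state-keeping idea, reusing the hypothetical IPagedIndexer/ReleaseInfo placeholders from the earlier sketch (illustration only, not Jackett's actual code). Because unrelated rows are filtered out, a client-supplied offset no longer maps to a tracker page, so we remember the page and row where the previous request stopped:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Where the previous request stopped on the tracker's side.
public class PaginationState
{
    public int Page { get; set; }       // next tracker page to fetch
    public int RowInPage { get; set; }  // first unconsumed row on that page
}

public static class StatefulPaginationSketch
{
    public static async Task<List<ReleaseInfo>> ContinueAsync(
        IPagedIndexer indexer, string query, PaginationState state,
        Func<ReleaseInfo, bool> isRelevant, int limit)
    {
        var results = new List<ReleaseInfo>();
        while (results.Count < limit)
        {
            var pageResults = await indexer.FetchPageAsync(query, state.Page);
            if (pageResults.Count == 0)
                break;

            // Resume from the saved row; unrelated results are dropped here,
            // which is exactly why a plain offset can't locate the right page.
            while (state.RowInPage < pageResults.Count && results.Count < limit)
            {
                var release = pageResults[state.RowInPage++];
                if (isRelevant(release))
                    results.Add(release);
            }

            if (state.RowInPage >= pageResults.Count)
            {
                state.Page++;
                state.RowInPage = 0;
            }
        }
        return results;
    }
}
```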

Would greatly appreciate this; it's truncating the search results from 1337x.

Is anyone willing to work on this? @kaso17 @garfield69

It certainly would be great to have this feature, but until it is implemented, could you at least add info somewhere that Jackett only returns results from the first page?

I did notice when adding some indexers that there is a message, such as:

"For best results, change the 'Torrents per page' setting to maximum in your profile settings page."

but I thought "best results" referred to the performance of the search, i.e. that it would just parse the results faster; I assumed it would take a short pause before getting the next result page so it would not reveal itself as a bot. I would recommend replacing the word "best" with "more" and showing this message when configuring any new indexer, so that people know how the search is performed.

This will be a lot easier to implement once we can get the interface updated to support IAsyncEnumerable from the C# 8 specification. That will let us skip having to parse all the results and return them every time we run a query. We'll also have to roll our own custom SkipEnumerators that are aware of pages when skipping results via Offset.

I've been working on TV Store as a tracker, and part of the problem I was having getting it refactored cleanly was related to the fact that it supports pages (and also that the site's horrible API, which we're using, is unintuitive, non-standard, and undocumented, but that's off topic).

Anyway, there's a lot of other cleanup needed before we're ready to start implementing base pagination support that all trackers can fall back on. Once that base is implemented, each tracker would be able to support unlimited results through lazy evaluation of enumerable types. The lazy evaluation would keep us from DDoSing the underlying websites without having to introduce any kind of delay within each tracker, or having each tracker count results or pages during a query.
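
For reference, a rough sketch of what that lazy, page-aware streaming could look like with C# 8's IAsyncEnumerable, again using the hypothetical IPagedIndexer/ReleaseInfo placeholders from above rather than Jackett's real interfaces:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class LazyPaginationSketch
{
    // Yields results one at a time; the next tracker page is only requested
    // when a consumer actually enumerates past the end of the current page.
    public static async IAsyncEnumerable<ReleaseInfo> SearchLazilyAsync(
        IPagedIndexer indexer, string query)
    {
        for (var page = 0; ; page++)
        {
            var pageResults = await indexer.FetchPageAsync(query, page);
            if (pageResults.Count == 0)
                yield break;

            foreach (var release in pageResults)
                yield return release;
        }
    }
}
```

A consumer that only takes the first N results never triggers requests for later pages, which is how lazy evaluation would avoid hammering the underlying site without adding explicit delays.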
