It's about https://readthedocs.org/dashboard/import
When I open DevTools, I see that API requests for getting a list of repos to show fail with 502 after 10-12 seconds of waiting.
Import page shows the list of projects to import
Import page shows nothing in the list and doesn't even report any issues accessing the API to users
UPD: after a lot of retries I actually managed to see that list...
I'm seeing 502s too, we need to improve the performance of this :/
Actually this is only happening with the GitHub integration :thinking:
FTR: I've managed to add a project I wanted to add :)
Additional info: I've got Uptime Robot set up to watch another project's availability, and occasionally I get notifications about it being down. Usually, by the time I check it manually it's already up again. But at least once, when I checked immediately, I saw a 502 on the project page as well. It doesn't happen very often, but I thought I'd mention it...
I'm only getting timeouts and 502s from the import, it seems. After waiting for several minutes while it's syncing, I'm not hopeful it will ever finish.
Maybe I've got too many repositories linked to my account?
In addition to the 200+ repos I've got myself, I'm also in the Conda organisation, which adds 6000+ repos.
Yeah, that's my guess as well. Maybe RTD does sequential requests to GitHub and times out...
I think I've got a lot of repos across orgs as well but probably less than 6k.
Yeah... honestly, I'd prefer to simply filter those out. Even if it does work, it's a pain to search through all of them, and they're not useful for most things. The conda-forge repos are metapackages only, so they're never interesting for RTD if you ask me.
I'm pretty sure the problem has to do with the code that attempts to limit users' ability to import a repository that is already in Read the Docs. Specifically, this code here, which is called from here. This results in a full table scan on the Project table for each project in the pagination (so ~15 full table scans per page).
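To make the per-page cost concrete, here's a minimal, self-contained sketch of the pattern described above (not RTD's actual code; the table and column names are made up for illustration): an unindexed existence check run once per repo in the page forces ~15 full table scans, while a single set-based query over the whole page scans once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# No index on repo_url, so every equality lookup is a full table scan.
conn.execute("CREATE TABLE projects (repo_url TEXT)")
conn.executemany(
    "INSERT INTO projects VALUES (?)",
    [(f"https://github.com/org/repo{i}",) for i in range(10_000)],
)

# One page of ~15 repos returned by the provider API.
page = [f"https://github.com/org/repo{i}" for i in range(5, 20)]

# Slow pattern: one query (one full table scan) per repo in the page.
imported_slow = {
    url
    for url in page
    if conn.execute(
        "SELECT 1 FROM projects WHERE repo_url = ?", (url,)
    ).fetchone()
}

# Fix: one query covering the whole page, so the table is scanned once.
placeholders = ",".join("?" * len(page))
rows = conn.execute(
    f"SELECT repo_url FROM projects WHERE repo_url IN ({placeholders})",
    page,
)
imported_fast = {r[0] for r in rows}

assert imported_slow == imported_fast
```

In a Django codebase the equivalent fix is one `filter(repo_url__in=page)` queryset for the page instead of a per-repo `exists()` call.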
I'm working on a fix.
This is now deployed. Please let me know if it's working better. 👍
@ericholscher tested a few minutes ago. It's very responsive now. Great work!
Yes! Finally works again for me. The site simply gave up after trying for 15 minutes this afternoon.
[edit] Even with a "huge" organisation such as conda-forge, with thousands of repos, a request only takes about 400-500ms. Next request... a search feature... even though it's faster now, it's still annoying to have to paginate through tens of pages.