See also https://www.drupal.org/node/1475510
Problem: when running composer install the user is prompted to input a GitHub auth token. That is a serious burden for the 3000+ Drupal core contributors who will need to run composer install, and a significant waste of developer time. Some users do not want to create a GitHub account.
It seems that composer only prompts for a Github auth token in interactive shells? I have never seen that prompt on Travis CI for PHP projects for example.
Proposed solution: make composer install on interactive shells behave the same as in non-interactive ones: always fall back to installing from source (git clone) and never prompt the user for a GitHub token. Users should be able to opt in to faster downloads with GitHub auth tokens, but this should never be an interactive usability WTF.
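For reference, a minimal sketch of what that opt-in already looks like today, assuming the standard `composer config` mechanism for storing a GitHub OAuth token (the token value is a placeholder):

```sh
# Store a GitHub OAuth token globally so Composer can make authenticated
# API calls, which have a much higher rate limit than anonymous ones.
composer config -g github-oauth.github.com <your-token>
```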
It only asks for it when reaching the API rate limit, as authenticated calls have a much higher rate limit.
Never asking for it would leave users unaware that they should authenticate against GitHub to be able to keep using the dist downloads.
A better solution would be to improve the prompt to better explain why we ask for it, and to allow skipping the authentication by giving an empty token (i.e. just hitting enter), which would also be explained in the message.
@stof I think you might be on the right track. It would by default leave some people with an odd mix of dist and source checkouts without an explanation. Could we carry that token through all additional requests during the run and/or expose this behavior as an option?
"It only asks for it when reaching the API rate limit"
Yes, that happens with every composer install of Drupal core, because in total there are more than 60 downloads (phpunit etc. from require-dev adds quite a lot by itself)
"and to allow to skip the authentication by giving an empty token (i.e. just hitting enter), which would also be explained in the message."
It does that already, but it prompts on every individual download, so you get into a loop: wait a few seconds for a package, see the same error message again, press enter, over and over.
For me I'd show a warning message but not actually interrupt the downloads. composer install takes long enough that I'm not usually sitting holding my breath in front of the terminal window for it to finish. So I'd rather it took longer and actually happened, than go back and see it got stuck half way through. If it's still running, there's the warning message explaining why it's taking so long.
well, once you give it a token, it won't ask anymore.
What we could do to improve things is remember that you declined to give a GitHub token, so that we don't ask again in the same composer run.
We do not want to force composer users to create a github account. The install process can throw warnings with bells and whistles and Github advertisements, but it should get the job done without prompting or bothering the user.
If you run in non-interactive mode (-n) it will switch from dist to source when hitting the limit without asking for a token I believe. Just a quick work-around until we change the flow.
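For reference, a rough sketch of that work-around, using the existing `--no-interaction` flag:

```sh
# Non-interactive run: Composer never prompts for a token and falls back
# to a source install (git clone) when the unauthenticated API limit is hit.
composer install --no-interaction
```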
I can see the problem.. having some error output in every run is however not nice either. I wonder if Drupal would have enough weight to have GitHub reconsider the 60/h limit without authentication (it is quite low tbh..).
Alternatively I suppose we could indeed remove the prompt even in interactive mode, but then we need even more output to tell people how to configure it if they do create a token.
@Seldaek I think it's worth a github issue - either to raise the limit or for them to actually put the zip requests through a CDN properly so people aren't hitting the API raw every time.
However "We do not want to force composer users to create a github account." 100% agreed with this - this is the reason I reverted the Drupal core commit until it was discussed a bit more, and relying on them to raise their API limit doesn't really help with that dependency.
At the moment, a Composer update of Drupal can easily hit the GitHub rate limit. I am using Docker Hub for automated builds. The rate limit applies per IP address, so more often than not the builds fail.
Using an interactive prompt doesn't help automated builds. A GitHub personal token only works if the automated build is NOT open sourced. If it is open sourced, GitHub will automatically revoke the publicly exposed personal token.
requests through a CDN
My thoughts exactly. But they probably don't want to do this, to avoid 1) big bandwidth costs, and 2) GitHub being used to distribute pirated content.
Using an interactive prompt doesn't help automated builds. A GitHub personal token only works if the automated build is NOT open sourced. If it is open sourced, GitHub will automatically revoke the publicly exposed personal token.
Well, it depends how you store the token. When using Travis, they have a concept of secure environment variables which means that the token won't be public.
And I'm not sure they would revoke a token without any scope.
Well, it depends how you store the token. When using Travis, they have a concept of secure environment variables which means that the token won't be public.
And I'm not sure they would revoke a token without any scope.
Using a secure environment variable for a public Travis build is broken. That is probably a Travis bug, or maybe it is expected behavior for open source builds on Travis.
Edit: I stand corrected per @bartfeenstra's correction. Thank you.
Gitlab CI has variables too, though I haven't looked into whether they have the level of "secure" that Travis does. As far as I can tell Docker Hub does not, but that's probably a limitation of docker build itself. You'll probably want the -n option @Seldaek mentioned if composer is detecting an interactive shell. None of the environment variable discussion is really pertinent to improving the workflow for users without an API key though, which is the core of the issue.
Secure variables work, but not in pull requests through forks. As a consequence, builds for any such PR will fail, because the token cannot be decrypted.
I think it's worth a github issue - either to raise the limit or for them to actually put the zip requests through a CDN properly so people aren't hitting the API raw every time.
@catch56 The reason for using the API for downloads is that GitHub guarantees the endpoint will not change; historically Composer used the download URLs from the website, but those have changed over time. If GitHub stopped rate limiting API download calls that'd really help.
@bartfeenstra it won't fail. It will fall back to the slower source install (cloning a git repo is slower than downloading a zip archive of the code at a specific commit, as it is much bigger).
What if we look at this from the other direction: The main issue with downloading sources for vendor dependencies is you're downloading a lot more data, since you get the full history even if you don't need it. What if we allowed for shallow clones, or even defaulted to that? Downloading the latest commit should be in the same neighborhood as downloading a static snapshot, and just one or two commits back is not going to add a great deal of cost. Certainly far less than the full history of a project like Drupal, PHPUnit, or Symfony HttpKernel. (Let's face it, it's not like having the code snapshot of Drupal 4.1 or PHPUnit 2 is at all useful to 99.999999% of people installing Drupal or Symfony today.)
That would make "skip this noise, just grab sources" a much cheaper and therefore more viable option.
@Crell yes we should look at https://github.com/composer/composer/issues/2833 (and maybe https://github.com/composer/composer/pull/4685 would help too on that front). Not sure if it's workable though to be honest as we might end up having to fetch every time.
@Seldaek #4685 _should_ be fairly safe as it only uses the cache for the initial clone. It does require Git 2.3, but it should be possible to replicate that behaviour for older Gits. https://github.com/gitster/git/commit/d35c8027937546e6b22a2f28123f731c84e3b380
For what it's worth OSX 10.11 is bundled with git 2.3.x these days, and it probably wouldn't be too much of a big deal to strongly recommend using 2.3 or newer for people wanting to work on Drupal core.
I wonder if Drupal would have enough weight to have GitHub reconsider the 60/h limit without authentication (it is quite low tbh..).
I have no idea. What would you like it to be? I have a call with Jono Bacon next week. Absolutely no promises, but I'll mention it to him.
@chx well the main point that would be nice is to get the ability to use the zipball URLs without API limit (or with a higher one.. the exact number probably depends on who is running composer), because really this isn't an API call per se.
Earlier @cs278 mentioned having used the non-rate-limited zips in the past. Could we consider using those, falling back on the API, then falling back on the source? That way if they move again things still "work" until the user updates to a release where composer has fixed the URLs.
If the API rate limit message also funnelled people to a useful wiki/help page then I think it could also clear things up for people. "Oh I suddenly saw this because github changed things. No big deal, I can update."
I came here to say what neclimdul said already. +1
That sounds good to me as well: try the static zips, then fall back to the API. @Seldaek would that be an acceptable solution for you too?
I guess we could do this, yes, as long as there's a fallback. It can probably be hooked in after FileDownloader:87; once we have $urls we could go through them and add the direct URLs before the API one if one is found. And for future proofing, in case one of these fails we should probably stop trying the direct URLs for the other packages, just in case GitHub removes support for them; otherwise composer would try everything with a timeout/error for every package.
Sounds like pretty robust logic to me. Good catch on the timeout bit, that probably would get annoying.
Out of curiosity: did anyone reach out to GitHub recently about the API limits? I would expect them, most certainly after the recent "we listen to the developers" communication, to consider either raising the API limit or exempting the zipball calls.
Looking at the https://api.github.com/rate_limit output they certainly have the tech in place - they're already splitting core and search calls into separate limits. From where we stand it seems trivial to also separate out the zipball calls. I think they would even prefer all those users downloading zipballs instead of entire deep clones, as it's going to be a lesser burden on their servers and bandwidth for sure.
For all we care they could change it from 60/hr to 500/day. Whilst that's a far lower limit per timespan, it would help the one-time install bursts immensely, and only developers need the more sustained limits - and those either have GitHub accounts or have no problem creating one.
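For anyone wanting to check where they currently stand, that endpoint can be queried directly:

```sh
# Unauthenticated query of the current rate limit status; the response shows
# the separate "core" and "search" buckets with limit/remaining/reset fields.
curl https://api.github.com/rate_limit
```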
The message should make it a lot clearer that you can continue without entering a key (and it should only prompt once per run). It could then also mention that --prefer-source can be used (or ask to change the composer.json config to prefer source?).
That should at least stop users from quitting directly.
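A sketch of both options; --prefer-source is the one-off flag mentioned above, and the preferred-install config key is the composer.json route (both existing Composer features, as far as I know):

```sh
# One-off: install everything from source (git clone) instead of dists
composer install --prefer-source

# Or persist it in the project's composer.json so every install/update prefers source
composer config preferred-install source
```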
I've read that @github already offered its help with moving to GitHub, so they would probably be open to listening to Drupal's problems; it's worth a try.
But I would think that Composer itself should have some weight by now, with the amount of downloads coming from Composer.
FYI, I was looking at other package installers, and at least Bower seems to use both shallow clones and the non-api downloads: https://github.com/bower/bower/blob/master/lib/core/resolvers/GitHubResolver.js
Sorry, I now see there is an existing Pull Request, with feedback from Github, about using the github.com downloads instead of the API; https://github.com/composer/composer/pull/4737
Should that be revisited in light of this issue?
I've created a PR to use the dist when possible; this should avoid prompts and be a lot faster when not signed in with GitHub (roughly as fast as with auth). It should also speed up your test builds a lot when using caching. Does that work for you guys?
https://github.com/composer/composer/pull/4944
@barryvdh Yes I think this would work nicely for Drupal 8. Nice one.
Meanwhile I've reached out to Github about this specific issue, mgmt summary:
After reading through your comments and the issue thread that you've linked, we'll take a closer look at this and get back to you after we've discussed it internally or if the team has any other questions about it.
So fingers crossed on that one.
That would be even better :) Although the proposed solution would still benefit large projects/CI builds, it would just take longer to reach the API limits. (Unless they fully skip the rate limit on downloads)
I don't know if it's better. I think we've identified several things to fix regardless.
The core issue is still that GitHub does not guarantee any other method will work, per the communication @alcohol had with them. So the rate limit remains the core thing to resolve to reach a fundamental solution.
The direct download is just an alternative to cloning. If it is removed, it will continue as before (cloning) and the link can be fixed in a new release.
I would be ok with the solution provided by @barryvdh in #4944 (based on quick PoC from @nicolas-grekas, kudos due) if @github cannot offer us a better alternative. I would like to wait though until we get more feedback from them because I do not feel particularly comfortable about solutions that are not officially supported. But in the end, I suppose anything that makes Composer more user-friendly is a plus.
@barryvdh does your PR include the functionality suggested regarding not re-trying the direct download url for consecutive packages if it fails on the first one?
@alcohol Can we discuss on the #4944 issue?
@Crell @chx While we're on the topic of Drupal's move to Composer, are there any more issues stopping you from fully adopting Composer, as mentioned in the d.org issue? Are the problems with the public vendor dir etc. all resolved, or is it just the downloads/tokens? (I saw the issue was not to be included in the 8.1 release)
Since I was asked: I find Composer in general unusably slow (I tried to file a few issues to this effect without much success; probably I explained my problems poorly) and I gave up on it, but my word carries no weight in Drupal any more and I am sure others will go ahead as soon as this issue is resolved. Sorry.
The aforementioned call didn't happen in the end either.
Well yeah, I read that in the (drupal) issues a few times, but okay.
So on my test system, downloading https://ftp.drupal.org/files/projects/drupal-8.2.x-dev.zip and running composer update takes about 1 minute, when I clear the cache. Running it the second time takes about 9 seconds, which isn't that bad right? If you use stable releases, you wouldn't have to download new versions very often, and after the first time, they're cached.
Deleting the vendor dir and running composer install from a fresh cache does indeed take 1m30 (cached 9 sec).
So I'm guessing your primary concern is the actual download time (not counting the cloning, which obviously takes a lot longer still), right?
FYI, using this plugin: https://github.com/hirak/prestissimo (composer global require hirak/prestissimo) to use parallel downloads, reduces the install time with fresh cache to < 10 sec for me.
So I'm guessing your primary concern is the actual download time (not counting the cloning, which obviously takes a lot longer still), right?
- If you, even accidentally, run composer in a shell which has xdebug set up you will get ... interesting numbers. This is somewhat understandable as we all know xdebug adds overhead to every function call and composer is a complex CLI app written in PHP and has a lot of function calls. This understanding doesn't help the problem though.
- Try adding https://github.com/amitaibu/og/ and then updating anything. This is an extremely popular module, underpinning one of the drupal.org subsites even. Now, even if the thing you try to update has nothing to do with og, you will still see a ton of fetching attempts, because most branches in there have no composer.json, so it presumes the fetch has failed. At least I think that's what happens; I only saw the fetching attempts when I tried to update another contrib and that's when I just gave up.
Re 1. If you accidentally the whole thing, we got you covered:
[screenshot of Composer's big yellow xdebug performance warning]
If you then decide to accidentally ignore big yellow warnings, then indeed you might experience a slowdown.
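If you want to verify whether that's what is happening on your machine, a quick check (assuming a standard PHP CLI setup):

```sh
# Shows whether the xdebug extension is loaded in the CLI that runs Composer
php -m | grep -i xdebug
```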
Oh. How come I haven't gotten that? I will investigate. Is that new?
To elaborate on 2, as far as I have seen, even if you want to update just one Drupal contrib module it will go out and check whether everything required in composer.json has an update. This very quickly gets seriously out of hand as core already has dozens and many sites have over a hundred contrib modules...
The warning is newish (a few months I'd say).
And re 2: that is only true if you add it as a VCS repo, which should not really be done unless you are temporarily working on something you patched, or similar use cases. For long term usage it's better to have something like Toran Proxy or Satis that does that stuff in the background and generates a composer repository that Composer can load very fast.
In the long run though, Drupal modules should be available directly as a composer repository; there is one right now for testing (AFAIK) at https://packagist.drupal-composer.org/packages/drupal/og - using that repo you shouldn't have issues with slow updates looking for all branches in the repo. See also http://drupal-composer.org/
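A rough sketch of wiring that repository into a project, assuming the test instance URL mentioned above (the "drupal" key name is arbitrary):

```sh
# Add the Drupal packagist test instance as a composer repository, so module
# metadata comes from one fast composer repo instead of many individual VCS repos.
composer config repositories.drupal composer https://packagist.drupal-composer.org
```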
You can update a single module with composer update foo/bar.
Usually you will want to add the --with-dependencies flag to also update the dependencies of the requested package. But other than that it will not touch or inspect the other packages in the project.
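For example:

```sh
# Update only the one package
composer update foo/bar

# Update that package plus its own dependencies; other locked packages stay put
composer update foo/bar --with-dependencies
```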
You can update a single module with composer update foo/bar.
Usually you will want to add the --with-dependencies flag to also update the dependencies of the requested package. But other than that it will not touch or inspect the other packages in the project.
@curry684 that's not entirely true. If the slowdown is caused by the loading of the VCS repo, this will still happen for a partial update (because composer cannot know what the repos contain until it loads them).
This is why using lots of VCS repos is not recommended (it is also much more painful than adding a single composer repo providing metadata for all Drupal modules)
See this https://gist.github.com/chx/e734569c89de06a512c7 . I started from a pristine 8.0.x branch of Drupal 8 (it doesn't matter whether it's 8.0.x, 8.1.x or 8.2.x) and added one somewhat complex module. The initial cache warming from Drupal Packagist took close to seven minutes (with one break, because even after several minutes there was no feedback on whether anything was happening or it was just waiting on some hung download). Then updating that single package with freshly warmed caches on an i5 2400 machine with SSDs (old it may be, but if you are on a laptop you are guaranteed to have a slower CPU) took 12 seconds. That's the "do nothing" baseline. Everything else will take longer.
My problem is that I compare to a git pull. Unfair? Might be but that's the alternative.
Well, it does seem to download files, even when I don't add --with-dependencies and am not using any VCS repos.
$ composer update symfony/process -vvv
[..]
Reading ./composer.lock
Loading composer repositories with package information
Downloading https://packagist.org/packages.json
Writing /home/user/.composer/cache/repo/https---packagist.org/packages.json into cache
Updating dependencies (including require-dev)
Downloading http://packagist.org/p/provider-2013%241e5638116dc94e6c7daadf10261204569279d78420396beb9d2aba8c717e112e.json
Writing /home/user/.composer/cache/repo/https---packagist.org/p-provider-2013.json into cache
Downloading http://packagist.org/p/provider-2014%247b5c14545807334668ef4ae72bf64be3c3fe7f665e5f39e6eae5bf7bb7146336.json
Writing /home/user/.composer/cache/repo/https---packagist.org/p-provider-2014.json into cache
[..]
Downloading http://packagist.org/p/phpunit/php-text-template%244062cc1894fca5d3f6d05de5b49368a3adf7381d41ed21fd274f1beb266e2450.json
Writing /home/user/.composer/cache/repo/https---packagist.org/provider-phpunit$php-text-template.json into cache
...
@stof yeah I was assuming the packagist.drupal-composer.org scenario in that answer, which they should use anyway at this scale.
@chx please include the --profile flag when providing logs about performance - it adds timing and memory information to every output line. Right now in your output I cannot see whether Composer is 'at fault' or the 2 Drupal scripts injected at the end. Also a minimum verbosity of -v is recommended for granularity.
➜ d8 git:(8.0.x) ✗ composer --profile update drupal/page_manager
[10.7MB/0.30s] Loading composer repositories with package information
[11.2MB/2.73s] Updating dependencies (including require-dev)
[267.6MB/34.22s] Nothing to install or update
[176.4MB/34.39s] Generating autoload files
[176.7MB/34.50s] > Drupal\Core\Composer\Composer::preAutoloadDump
[177.1MB/34.73s] > Drupal\Core\Composer\Composer::ensureHtaccess
[175.6MB/34.84s] Memory usage: 175.58MB (peak: 268.35MB), time: 34.84s
@chx And if you add -vvv you can see what actually gets downloaded/calculated there.
First run, on an "Intel(R) Xeon(R) CPU E5-2670 v2 @ 2.50GHz" (Ivy Bridge, 2 years old CPU):
/drupal-8.2.x-dev$ time composer --profile update
[3.7MB/0.13s] Loading composer repositories with package information
[4.0MB/0.24s] Updating dependencies (including require-dev)
[178.0MB/8.69s] Nothing to install or update
[115.5MB/8.74s] Generating autoload files
[115.7MB/8.82s] > Drupal\Core\Composer\Composer::preAutoloadDump
[116.0MB/8.99s] > Drupal\Core\Composer\Composer::ensureHtaccess
[115.0MB/8.99s] Memory usage: 114.96MB (peak: 178.41MB), time: 8.99s
real 0m9.107s
user 0m3.376s
sys 0m0.256s
Second run:
/drupal-8.2.x-dev$ time composer --profile update
[3.7MB/0.14s] Loading composer repositories with package information
[4.0MB/0.24s] Updating dependencies (including require-dev)
[178.1MB/2.80s] Nothing to install or update
[115.5MB/2.85s] Generating autoload files
[115.7MB/2.92s] > Drupal\Core\Composer\Composer::preAutoloadDump
[116.0MB/3.06s] > Drupal\Core\Composer\Composer::ensureHtaccess
[115.0MB/3.06s] Memory usage: 114.97MB (peak: 178.41MB), time: 3.06s
real 0m3.169s
user 0m2.964s
sys 0m0.104s
I really don't think 3 seconds is excessive with warmed caches. A git pull would likely take just as long to sync, and in exchange for that you get recursive dependency management.
I can't install page_manager from the stock provided repo though; which repository do I need for that?
@barryvdh sure, https://gist.github.com/chx/5e59d0b226bcd17dffd5 here it is with -vvv , this one took 18.34s to do nothing.
Can we please stop this discussion here? Performance is what it is. If it's not acceptable to you that's not gonna change the performance. Most people are ok with 18 seconds to know if there are new updates to their dependencies (that's not _nothing_ btw.). There is no discussion here really, and it's side-tracking the initial issue.
Errr @chx:
[11.1MB/0.32s] Downloading https://packagist.drupal-composer.org/packages.json
[11.1MB/1.44s] Writing /home/chx/.composer/cache/repo/https---packagist.drupal-composer.org/packages.json into cache
On my computer this 1.4KB file loads in 44ms.
[11.1MB/1.44s] Downloading https://packagist.org/packages.json
[11.1MB/2.56s] Writing /home/chx/.composer/cache/repo/https---packagist.org/packages.json into cache
This 1.4KB file loaded in 80ms. Your internet connection is consistently taking over a second to download files that fit in 3 ethernet frames. That is your true performance issue, on your end, not Composer.
But indeed, please get back on topic. Whether Composer is performant enough for Drupal is an issue to be discussed on the Drupal end. This issue should be about the issue with burst-loading packages from Github, and Github is communicating seriously with us about our issue there, no news to report yet though.
+1 for a resolution to this issue. Especially as more software projects use composer for dependency management this will become a more pressing issue. I'd like to point out too that this could cause serious confusion for users (not developers) of that software, say during an upgrade flow where there could be a lot of dependencies that hit the GitHub API limit and require the user to enter GitHub credentials. That could be a huge barrier for non-technical users.
I like the suggestion to use public URLs for zip downloads as a way to avoid hitting the API limit. Not only will that result in a faster experience when pulling a lot of dependencies, but it'll also avoid the need to authenticate to GitHub for common user flows (software install, upgrade).
@barryvdh if the rate limit issue is sorted it's not impossible that we'd still remove the vendor directory during the 8.1.x beta. I'd certainly remove it from 8.2.x once it is at least - and that branch is already open.
I'm not sure if the communication with GitHub is still active, but this comment from a GitHub employee: https://github.com/CocoaPods/CocoaPods/issues/4989#issuecomment-193801376 (posted in #4961 by @bojanz) seems to suggest that GitHub is happier when you download the archives instead of doing a git clone:
@mikemcquaid: It would help if you were using e.g. master.tar.gz tarballs as they can be more easily cached and served without hitting the Git layer every time. The problem from your side is that you'd need to do a ~60MB download every time so I can see this being undesirable.
A few comments later, he also confirms that this means the downloads from the website.
(Although @vmg just now replied that doing clones isn't actually really a problem for GitHub: https://github.com/CocoaPods/CocoaPods/issues/4989#issuecomment-193838934)
This would not be a strict improvement. If you use the tarballs that we offer for download, you will not have the Git metadata for the repository, so further fetches won't be possible. It'd be just as cheap to perform a full clone through Git -- GitHub has a special implementation on the server side that can make serving a full clone particularly cheap, as long as it's not a shallow clone. And obviously, you can continue fetching on top of the original clone.
But I would argue that this would be a negligible advantage in the case of CI builds, because a) they don't update and b) when doing multiple runs, the dists are cached, so it's only 1 download vs 7 clones. This also goes for users, for whom it's a trade-off (many small clones vs the same dists).
@barryvdh I've tried to follow this thread but I'm lacking context. I'd definitely advise you to always, unconditionally use e.g. https://github.com/composer/composer/archive/master.tar.gz over https://api.github.com/repos/composer/composer/tarball/master.
@mikemcquaid do you mind expanding on that? Historically I've been told by GH support that relying on github.com URLs is a bad idea as they are not guaranteed to remain as they are now (vs the API ones). But indeed using those URLs would be a lot better for us than relying on API as we sometimes hit the limit. BTW we have a support thread going on atm between @curry684 and Ivan Žužak from GH, in case you rather chime in there.
@mikemcquaid that directly contradicts what your colleagues have been telling us, that the API endpoints are the only ones guaranteed to keep working in the future, and that _all_ other links in the site may change without warning or prior notice. We've been in touch with @izuzak on the subject and he's currently taking this up with the tech team.
@curry684 I'm not speaking as a GitHub employee here but as a Homebrew maintainer. I would have given the same advice before I worked at GitHub, and it is the advice I gave to Homebrew developers who used those different endpoints. While the endpoints I've suggested are not guaranteed to continue working (@izuzak is completely correct there), in Homebrew's case they have been static for a very long time and therefore we rely on them very heavily, so I'd advise you do the same. That said, I've recently joined GitHub's API team so I'm going to start an internal discussion about the state of this guarantee.
@mikemcquaid I'd start with @izuzak as we've already discussed the internals with him. If you drop me an email I can forward you those mails so far if you want.
@curry684 Yup, we are talking :smile:. For now I'd say it's a question of whether you need those guarantees or whether other older and widely used package managers relying heavily on that URL format is sufficient for you to make that switch.
@curry684 Or another option that just occurred to me is that you could always fall back to the API endpoints if the other fail.
Switching on fail is not a resilient scenario for an application serving over a million packages per day. What counts as a fail? Is it just a 404? Or any 400-level code? What if GitHub replaces it with a new HTML page that returns 200?
We know there are various workarounds, the goal of us reaching out to Github is not needing them anymore, and Github on their end getting predictable traffic from us in exchange.
@mikemcquaid Short summary: We'd like to speed-up installs for non-authenticated users and CI runs by using archives instead of git clones, but run into rate limits for downloads via the API, when not using a token.
Basically what we want to hear from GitHub is one of the following:
(And I'm making the assumption here that Github isn't stupid and knows a lot of package managers use those link, so replacing them with a different result would seem unlikely).
Thanks for your feedback!
@barryvdh With my GitHub hat on: We're discussing this internally but I cannot make any promises as to the outcome or timescale on that.
With my "Homebrew maintainer for 7 years" hat on: Homebrew has relied on that URL format remaining the same since long before I was at GitHub, we are not experiencing the same problems you are experiencing and have no plans to start using the API for these downloads.
@mikemcquaid from Drupal's point of view, an increase in the API limit would allow us to improve our composer integration in time for the next minor release. We intend to have a release candidate out on April 6th, so ideally would need a week or two prior to be able to re-integrate the change that prompted opening this issue. I'm not sure the best place to request this though so please redirect me if mentioning here isn't enough.
Overall I agree using the download URLs is a much better solution though. It looks from comments elsewhere that composer's unlikely to make that change before the Drupal 8.1.0 release candidate, but either change happening would be enough for me.
It looks from comments elsewhere that composer's unlikely to make that change before the Drupal 8.1.0 release candidate
If Github is willing to make a solid statement on the reliability of the change I don't see why not. Essentially we have all the logic in place and it's a non-functional change.
@mikemcquaid from Drupal's point of view, an increase in the API limit would allow us to improve our composer integration in time for the next minor release. We intend to have a release candidate out on April 6th, so ideally would need a week or two prior to be able to re-integrate the change that prompted opening this issue. I'm not sure the best place to request this though so please redirect me if mentioning here isn't enough.
I believe an increase in rate limits has already been discussed with support so you could raise it again there. To perhaps save you the effort though: our unauthenticated rate limits exist to protect GitHub's availability so it's unlikely that it will be increased on a non-short-term basis.
If Github is willing to make a solid statement on the reliability of the change I don't see why not. Essentially we have all the logic in place and it's a non-functional change.
As I've mentioned before: GitHub does not provide any guarantees for the non-API endpoints currently. Homebrew relies on them but that predates my employment at GitHub so is not indicative of GitHub's position or guarantees.
Can we please stop repeating the discussion in a loop? All the cards are on the table, we will get it sorted.
@Seldaek My apologies. I'll unsubscribe from this thread.
@mikemcquaid it wasn't aimed at you specifically :) I'm grateful for your input here. I just think all we need now is to fix it, not pile up emails in everyone's inboxes.
Well it would seem problem solved without any changes needed here :smile:
Hi Niels and Jordi,
We've spent some time digging in, considering all the options to resolve your issues, weighing your needs against infrastructure strain and availability.
I'm happy to report that our Infrastructure team believes that due to their work on our Git backend since these APIs were introduced, we're now able to drop these rate limits on the API side. We deployed those changes a couple of hours ago. Getting an archive link [1] via the API will no longer count against your hourly rate limit (authenticated or not). This should make Composer installs happy.
Let us know if you see any funny business.
Cheers,
Wynn Netherland
Platform Engineering Manager, GitHub
Cheers @pengwynn!
<3
Awesome. Thanks @pengwynn @mikemcquaid @izuzak and the rest of Github for looking into this :)
Pinging @catch56 specifically btw on this.
:+1: :+1:
Awesome. Thanks @pengwynn @mikemcquaid @izuzak and the rest of Github for looking into this :)
@barryvdh Our pleasure. Thanks for the kind words.
Best news of the day!
I've re-started the Drupal issue since that unblocks things our end, thanks.
@mikemcquaid with the API restrictions removed would you actually recommend using these over the files now? For example I'd assume the file downloads go via your CDN etc.
@catch56 They both end up at the same endpoint currently but going through the API means that any changes are subject to GitHub's usual API versioning principles (https://developer.github.com/v3/versions/#v3).
Closing this then, thanks GitHub folks, good we didn't have to add any hacks :)
Is there going to be an official announcement from github?
We can all check it's been deployed but docs haven't been changed yet: https://developer.github.com/v3/#rate-limiting
I know updating docs and drafting announcements takes time, so I'm very glad the good folks at GitHub deployed it before those things are ready so we can use it ASAP, but it seems premature to _rely_ on it before anything official comes out.
(edit) Also, unlimited and a very high limit look the same but aren't. Again, not complaining, just curious about the details.
but it seems premature to rely on it before anything official comes out.
It's been merged to master and deployed to production so you can rely on it now.
Thanks @pengwynn @mikemcquaid @izuzak and the rest of Github for addressing!
https://travis-ci.org/api-platform/core/jobs/115386831#L263
:disappointed:
Yay!
Very Nice
I would like to think that this is true, but it certainly doesn't seem like the API limit has been completely removed, as API calls from Travis builds still fail: https://github.com/composer/composer/issues/1314#issuecomment-199457561
Either "unlimited" is actually "a very high limit that Travis will still hit", or Composer is calling some other limited API.
I've also seen it happen on a few occasions still. @mikemcquaid any idea why this would still be happening?
@pengwynn is the best person to ask here (I'm on vacation).
Or you'll probably get the quickest response from GitHub support.
Did anyone create an issue already? Seeing this with Travis more often, eg. https://travis-ci.org/laravel/framework/jobs/119219747#L157
@barryvdh Same issue here: https://travis-ci.org/sonata-project/exporter/jobs/138336530#L445
cc @MikeMcQuaid @pengwynn @Seldaek
@Soullivaneuh et al.: It appears that while we stopped metering this endpoint a while back, there were rate limit checks upstream that failed fast if you were over the limit from other API usage. We just deployed a change that should fix that up. Get in touch at API support if you see any continued funny business.
How is this possible, that I hit this in November? :o
Reading ./composer.lock
Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
Reading ./composer.lock
Resolving dependencies through SAT
Dependency resolution completed in 0.006 seconds
Analyzed 189 packages to resolve dependencies
Analyzed 528 rules to resolve dependencies
- Removing asika/pdf2text (1.0.1)
- Installing asika/pdf2text (1.0.5)
Downloading https://api.github.com/repos/asika32764/php-pdf-2-text/zipball/a14ea95695a277e385dbc03caeddb91c5e10319f
Downloading: Connecting...
Failed: [Composer\Downloader\TransportException] 401: Could not authenticate against github.com
[Composer\Downloader\TransportException]
Could not authenticate against github.com
Exception trace:
() at phar:///usr/local/bin/composer.phar/src/Composer/Util/RemoteFilesystem.php:580
It's a machine with a fixed IP. Strangely, I can wget this file without problems?
Because you use private repositories maybe? Or just VCS repositories defined in composer.json overall.. because that's not covered.
Nope, none of this is used in this project. But it sometimes works and sometimes doesn't, so maybe it's something API-limit related.