Discussions should happen on this issue. Please link other issues with similar asks to this one.
July 2018 - Announced feature
December 2018 - Blog
It appears that the current specification document is incorrect regarding how version ranges work with direct dependencies. It says that given the following package reference:
<PackageReference Include="My.Sample.Lib" Version="[4.0.0, 5.0.0]"/>
That it would pick the highest version, 5.0.0. This is not what I observe. Instead, that version range would resolve to the lowest version, 4.0.0.
Floating version numbers appear to be the only case where any sort of highest version logic is applied.
I also don't see anything indicating that highest would be expected in any of the documentation I've found:
https://docs.microsoft.com/en-us/nuget/consume-packages/dependency-resolution
https://docs.microsoft.com/en-us/nuget/consume-packages/package-references-in-project-files
Thanks @bording for reporting the discrepancy. This was an oversight from my end. I have corrected the same in the spec.
If a range is specified, NuGet resolves to the lowest available version within that range.
E.g.
Feed has only these versions for My.Sample.Lib: 4.0.0, 4.6.0, 5.0.0
a. Range is specified:
<PackageReference Include="My.Sample.Lib" Version="[4.0.0, 5.0.0]"/>
NuGet resolves to 4.0.0 here.
b. Range is specified (continued):
<PackageReference Include="My.Sample.Lib" Version="[4.1.0, 5.0.0]"/>
NuGet resolves to 4.6.0 here, since that is the lowest available version within the range.
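For contrast, a floating version is the one case where the highest matching version wins (as noted earlier in this thread). An illustrative sketch against the same hypothetical feed (4.0.0, 4.6.0, 5.0.0):
<PackageReference Include="My.Sample.Lib" Version="4.*"/>
Here NuGet would resolve to 4.6.0, the highest available version matching the 4.* pattern.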
What happens if the NuGet.config file is on one machine but not on another, or is changed after the lock file is created - does the lock file still take effect?
IMO, the sources definition should not matter for a lock file. Irrespective of which source the packages are coming from, it should ultimately be the same package that goes into the build.
To verify whether it's the same package or not, I am thinking about putting the hash of the package in the lock file. Open to other ideas.
@anangaur Sorry, didn't explain myself very well :-) What happens in the following situation: -
What happens to the lock file? Does it still get used or is it discarded and deleted?
@isaacabraham Haven't put much thought on it but I am leaning towards generating the file by default and not as an option.
Let me spend some time on it and then I will come back with the proposal.
Why not just have a command-line option to create or consume a project-wide project.assets.json file at restore time?
Something like:
nuget restore path\to\something.sln -AssetsOut something.assets.json
and later (say on a build server)
nuget restore \path\to\something.sln --AssetsIn something.assets.json
It would be up to the consumer to regenerate/edit the assets file and check it into source control and to configure their builds to use it.
This would seem to allow the requested functionality with the minimum amount of changes....
The assets file is not just the restore graph but has multiple overloaded functions and a lot of content irrelevant to a typical lock file. For example, it contains all the warnings/errors applicable to the project restore. In addition to this, the assets file is not easy to understand or modify. (IMO even the lock file should not be manually edited.) I have seen different assets files getting generated for the same dependencies (the lines get re-ordered), and hence it will be difficult to manage the merge conflicts. The assets file in its current form will not be able to handle floating versions.
However, the current idea is very similar to generating a lock file by refactoring out the relevant content from the assets file that gets generated in the obj directory. I would rather solve this issue using a lock file mechanism than re-use the assets file and then live to support the backward compatibility in the long run :)
True, just having a packages.config as input with specific, locked versions would be enough; having the ability to generate that file on restore would go a long way. packages.config is documented and presumably will be supported for the foreseeable future.
I would really like to have this feature in the NuGet client. Are there any updates (or perhaps even a roadmap) you can share?
This would be a great feature for nuget to support CI/CD workflows. Is there any ETA on when this feature will be available?
@forki @isaacabraham Wanted to bring the conversation here :) The current proposal is to have the lock file at project level, with the intention to lock the transitive dependencies per project. However, I have heard about the problems it could have at runtime and I am contemplating bringing it to solution level.
Currently NuGet restore does a project-level restore - maybe that has to change if we want to bring the lock file to solution level? @jainaashish
@anangaur hi :-)
OK. I'll try to (briefly) outline some of the reasons that come to mind why I think locking dependencies at either project or solution level will be a mistake. I'm sure @forki will have his own thoughts as well.
If you lock at project level, at runtime, you'll be in the same precarious position you are now vis-a-vis dependency management i.e. you won't be able to guarantee that Project A and Project B both depend on the same version of the same dependency. At runtime, things might work, they might not.
If you lock at solution level, you'll still probably run into problems. I know many people that have large code bases with multiple solutions that share projects in the same repository. 99/100 times, they'll want those solutions to have the same exact dependencies for consistency and expected behaviour. They won't want to open one solution and get one set of versions, then open another solution and get another set for the same project.
Alternatively, consider the situation where you e.g. have two solutions, one for your "main" codebase and another for integration and unit tests (which I have seen before). Do you really want to maintain a separate dependency chain for both of them? What happens when they get out of sync?
Ask yourself this - how often in a repository do you explicitly want different versions of the same dependency? I suppose I would say this (because of my involvement with Paket), but decoupling yourself from projects and solutions will free you from all of these issues. Instead, consider pinning dependencies at the folder level (typically repository level). You then get consistency across all of your projects and solutions and scripts, because there's only one dependency graph across the whole repo.
For those cases where you need to go "outside the box" and must have different dependency graphs within a repo, consider something like Paket's Groups feature, which allows you to explicitly have distinct dependency chains (with their own section in the lock file). However, this is the exception to the rule - it doesn't happen very often.
Just my two cents - you may well feel differently, and it's entirely your choice how you proceed. Good luck! :-)
I agree with @isaacabraham. I know of code bases for a single application split into multiple solutions with some of these solutions even overlapping, so locking dependencies neither at project level nor at solution level works in such scenarios.
Having a lock file at repository level (or an arbitrary file system subtree) sounds sensible to me.
@isaacabraham Thanks for the detailed reply. I am fully in agreement with what you mentioned above. This is something required as part of the following:
https://github.com/NuGet/Home/wiki/Manage-allowed-packages-for-a-solution-%28or-globally%29
This issue is primarily to lock down the transitive dependencies (at install time) so that restores are repeatable across space and time. I do see that both of these work items are related, but I am trying to segregate them and attack one problem at a time.
Right now the proposal is to have NuGet generate the lock file at the time of install at project level as that is how NuGet restore works but I have been hearing a lot of voices to make the lock file generated at solution level. From twitter and from Paket experience, I understand that's your recommendation too.
@anangaur Hi. Sorry - but no, that's not my recommendation really. My comment just above states:
decoupling yourself from projects and solutions will free you from all of these issues. Instead, consider pinning dependencies at the folder level (typically repository level). You then get consistency across all of your projects and solutions and scripts, because there's only one dependency graph across the whole repo.
So I hope I'm clear here - working at either project or solution level will not, in my opinion, provide a satisfactory solution that is simple to understand, consistent, and repeatable.
Again, though, I'll repeat: That's my opinion. It's entirely up to you how you proceed.
Sure. Thanks for the explanation. Will keep your recommendation in mind while I iterate over it. I will update the thread as we make progress. Appreciate your input here.
my recommendation:
I want to pile on with "please don't introduce any functionality at solution level". A single solution typically represents a very small part of a bigger product that wants to harmonize the DLL versions. Solutions should remain a superficial "IDE convenience" feature instead of getting critical new responsibilities.
@anangaur after reading through the recent wiki updates: I would discourage using .lock as a file extension for the planned feature. A .lock file carries the meaning of a semaphore file on *nix, or a marker for things not to delete. That's why there's packages-lock.json and friends, or .cache etc. - to try to avoid using .lock as a file extension for non-"traditional" .lock meanings. (I've also seen people use yarn but then not add yarn.lock to their source control because they thought it was a temporary file, or because of existing gitignore entries.)
*.lock is VERY commonly used by package managers. Many use that extension.
I'll throw another angle in this discussion.
One of my clients is a financial institution and due to legislative requirements they must have repeatable builds at any given point of time.
With packages.config it was achieved by committing packages into the repository; however, the new model employed by .NET Core, which uses a shared cache, has become a significant issue for those guys...
@RussKie How is the centralized cache a problem? Why do you need to commit packages into a repository? That should be NuGet's job to bring in the same packages every time you restore irrespective of when and where you restore the packages.
In enterprises, a lot of systems are either behind proxies or do not have access to the internet at all.
A developer adds a NuGet package to a project on a local machine. A build server does not have a connection to the internet and thus is unable to restore the package. Epic fail.
Another developer with a different level of internet access clones a project to his/her workstation. Due to proxy permissions the developer is unable to connect to the source Nuget gallery (e.g. nuget.org) and restore packages. Epic fail.
For CI builds, if we use UpdateLockFileOnRestore: deny, will that fail if some packages have been added to the repository and our projects are set to pick up the latest package versions?
I'd hope not. It should only fail if it would have no way to meet the PackageReference conditions with what is in the lock file.
The lock file will have the exact versions of the packages locked, and hence if it cannot find the locked versions of the packages, it will fail to update the lock file since you chose the option 'deny'.
If you want the resolution to change then why set the option to deny?
Sorry, I probably didn't make the scenario clear. It's a pretty standard CI build, and I want to be sure that if the default resolution would be different to what is in the lock file, it won't error or even warn.
The spec suggested that maybe if the resolution was different it would fail with deny, but it's not clear if that's just if the lock file package resolution is not possible or if the resolution has changed for a different reason like a new package version being available.
Thanks for explaining the scenario. I guess we are on the same page then. Once you have a lock file for a project, the only check we do is if there was a change in requested versions from the time lock file was generated. If not, the lock file is just used to get the packages.
So if you had stated the following:
<PackageReference Include="Contoso.Utility.UsefulStuff" Version="*" />
And the lock file resolved to 2.0.0 (the latest version at the time). On the CI machine, when you trigger the build, if you didn't change the PackageReference, restore will just resolve to version 2.0.0 as mentioned in the lock file, even if a newer version (say 2.1.0) is available. There won't be any warning or error raised here with UpdateLockFileOnRestore: deny or with UpdateLockFileOnRestore: warn.
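As a rough illustration (not actual generated output), the resulting lock file entry for such a directly referenced, floated package would pin the resolved version alongside the requested range, similar in shape to the transitive entries quoted later in this thread; the exact field set shown here is an assumption:
"Contoso.Utility.UsefulStuff": {
  "type": "Direct",
  "requested": "*",
  "resolved": "2.0.0",
  "contentHash": "<hash of the 2.0.0 package>"
}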
Change in spec:
The option UpdateLockFileOnRestore / --update-lockfile has changed to RestoreLockedMode / --locked-mode. It is set to false by default.
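For reference, locked mode can be enabled per invocation with the commands that appear later in this thread, or as an MSBuild property in the project file (the PropertyGroup placement is an assumption based on standard MSBuild conventions):
dotnet restore --locked-mode
msbuild.exe /t:restore /p:RestoreLockedMode=true
<PropertyGroup>
  <RestoreLockedMode>true</RestoreLockedMode>
</PropertyGroup>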
Hi
Is there any ETA on this? Wasn't this originally planned for Spring 2018?
I'd appreciate it if some information can be provided about the estimated completion time for this.
regards,
Nas
Also, what impact will this have on Semantic Versioning and GitFlow?
Also, what impact will this have on Semantic Versioning and GitFlow?
@NasAmin Apologies for the late reply. I am not sure I understood your comment. Can you please elaborate?
The per project lock file feature would be available with VS 15.9 preview 5. I am currently working on the docs. The PR is here: https://github.com/NuGet/docs.microsoft.com-nuget/pull/1119
This is going in the right direction. Really looking forward to the lock file per solution.
Really looking forward to the lock file per solution.
@OsirisTerje Yepp. Remember our interaction on this one. I have a draft spec out here: https://github.com/NuGet/Home/wiki/Centrally-managing-NuGet-packages
Would love any early feedback.
PackageReferences with LockFile shipped in NuGet 4.9.0
Ship Vehicles:
I have a problem with this feature. When running on my dev environment (Windows), the contentHash of some packages is different from the one on my build environment (LXSS). For example, on Windows:
"System.Data.Common": {
"type": "Transitive",
"resolved": "4.3.0",
"contentHash": "lm6E3T5u7BOuEH0u18JpbJHxBfOJPuCyl4Kg1RH10ktYLp5uEEE1xKrHW56/We4SnZpGAuCc9N0MJpSDhTHZGQ==",
"dependencies": {
"System.Collections": "4.3.0",
"System.Globalization": "4.3.0",
"System.IO": "4.3.0",
"System.Resources.ResourceManager": "4.3.0",
"System.Runtime": "4.3.0",
"System.Runtime.Extensions": "4.3.0",
"System.Text.RegularExpressions": "4.3.0",
"System.Threading.Tasks": "4.3.0"
}
},
But on Ubuntu (LXSS):
"System.Data.Common": {
"type": "Transitive",
"resolved": "4.3.0",
"contentHash": "OGX4ifHI67s/2KRrXGMmEjmszCptbhWS4BieBdiBPl/8pn0A1xPMx0Gn0dKK0zKtEaelYNh5M7M5lCZL0EA4eA==",
"dependencies": {
"System.Collections": "4.3.0",
"System.Globalization": "4.3.0",
"System.IO": "4.3.0",
"System.Resources.ResourceManager": "4.3.0",
"System.Runtime": "4.3.0",
"System.Runtime.Extensions": "4.3.0",
"System.Text.RegularExpressions": "4.3.0",
"System.Threading.Tasks": "4.3.0"
}
},
So of course, trying to restore on the build environment fails, and if I delete the lock file on the build environment, it gets recreated with different contentHash values for some of the packages.
Is that a bug?
This is something that we noticed a long, long time ago, way before NuGet was announcing lock files. That's why Paket doesn't have hashes in its lock file.
Hope it gets fixed server side soon. /cc @isaac_abraham
if I delete the lock file on the build environment, it gets recreated with different contentHash values for some of the packages.
@Flavien This is by design. Lock files are auto-recreated if the RestoreWithLockFile property is set and the lock file is out of sync or not present. You can fail the build if the lock file needs to be updated by setting the property RestoreLockedMode to true.
@forki , @Flavien
Content hash being different on different OSes seems weird to me. We need to investigate this. In the meantime, we may need an option to disable content hash checking. Thanks for reporting this.
/cc: @rrelyea @jainaashish
Even I don't follow why the contentHash is different for the same package across different OSes. Is this a ZipArchive thing? Or the NuGet feed?
For clarification: in Paket we tried to verify the hash that the server sends. But that one is differing (in some cases - not all the time) from what we calculate locally. So after thinking about it: our issue is probably not directly related.
@jainaashish The package in question is System.Data.Common. Something to do with Fallback folder inconsistency?
Diffing my lock files between Windows and Ubuntu, the other packages that end up with a different hash:
ContentHash will only differ when there are actually multiple copies of this package across these systems, which could be because of different NuGet feeds configured across these systems, or different copies in the global packages folder (%userprofile%/.nuget/packages) or the dotnet fallback folder (~dotnet/sdk/NuGetFallbackFolder).
@Flavien if you delete the FallbackFolder on both the dev machine and the build machine, restore should work fine, as the contentHash would then be for the package from nuget.org, which should be the same. Can you try once?
@anangaur I tried and that fixed the issue.
Thanks @Flavien for validating the hypothesis. We still need to fix this issue: #7651
Will the content hash also be different between a W10 dev machine and a W7 build machine? I already cleared all nuget caches and removed that FallbackFolder everywhere but it still fails with hash mismatch.
What I don't understand is how can the same package with the same version number end up with two different hashes? To me, that is the root cause of the bug.
This is to do with the dotnet LZMA tool, which generates a different nupkg for the same package (id/version) in the fallback folder, and that gives a different hash for the same package - one coming from the fallback folder and another from nuget.org. You can find more details in this issue: https://github.com/NuGet/Home/issues/7414#issuecomment-431997934
Update: I am mistaken in the below suggestion. Please ignore it.
@Flavien , @springy76
~Just another pointer - You do not need to delete the packages, instead just point NuGet to not look at fallback folder. See details here: https://docs.microsoft.com/en-us/nuget/release-notes/nuget-4.9-rtm#packages-in-fallbackfolders-installed-by-net-core-sdk-are-custom-installed-and-fail-signature-validation---7414~
I inserted <RestoreAdditionalProjectSources/> into autoexec.bat but it did not work -- seriously: if someone writes an XML snippet into documentation then PLEASE give a hint where to place it (file -> parent node/hierarchy).
@anangaur RestoreAdditionalProjectSources has nothing to do with fallback folders. Not sure where you got this idea? If you really want to suggest skipping fallback folders, then RestoreFallbackFolders is the right property. More details at https://github.com/NuGet/Home/wiki/%5BSpec%5D-NuGet-settings-in-MSBuild
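For anyone who wants to try skipping the fallback folder, a minimal sketch would be to set that property in the project file; the 'clear' value is an assumption based on the MSBuild restore settings spec linked above:
<PropertyGroup>
  <RestoreFallbackFolders>clear</RestoreFallbackFolders>
</PropertyGroup>
The same property can presumably also be passed on the command line, e.g. msbuild /t:restore /p:RestoreFallbackFolders=clear.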
@springy76 My apologies. My earlier suggestion was misplaced. I guess, for now, you can try the deletion or what @jainaashish suggested. We will investigate the issue and get back.
@springy76 , @Flavien Can you tell us more details like SDK versions (and repro steps)?
EDIT: Damn, I wish I'd have seen this issue sooner. The first I saw of this was the blog post on Nuget.org. It's great you guys are so much more transparent about things, but I have a real bone to pick with y'all on the Nuget team.
Omg. What's the point of checking in package.json.lock if you're not going to use it by default! We need packages to be locked down at the time the developer picks a package via install or updates via update.
Do you guys ever check out npm? They did this several years ago via npm shrinkwrap and enhanced it over a year ago to always write out a lock file to fix the same scenario... repeatable builds... even repeatable on developers' machines. They use the package.lock when doing npm restore, and they even created an npm ci command that bypasses all package dependency resolution in favor of just restoring the exact package dependency tree.
It feels like the Nuget team is so opinionated about things that keep being proven wrong again and again. I've had my clashes with various members in the past about issues like this, and I'm continually overruled, then vindicated several months if not a year or so later on these issues.
I realize Nuget has some significant challenges especially with the multitude of platforms and whatnot. But this particular feature is a no-brainer guys...
Should be OPT-OUT if not desired and I can't imagine any scenario where that is legitimately the case.
Now it's baked in as OPT-IN, so everybody keeps going forward slamming their heads into their desks while the build server starts installing unwanted new versions of DLLs that are different from the developers' machines...
Here's how its gonna go:
Developer: "Works on my machine!"
Manager: "It sure does... damn... what the heck???"
Developer 2: "Did you set that option in Nuget to only restore the exactly whats in package.lock file so that new versions of some of the transient dependencies dont get installed?"
Developer: "...Wait... what? I thought having the lock file would do that... I already had to set something in the CSPROJ file to turn this on."
And to be clear:
Package lock files should only be generated at INSTALL or UPDATE time. When a restore happens and a package lock doesn't exist (to solve the issue of old libraries), then CREATE IT. If this happens on the build server, there's nothing you can do about it. If this happens on a DEV machine, it shows up as a new file. (Except on TFS where you HAVE to remember to promote the thing... another id10tic feature.) You should have to OPT-OUT of this behavior.
Secondly, I recommend, like npm restore, to only install exactly what's in the package lock file. The npm ci command was created to make this process even faster by not evaluating any dependencies: it simply deletes node_modules and restores EXACTLY what's in package.lock. For nuget, the restore command should ALWAYS respect the package.lock file, so that if it runs on the build system and can't restore EXACTLY what's in packages.json.lock - because a transient dependency's exact version went unlisted or similar (hash fail maybe?) - it should fail the build, so developers can come back to their developer machine and effectively evaluate the situation themselves by running nuget restore, which precisely matches the packages up to the packages.lock file. What I'm saying is... the same thing should happen on my developer machine as on the build server... otherwise it's nondeterministic even at checkin. A transitive dependency can update its version literally one second after checkin, and bam, the build server is pulling a newer version. And you've made this OPT-IN, so that developer may or may not end up replicating the build server package tree. So again, what's the point?
Maybe there's more to it, but I can't imagine it... that's 99% of the developer-intended workflow regarding a package manager in relation to the build/release phase, and it SHOULD utilize LOCKED package modes. You made this opt-in instead of opt-out, so the build output of a lot of projects will continue to be nondeterministic as time moves on, which is just silly.
TL;DR: RESTORE should ALWAYS respect package.lock - that's why it exists. INSTALL and UPDATE should always WRITE and UPDATE the dependency versions within packages.lock if needed, and any changes get exposed in the VCS checkin. If you didn't intend for that to happen... DON'T CHECK IN THE PACKAGES.LOCK file! Very simple. You guys have made it mind-numbingly irritating by now having to double-check a csproj setting upon install/update/restore to even write the file, AND you make it so we have to now go into every build definition and double-check that this feature is turned on.
@ericnewton76 check out https://fsprojects.github.io/Paket/ - .NET has this for years.
@ericnewton76 Thanks for your detailed feedback. Let me try to answer your queries (as I understand them) one by one and we can delve into specifics, as required.
We did not want to suddenly start producing a new file with an incremental release i.e. as part of 4.x. In addition, the lock file feature is not complete unless we solve the problem of centrally managing package versions and thereby having a central lock file per repo/solution. Once done, we will evaluate making the lock file the default.
Btw, the presence of the lock file is enough to opt in to this feature. The property RestoreWithLockFile is only to bootstrap into this feature. See the note section here for more details.
Install and Update should write to the lock file and Restore should always restore from the lock file
Today there are various ways to install and update packages - not just from various tools but also by directly writing the PackageReference node into the csproj file:
Directly writing the PackageReference node into the csproj file is so easy and has become very common. The IntelliSense in VS and VS Code also assists you today in directly editing the csproj file to include a PackageReference node. Once written into the csproj, the package is installed using restore. This might sound confusing, but NuGet in the PackageReference world does not really have an install. Everything is a restore.
The tools, likewise, only write the PackageReference node into the csproj file (there are some evaluations before writing into the project file, though), and then restore actually brings the package into the project's context (the so-called install).
To summarize, there were 2 options:
Option 1: Continue with the current NuGet model of installs and restores and introduce this feature in the most non-breaking way possible.
Option 2: Rework the notions of installs vs. restores and break the current way of adding a package by directly writing the PackageReference node into the project file, as the restore immediately after such writes/edits of PackageReferences would fail.
We chose Option 1. One can get the Option 2 behavior by setting the locked mode to true. Once set, restore will fail if it cannot get the exact packages mentioned in the lock file. I feel this is what you want, but as the default?
nuget ci similar to npm ci
This seems like a good idea. We can definitely evaluate this option if you and others feel this would be useful and if this helps improve the performance of restore (theoretically this has promise, but we would need to run a few experiments to understand the quantum of gain).
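In the meantime, a rough sketch of the closest approximation to an npm ci-style flow with what has shipped (not an official equivalent) would be a CI step that restores in locked mode and then builds without re-restoring:
dotnet restore --locked-mode
dotnet build --no-restore
This fails fast if the lock file no longer matches the project, rather than silently resolving new versions.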
Do let me know if this answers your concerns? Happy to discuss more and/or delve deeper into any of the topics elaborated above.
Thanks for the reply. Sorry if my tone seemed too adversarial. I can be passionate about this stuff and it's hard to guard my words sometimes.
We did not want to suddenly start producing a new file with an incremental release i.e. as part of 4.x. In addition, the lock file feature is not complete unless we solve the problem of centrally managing package versions
Granted, npm formally brought on the package-lock.json by moving to npm v5... read their release notes and the releases after v5.0.0. It should be a very interesting read for another package manager in a slightly different ecosystem.
However, if you introduce this lock file in v4.8, then what's the difference really? It seems like MS in general is afraid to kick the major version up due to marketing concerns or something else instead of technology needs. I won't say that's what the Nuget team does, but it seems to be the norm. Nothing wrong with Nuget v10... LOL. Anybody that complains just doesn't understand how software is built then.
Directly writing PackageReference node into the csproj file is so easy and has become very common
That's fine... that'll happen on the developer's machine, and the RESTORE command should complain like a stuck pig. At least squeal with a warning message that says a new PackageReference was found and should be installed properly to get a proper dependency graph analysis, and then go ahead and basically do an install. When nuget restore --ci is running and the same occurs, then fail the build. Call a time out. And say "this is for your own good," and point them right at these messages as to why failing the build is necessary when a package lock file exists and there are packages being installed in a build-server scenario that shouldn't be there. Again, complainers about this don't understand the problem until they smack into it when a transitive dependency floats on them and breaks the runtime behavior somehow.
Problem is:
Option 1 is still non-deterministic builds.
Option 2 is better but still requires that specific opt-in which should be the default, and thus non-deterministic builds.
The goal is a deterministic build. Both of your current options don't solve that... they just add more configuration switches to underlying mechanisms that don't help you achieve a stable build. And when the feature releases formally, now you have this extra cruft that has to be supported ad infinitum to preserve that exact behavior when it might not be true later.
Probably a difference of opinion here... you guys are trying to go for the least disruptive change for something that will 100% make their lives better, but by not jumping in feet first, you're equivocating on a feature that is a must-have. In addition, did you try this out in-house first? Did you guys scratch your heads saying "omg! this would be bad for it to precisely match up my dependencies! and when it doesn't match, it's notifying me that my hand-edited package reference, accidentally checked in due to TFS lunacy, is warning me that something is awry!" I have a feeling it was the opposite. Please note I'm trying to be humorous here, to keep this deep subject in the realm of amusement.
You have to honestly ask yourself: would a developer who's tested precise versions of packages for days, possibly weeks, on his own machine, be okay with an algorithm making decisions for him about inaccurate package versions by library developers that will probably introduce inadvertent breaking changes into his runtime when he presses that build button to release to production on a Thursday night at 10pm?
nuget restore --ci or nuget ci
Just check out npm ci and some of the release notes about it. Makes perfect sense, and has legitimately locked down that crazy world of javascript package management! Quite amazing if you ask me... considering how fast things move over there too.
This didn't get documented in the release notes for 4.9.0 because it was mistakenly in the 5.0 milestone.
(working to fix)
For now, setting to 4.9.3 - will fix release notes in 4.9.0 when we ship release notes for 4.9.3 and then reset this back to 4.9.0
Will be closing this issue. Please spin off any follow up discussions in other issues.
Wasn't the problem referenced above the same problem as this?
Are there plans for a nuget restore --locked-mode? Without this it seems like it's impossible to use the lock file on Azure Pipelines for framework/non-dotnetcore projects.
@fowl2 Can you try msbuild instead:
msbuild.exe /t:restore /p:RestoreLockedMode=true
I'm still struggling with this error:
error NU1403: The package System.Collections.Specialized.4.3.0 sha512 validation failed. The package is different than the last restore.
(happens with any package randomly)
I'm using NuGet 5.0.0.6 and have cleared my caches and fallback folders numerous times. I am still getting this error on my CI build no matter what I try.
Any ideas what's wrong?
Can you specify the exact steps? Or maybe provide a repro?
It would also be good to know the sources you are using. One of the reasons could be that the sources have different packages (with different SHA) and depending on which source was used to restore, you might see failures.
Here is a repro: https://github.com/Flavien/nuget-lockfile-repro.
I have generated the lock file by building the project on Visual Studio (Windows) 16.0.0.0 Preview 5.0. NuGet version is 5.0.0.
When I clone this on my Ubuntu WSL and run dotnet restore --locked-mode
, I get this:
flavien@LAPTOP-FLAVIEN:~/NuGet/NuGetLockFile$ dotnet restore --locked-mode
Restore completed in 163.45 ms for /home/flavien/NuGet/NuGetLockFile/NuGetLockFile.csproj.
/home/flavien/NuGet/NuGetLockFile/NuGetLockFile.csproj : error NU1403: The package Microsoft.AspNetCore.2.2.0 sha512 validation failed. The package is different than the last restore.
Restore failed in 5.25 sec for /home/flavien/NuGet/NuGetLockFile/NuGetLockFile.csproj.
Here is dotnet --info on my WSL install:
.NET Core SDK (reflecting any global.json):
Version: 2.2.202
Commit: 8a7ff6789d
Runtime Environment:
OS Name: ubuntu
OS Version: 18.04
OS Platform: Linux
RID: ubuntu.18.04-x64
Base Path: /usr/share/dotnet/sdk/2.2.202/
Host (useful for support):
Version: 2.2.3
Commit: 6b8ad509b6
.NET Core SDKs installed:
2.2.202 [/usr/share/dotnet/sdk]
.NET Core runtimes installed:
Microsoft.AspNetCore.All 2.2.3 [/usr/share/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.2.3 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.2.3 [/usr/share/dotnet/shared/Microsoft.NETCore.App]
To install additional .NET Core runtimes or SDKs:
https://aka.ms/dotnet-download
The only source I have on Visual Studio is: nuget.org (https://api.nuget.org/v3/index.json).
Sources on WSL:
Registered Sources:
1. https://www.nuget.org/api/v2/ [Enabled]
https://www.nuget.org/api/v2/
We did not want to suddenly start producing a new file with an incremental release i.e. as part of 4.x
@anangaur - Yet it was OK to start breaking/normalizing version numbers in a minor release?
There seem to be discrepancies across platforms or systems. I locked on Windows 10 and can restore fine on that same system. However, restoring on Arch Linux fails for System.Collections.NonGeneric. If I restore and let it update the lock file, then I can see it produces a different hash for the aforementioned dependency. Furthermore, I use an Ubuntu agent for my CI pipeline and there System.Collections.NonGeneric fails too, but so does Microsoft.NETCore.Platforms (which is fine on Arch).
Edit: solved by following https://github.com/NuGet/Home/issues/7921#issuecomment-478152479