I'm happy to move this to a feature request, though I'm not 100% sure. Please advise. We've observed that when we mix 3rd-party module imports from ISVs, it's quite common that we hit assembly version collisions.
A very common example:
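(Module names here are placeholders, but the shape is always the same: each module ships its own copy of Newtonsoft.Json.)
Import-Module Module1        # loads, say, Newtonsoft.Json 6.0.0.0 from Module1's folder
Import-Module Module2        # ships Newtonsoft.Json 10.0.0.0, which now conflicts
Get-Module2Thing             # hypothetical Module2 cmdlet; fails with a Newtonsoft.Json version-conflict error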
We can't use both these modules in the same session. If we import Module1, then Module2, the cmdlets from Module2 throw, claiming assembly version conflicts with Newtonsoft.Json. Our current workaround is to kill the PowerShell process and start over with the other module import.
Should PowerShell isolate module imports into their own AppDomains? I'm happy to provide a repro if this is not already filed.
IIRC AppDomains are not supported in .NET Core.
Even if AppDomains are not supported in .NET Core, the issue still remains. How can PowerShell support multiple versions of DLLs imported in the same PowerShell process? I can't update 3rd-party modules to use newer DLL versions, so I'm stuck having to know that I can never use Module1 and Module2 cmdlets in the same PowerShell session.
Any progress on this?
I have the same issue.
I have a "JsonSchemaValidation" module, built by myself, which should use:
Newtonsoft.Json, Version=10.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed
At the same time AzureRm.Profile uses:
Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed
PowerShell always uses the Newtonsoft.Json DLL from the module that is imported first!
Looking in the appdomain, both DLLs are present.
If AzureRm.Profile is imported first, my "JsonSchemaValidation" cmdlets do not work.
@gogbg You could try to use a bindingRedirect.
Also, it is expected that AppDomain will come back in .NET Core 2.1.0.
I have the same issue.
My binary module has a dependency on assembly version 8 (JSON.NET).
I use it together with another module, which has a dependency on version 6 of the same assembly.
As soon as I Import-Module both of them, two different versions of the same assembly are loaded into the AppDomain.
This normally wouldn't present an issue, but my binary module uses features only available in version 8 of the dependent assembly. This throws a MissingMethodException.
@iSazonov I tried to handle the AssemblyResolve event on the current AppDomain. I redirect any previous version of the dependent assembly to v8. Unfortunately, this is too late: the event handler kicks in when I run the first cmdlet from my module, and at that stage the current AppDomain already contains v6.
This would not happen in a normal executable, as assembly redirects or an assembly resolve handler would kick in when the application attempts to load v6.
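For reference, an AssemblyResolve redirect of this sort looks roughly like the following when written in PowerShell (the path is a placeholder for wherever the module ships its v8 copy); the timing problem described above still stands, because by the time it runs, v6 is already in the AppDomain:
$onResolve = [System.ResolveEventHandler]{
    param($sender, $eventArgs)
    $requested = [System.Reflection.AssemblyName]::new($eventArgs.Name)
    if ($requested.Name -eq 'Newtonsoft.Json') {
        # redirect any requested version to the v8 copy shipped with the module
        return [System.Reflection.Assembly]::LoadFrom('C:\MyModule\Newtonsoft.Json.dll')
    }
    return $null
}
[System.AppDomain]::CurrentDomain.add_AssemblyResolve($onResolve)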
@SteveL-MSFT Is this issue still being looked at? It makes some modules unusable together in the same session.
It certainly does not solve this problem in general, but if we talk specifically about Newtonsoft.Json then we might consider replacing it with System.Json.
@xenalite yes, this is an issue we are aware of and have had internal discussions about. @daxian-dbw perhaps for now we can document some workarounds until we can address this in code.
Hello everyone!
BTW it also seems that PowerShell itself depends on Newtonsoft.Json:
PS> [System.AppDomain]::CurrentDomain.GetAssemblies() | where { $_.Location.Contains("Newton") }
GAC Version Location
--- ------- --------
False v4.0.30319 c:\Program Files\PowerShell\6.0.1\Newtonsoft.Json.dll (currently - version 10.0.3)
It means that if someone imports a module which depends on a newer version of Newtonsoft.Json, then some of its commands will fail with an error:
Could not load file or assembly 'Newtonsoft.Json, Version=11.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'. Could not find or load a specific file. (Exception from HRESULT: 0x80131621)
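One way to confirm which assembly version is actually resident (the v4.0.30319 column above is the CLR image version, not the assembly version):
[System.AppDomain]::CurrentDomain.GetAssemblies() |
    Where-Object { $_.GetName().Name -eq 'Newtonsoft.Json' } |
    ForEach-Object { '{0} -> {1}' -f $_.GetName().Version, $_.Location }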
But it seems to me that this issue is rather general; it must have existed even before PowerShell Core and .NET Core. And there is no solution?
@SteveL-MSFT @xenalite @daxian-dbw when documenting workarounds, please consider the scenario where we cannot recompile the 3rd-party compiled modules, nor can we manually add bindingRedirects to powershell.exe.config.
The only workaround that we've identified is to update our large ps1 scripts to spawn portions of the work into child powershell.exe processes, with a collection of decomposed scripts. This anti-pattern results in the gnashing of teeth and rending of garments.
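Concretely, the decomposition looks something like this (script and module names are placeholders):
Import-Module Module1
& 'C:\scripts\Invoke-Module1Work.ps1'        # runs in this session against Module1
# anything that needs Module2 is pushed out to a clean child process:
powershell.exe -NoProfile -Command "Import-Module Module2; & 'C:\scripts\Invoke-Module2Work.ps1'"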
@curiousdev workarounds are temporary; the plan is to have a proper solution. No ETA at this time as it's not actively being worked on due to other priorities.
@SteveL-MSFT Seeing that it's marked Up-for-Grabs, what's the best way to discuss the solution so a community member can push forward on this? Can you share the internal team discussion? If a solution has been identified, I'm happy to consider implementing it for review.
@curiousdev the Up-for-Grabs label was removed 26 days ago, but I would greatly appreciate it if you would like to take this on! There hasn't been any internal discussion yet on this, so you are free to propose a solution. @daxian-dbw would be best to review your proposal. You can just start by having a discussion in this issue. Should the need arise, it may make sense to author an RFC.
Even if AppDomains are not supported in .NET Core, the issue still remains
@curiousdev And even though AppDomains are not supported, it looks like they have something new to solve the dependency problem: Assembly Load Contexts:
https://github.com/dotnet/coreclr/blob/master/Documentation/design-docs/assemblyloadcontext.md
@SteveL-MSFT This is the issue that we briefly talked about at psconf
I think we could solve this after .NET Core 2.1 - we will get AppDomains or assembly load contexts.
Now that we have .NET Core 2.1, it makes me sad seeing this now only being considered for 6.2. Is this due to the complexity of the implementation?
It would be great if there was at least documentation on it, especially since the special case by @xenalite here also causes issues on Windows PowerShell, which already has AppDomain integration.
Wouldn't this issue be a great candidate for the feature flags that you want to trial, @SteveL-MSFT?
@bergmeister the challenge for 6.1 is we're getting towards the end and this work wasn't something we initially planned and making the changes now may prove risky (in terms of regression).
One way we can do things like this is using remoting. If you could remote to your own computer without needing to be elevated, you could do something like this:
$s = New-PSSession -Local
Import-Module Storage -PSSession $s
And we could just make a shortcut by adding a switch to Import-Module to simplify it: -InNewSession
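In the meantime, something close to this already works if PS remoting is enabled on the box (with exactly the elevation/configuration friction mentioned above):
$s = New-PSSession -ComputerName localhost      # loopback session; requires remoting to be enabled
Import-Module -Name Storage -PSSession $s       # implicit remoting: local proxy functions, module runs in $s
Get-Disk                                        # the proxy forwards the call into the session
The objects that come back are deserialized, which is exactly the kind of compromise discussed next.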
Anything _else_ we could do to "work around" the inability to load different versions of the same assembly is going to have _other_ compromises. That's the nature of the problem.
I mean, one of the _strengths_ of PowerShell is that it's one single app, with all the commands just being methods in classes. That means we can output real objects and pass them around _and call methods on them_ in different commands from different modules.
Every layer of abstraction that you add to that (Load Contexts, AppDomains, and even separate Applications) adds extra work that has to be done and/or removes functionality. If you have separate Applications, you have to serialize your objects, and you lose the ability to call methods, etc. If you have AppDomains you can pass objects, but only if you load the metadata for them on both AppDomains, and it requires remoting calls and object passing, etc.
It seemed, a few posts back, that there have been conversations about this on the side which suggest that this could be done with AppDomains or LoadContexts in a way that _would not_ have huge performance implications -- can anyone explain more? My best understanding of AppDomains would add a lot of caveats: we'd be able to load (and possibly unload) multiple versions of the same assembly ...
But PowerShell would have to not just load each module in its own AppDomain, but all of the module's dependencies too. At least the metadata for every object being referenced (i.e. all the PowerShell classes) has to be loaded in _each_ AppDomain (and if we want unloading, we have to JIT compile each assembly into each AppDomain where you want to use it).
Then, PowerShell would need to handle inter-domain communication between every command in every pipeline. I can't even imagine the perf impact of doing remoting in between every command in a pipeline.
I think we'd have to avoid loading every module into its own AppDomain, for performance reasons -- so there would have to be some logic to try and minimize the number of AppDomains while still using "enough" of them to handle as many versions of an assembly as we need to...
I can't quite tell from the linked document -- it looks to me like there's very little isolation (and thus, no ability to unload), I want to say maybe there's no need for remoting, since it's not mentioned in the linked article, but I can't tell...
How would a script module that loaded a specific version of an assembly avoid using the wrong one? How would an _unrelated_ script which uses commands from two different modules that perhaps use different versions of, say, a logging module ... know which assembly LoadContext to use?
I think we'd have to avoid loading every module into its own AppDomain, for performance reasons -- so there would have to be some logic to try and minimize the number of AppDomains while still using "enough" of them to handle as many versions of an assembly as we need to...
I very much hope that we can avoid that. Perhaps we could have Import-Module -Isolate to minimize the number of AppDomains.
As for performance, it seems CoreCLR uses type mappings, so I think there are ways to get a solution without performance loss.
How would an unrelated script which uses commands from two different modules that perhaps use different versions of, say, a logging module ... know which assembly LoadContext to use?
I believe we won't have conflicts if the cmdlet types are _mapped_ in the global context.
What's the resolution for when I need to reference a newer version of Newtonsoft than the one that ships with PowerShell Core 6? (See the comment from @astral-keks above.)
@Mattacks I guess you could use application.config to map the DLL to a new version.
Currently we always use the latest versions of packages in preview and release PowerShell Core versions.
@iSazonov What do you mean by 'application.config'?
Notwithstanding the fact that developers such as ourselves require a specific version because of dependencies we are bringing in, what would be the process if a different version of Newtonsoft were needed for, e.g., security bug-fix reasons?
We continuously update packages to get the latest security fixes.
Redirecting assembly versions: https://docs.microsoft.com/en-us/dotnet/framework/configure-apps/redirect-assembly-versions
Wow, you did mean that feature from the .NET ark! I'll look into it, but that can only be a short-term hack, can't it? You can't propose it as a long-term solution.
We ship our product to customers. They might stick with a version of PowerShell installed on their kit for years; they have historically. That's different from upgrading our modules.
I think you have a significant issue here. We need to be able to easily reference the version of Newtonsoft we need, and when a dependency bumps the version, it needs to flow through seamlessly.
Do I need to raise a separate issue?
You could consider using a LoadContext to resolve version conflicts: https://github.com/dotnet/coreclr/blob/master/Documentation/design-docs/assemblyloadcontext.md
Solution: load your target version of the assembly.
https://gist.github.com/JamesRandall/444f3365751edb130bef197f2222cfa5
This worked for me.
@goldcode this doesn't work if the assembly is native to pwsh (e.g. System.Runtime.CompilerServices.Unsafe); it only helps when you have two modules that need different versions of a DLL.
Even then it's a "first module wins" race unless you coordinate loading the modules at the same time (in practice, this means your module has to search all other modules for a potentially newer assembly before loading, and load that one instead), because whatever assembly gets loaded first is the one that "sticks". And even that still has problems: what if you add another module via Install-Module in the same session later?
LoadContext in .NET Core 3.0 (and subsequently PS7) "sort of" solves this problem for libraries that are not native to PS Core: you can have a LoadContext per module (I'm preparing a proof of concept for this), but you need to make sure your module doesn't output any types that live in that LoadContext. If another consumer needs the type and the module's LoadContext has it as a different version, the CLR will see them as different types and you get the super confusing "Cannot cast MyLoadContextType to type MyLoadContextType".
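A quick way to see the type-identity problem in isolation (the DLL path is a placeholder; any assembly will do):
$alc1 = [System.Runtime.Loader.AssemblyLoadContext]::new('iso1', $false)
$alc2 = [System.Runtime.Loader.AssemblyLoadContext]::new('iso2', $false)
$a1 = $alc1.LoadFromAssemblyPath('C:\Lib\YamlDotNet.dll')
$a2 = $alc2.LoadFromAssemblyPath('C:\Lib\YamlDotNet.dll')
$t1 = $a1.GetType('YamlDotNet.Serialization.Deserializer')
$t2 = $a2.GetType('YamlDotNet.Serialization.Deserializer')
$t1 -eq $t2    # False: same full name, but the CLR treats them as two distinct types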
I was playing around with this and came up with something that seems to work; I'm not sure if it's a good or safe way of doing things though. In the root module I have:
# cache of plugin assemblies, keyed by simple name, loaded into a load context private to this module
$Script:plugins = @{ }
$lc = [System.Runtime.Loader.AssemblyLoadContext]::new($false)
@(
    "YamlDotNet.dll"
) | ForEach-Object {
    $path = (Resolve-Path "$PSScriptRoot/Lib/$_").Path
    $assembly = $lc.LoadFromAssemblyPath($path)
    $Script:plugins[$assembly.GetName().Name] = $assembly
}
I can then create instances from the load context using
function New-PluginInstance {
    param (
        [string]$TypeName,
        [object[]]$Params
    )
    foreach ($plugin in $Script:plugins.GetEnumerator()) {
        $type = $plugin.Value.GetType($TypeName)
        if ($null -ne $type) {
            if ($null -eq $Params) {
                return [System.Activator]::CreateInstance($type)
            }
            else {
                return [System.Activator]::CreateInstance($type, $Params)
            }
        }
    }
    throw "Type ${TypeName} was not found in the plugin cache"
}
$deserializer = (New-PluginInstance -TypeName "YamlDotNet.Serialization.DeserializerBuilder").Build()
This seems to work fine, and I can import the Az.Aks module, which has YamlDotNet as a RequiredAssembly, in the same session.
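You can also check which context owns each copy (PowerShell 7 / .NET Core 3.0+):
[System.Runtime.Loader.AssemblyLoadContext]::All | ForEach-Object {
    $ctx = $_
    $ctx.Assemblies |
        Where-Object { $_.GetName().Name -eq 'YamlDotNet' } |
        ForEach-Object { '{0}: {1}' -f $ctx.Name, $_.GetName().Version }
}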
@JustinGrote is this the right way of using LoadContext?
@hbuckle I can't comment on the "Right" way but what I would envision as a design:
@JustinGrote Are you interested in submitting an RFC?
@iSazonov I'm a terrible C# developer; if I have some spare time I can make an effort, but for now I probably need to finish some "old business" projects before jumping on something new.
@JustinGrote revealed a key problem about isolating modules -- type identity. That's the main reason why the RFC Loading Modules into Isolated AssemblyLoadContext got withdrawn. But more and more scenarios are affected by the assembly conflict issue and we have to move toward this general solution in PowerShell.
I will update that RFC soon to call out that the guidance for module authors is to not rely on users using types exposed from the module directly. But note that the Azure PowerShell modules actually heavily depend on allowing users to directly use the types they expose, so it's hard to say whether that "guidance" is acceptable ...
It seems we need a new PowerShell abstraction for this, like we have providers and provider prefixes. I mean something like [<loadcontext>::io.fileinfo]. And maybe switching the default context for a scope.
[<loadcontext>::io.fileinfo] or something like it seems workable, perhaps with an update to Add-Type to accept a load context as well.
The method I posted above seems to work until this is supported natively, although it's quite clunky.
With the growth in YAML, I wouldn't be surprised if YamlDotNet becomes the next Newtonsoft.Json in terms of people including it in modules.
Import-Module -Name Az.Aks
Import-Module -Name platyPS
platyPS\New-MarkdownHelp -Module Az.Aks
does not work, as both platyPS and Az.Aks depend on YamlDotNet.dll.
Do we have a workaround for that?
I have tried both approaches, using the AppDomain.AssemblyResolve event and the AssemblyLoadContext.Resolving event, but without any success.
The import of the second module fails with:
Import-Module: Assembly with same name is already loaded
My real use case is to generate markdown help for a module that has Az.Aks as a dependency :)
@gogbg I believe the module itself has to call the assembly and methods from the load context; you can't do it on the module's behalf. You'd probably have to edit the source of one of the two (platyPS would probably be easier).
Also, are you doing this in 5.1? An alternative is loading order with PS7: when doing Add-Type, PS7 will do sort of a "binding redirect", in the sense that if an assembly with the same name is already loaded but with a different version it will just silently continue. So if you load the assembly with the latest version first (and assuming there aren't any backwards-incompatible changes), it should work for both modules.
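In other words, the loading-order idea is roughly the following (the path/version are placeholders, and whether it actually helps depends on how each module loads its copy):
Add-Type -Path 'C:\Lib\YamlDotNet.8.1.2\YamlDotNet.dll'   # pre-load the newest version first
Import-Module platyPS
Import-Module Az.Aks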
Actually, in PowerShell 6.2.4 I can simultaneously load Az.Aks and platyPS:
PS C:\> get-module az.aks,platyps
ModuleType Version Name ExportedCommands
---------- ------- ---- ----------------
Script 1.0.2 Az.Aks {Get-AzAks, Import-AzAksCredential, New-AzAks, Remove-AzAks…
Script 0.14.0 platyPS {Get-HelpPreview, Get-MarkdownMetadata, Merge-MarkdownHelp, New-ExternalHelp…
PS C:\> [System.Runtime.Loader.AssemblyLoadContext]::GetLoadedAssemblies() | Where-Object {$_.FullName -like '*YamlDotNet*'} | select fullname
FullName
--------
YamlDotNet, Version=5.0.0.0, Culture=neutral, PublicKeyToken=ec19458f3c15af5e
YamlDotNet, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null
whereas in PS 7.0.0 I cannot:
PS C:\> Import-Module -Name 'Az.Aks'
PS C:\> Import-Module -Name 'platyPS'
Import-Module: Assembly with same name is already loaded
PS C:\>
I can also see some differences in the System.Runtime.Loader.AssemblyLoadContext API.
For modules that do not export binary types (like platyPS), we could enhance the PowerShell engine to load them in a custom context. Such modules could have a new key in the PSD1 file, like:
AssemblyContext = 'Default'       # default: current behavior
AssemblyContext = 'AutoIsolate'   # load the module into a randomly generated assembly load context
If the engine sees this key together with RequiredAssemblies = @('Markdown.MAML.dll','YamlDotNet.dll'), it loads those assemblies in the custom context.
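A manifest using the proposed key might look like this (AssemblyContext is hypothetical and does not exist in any released PowerShell; the other values are only illustrative):
@{
    ModuleVersion      = '0.14.0'
    RootModule         = 'platyPS.psm1'
    RequiredAssemblies = @('Markdown.MAML.dll', 'YamlDotNet.dll')
    AssemblyContext    = 'AutoIsolate'   # hypothetical: load RequiredAssemblies into an isolated ALC
}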
@iSazonov are you aware of a solution/workaround to import both Az.Aks and platyPS in PS7 at the same time?
@iSazonov the AssemblyContext key in the psd1 seems a good idea. I would also suggest adding a parameter to Import-Module (e.g. -AssemblyContext) so we can handle unpredictable issues on a per-case basis.
adding a parameter to Import-Module (e.g. -AssemblyContext)
So far I do not see this as necessary, because only the author of the module can guarantee that the module works correctly in a custom context. The manifest key is the simplest case to start with. After we get some experience with the feature we can think about the parameter.
I'd ask @daxian-dbw and @rjmholt to review the idea.
@iSazonov the module author cannot predict all the possible cases in which their module will be used, especially for generic modules like platyPS. So for unpredictable cases, instead of waiting for the module author to implement fixes for edge cases, it would be good to have the Import-Module -AssemblyContext parameter.
Basically, what I mean is: expose the internals of how module assemblies are loaded into an ALC to developers/scripters so they can handle edge cases on their own.
@gogbg No, only module developers, not consumers, can ensure that the module works in an isolated context.
The module developers shouldn't expose any types from the underlying assembly. They have two options to achieve this:
If users load a module that is not designed for working in an isolated context, the module will be non-operational or continue to conflict with other DLLs.
@iSazonov I agree with your point.
My point is however different.
In practice a lot of modules can have conflicting dependencies, and the module developer might not even know what AssemblyLoadContext is. So it would be good for me as a consumer to have the option to import their module into a separate ALC, instead of having to contribute to their codebase, as that is typically a much slower process.
If you are afraid that adding a parameter to Import-Module (-AssemblyContext) could cause issues in certain cases, just add a warning message for that, but give me the option to use it in cases where it does not cause issues.
@gogbg You are talking about the general scenario. From my current understanding we cannot address that scenario in a simple way; it would be very complex. I believe we will have to add a new PowerShell language abstraction to address it.
If we first address a scenario like platyPS, we will be halfway there.
I'd ask @daxian-dbw and @rjmholt to review the idea.
We actually discussed this at length a little while back. It's not a simple question unfortunately.
I should start by saying that we've got an implementation using another ALC running successfully in the wild, but it took work to implement correctly and we didn't get everything the first time; some type-based APIs are still not entirely perfect.
There are plenty of tricky edge cases, but the biggest one is leaking dependency types in your API, leading to weird exceptions that say things like "unable to cast object of type 'Dependency.ThisObject' to type 'Dependency.ThisObject'". Basically, putting something in another ALC introduces subtle type equality issues, and just loading something into another ALC doesn't mean you'll be able to call it.
So while PowerShell could just load everything into its own ALC, that could lead to pernicious errors. Especially where a set of modules all assume they share a dependency (like many of the Az modules).
Naturally ALC issues are also compounded by the fact that PowerShell uses reflection pretty much everywhere, so we don't know what else might have holes in it there.
Similarly, while we could offer an -InNewAssemblyContext module load switch, it wouldn't have any meaning for non-binary modules (and what about modules that load binaries but as nested modules, or required modules, or not as modules at all?), and there's real potential for it to be abused. One possibility is that users take it up as an "it just works" fix that later causes much subtler crashes, and another is that it allows module authors to abrogate the need to fix issues with their module because you can just load it differently.
The general assumption is that binary module developers have a better understanding of .NET mechanics than most of the users importing their module, so it possibly makes more sense to expose this to module developers. But again, giving them a big hammer to say "load my module in a new ALC" doesn't solve the whole issue; they need to understand what the implications are and ensure that their module can reliably work across ALCs by doing things like making sure they aren't leaking their dependencies' types in their own APIs. That's the kind of analysis that gets done when the module maintainers look at using an ALC themselves.
I think it still definitely makes sense for PowerShell to provide ALC-isolated module loading, but we need more than a switch basically, since we need to help module authors understand and work with such a functionality.
With all this said, we could offer multiple ways to load module into a new ALC, especially since any one module doesn't intrinsically have a dependency conflict issue; the module consumer faces the issue because two modules that don't know or care about each other have a conflict. I think it's worth looking into as well, but I should say that:
So yeah, to respond more specifically:
In practice a lot of modules can have conflicting dependencies, and the module developer might not even know what AssemblyLoadContext is. So it would be good for me as a consumer to have the option to import their module into a separate ALC, instead of having to contribute to their codebase, as that is typically a much slower process.
Module developers will need to do their homework even if we implement an ALC loading feature, and they'll need to sift through their public APIs to weed out types from their dependencies. I don't see a fast route here; PowerShell can't abstract this away, since it's a concept in the fabric of the platform. Them going through the slow process of understanding an ALC is kind of required.
Implementing a user-facing switch to load things into an ALC anyway is possible, but is still a lot of work that we don't have on the board right now. And people using it should understand that there's risk involved.
This is definitely an area we'd like to invest in in the future, but because of how fundamental it is, we need to be methodical about it.
Finally, my personal take on what would be good:
- A way for module authors to mark their module (e.g. in the manifest) to be loaded into its own ALC
- An Import-Module switch parameter to do the same (and possibly an opt-out to override the module)
- ([assembly]::Load is something we probably can't control)
@rjmholt nailed my concerns and recommended go-forward.
Module developers should be able to specify one or more assembly load contexts on their own in a "standard" way and consume types, methods, and assemblies from those within the module scope, and have some sort of guidance if it is detected that they are leaking types from the module scope out into the main scope.
The module manifest flag should also be tied into Remove-Module so as to dispose the assembly contexts and free up the DLLs, avoiding the locking issues we have today, as well as allowing reloading for easier testing and side-by-side usage of modules requiring different assembly versions.
In both cases the onus should be on the module developer, but it should be made easy for them to manage, like how Export-ModuleMember is used today.
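For reference, collectible load contexts are the mechanism that would make that unload/reload possible (sketch only; the context name and path are placeholders):
$alc = [System.Runtime.Loader.AssemblyLoadContext]::new('MyModuleContext', $true)   # $true = collectible
$null = $alc.LoadFromAssemblyPath('C:\MyModule\lib\YamlDotNet.dll')
# ... use types resolved through $alc while the module is loaded ...
$alc.Unload()   # once nothing references the context, its assemblies (and their file locks) can be released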
Do we know of a scenario without type conflicts? I think yes:
Currently the engine can only import cmdlets (functions/aliases) into the global scope.
As we can see, the PowerShell engine must be enhanced to import a module's cmdlet types from a custom ALC and be able to resolve those types. I believe we can implement this _without side effects_ because everything will be isolated in the custom ALC.
For more complex scenarios, the engine could import a module's types from the module's custom ALC.
__Since the import process is under engine control, the engine can block/resolve/avoid type conflicts.__
Then, if we implement type prefixes as proposed above, the engine can import all needed types from isolated modules, although here we certainly can't avoid all the side effects completely.
Do we know of a scenario without type conflicts? I think yes:
I think the technical definition here is that any type coming from an assembly in $PSHome (PowerShell's broader equivalent of a Base Class Library) is a type that a module can freely use in its APIs. If a module exports its own types, manipulating ALC-internal types from PowerShell probably isn't an issue, but weird things will happen if you try to do something like use the output of a cmdlet from one module as the input to another.
One scenario I imagine will be common is wanting to export the types defined by a module, but not those it pulls in as a dependency. A possible solution there is as @iSazonov says; when an assembly is imported as a module, we import those types into the main ALC as well (I believe this is possible, but I'm not 100% sure). That means nested modules work too. It also means those modules wouldn't be able to be unloaded.
Given that, I see a couple of common patterns:
This is in addition to the default of just loading everything into the main ALC.
The risk, of course, is that the subtleties of these scenarios will be misunderstood or ignored and that modules will be marked or imported in a way that initially seems to work but causes subtle ALC-caused type errors later.
I think the best we can do to help that is to gate a module ALC feature on better documentation and tooling, particularly:
This has been well doc'd by @rjmholt and @sdwheeler at https://docs.microsoft.com/powershell/scripting/dev-cross-plat/resolving-dependency-conflicts
That being said, we may want to give module owners an opt-in mode for this functionality more generally, but it's not going to come in 7.1.
When writing a binary PowerShell module in C#, it's natural to take dependencies on other packages or libraries to provide functionality.