.NET Core SDK (reflecting any global.json):
Version: 2.1.700-preview-009597
Commit: 96b18bcb5c
Runtime Environment:
OS Name: Windows
OS Version: 10.0.17763
OS Platform: Windows
RID: win10-x64
Base Path: C:\Program Files\dotnet\sdk\2.1.700-preview-009597\
Host (useful for support):
Version: 2.1.9
Commit: dcedc87d22
.NET Core SDKs installed:
2.1.601 [C:\Program Files\dotnet\sdk]
2.1.602 [C:\Program Files\dotnet\sdk]
2.1.700-preview-009597 [C:\Program Files\dotnet\sdk]
.NET Core runtimes installed:
Microsoft.AspNetCore.All 2.1.8 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
Microsoft.AspNetCore.All 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.1.8 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
Microsoft.AspNetCore.App 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.1.8 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
Microsoft.NETCore.App 2.1.9 [C:\Program Files\dotnet\shared\Microsoft.NETCore.App]
05/21/2019 16:15:32: Runtime terminating: True
05/22/2019 11:58:37: System.InvalidOperationException: Shuffle input cursor reader failed with an exception ---> System.InvalidOperationException: Splitter/consolidator worker encountered exception while consuming source data ---> System.DllNotFoundException: Unable to load DLL 'CpuMathNative': The specified module could not be found. (Exception from HRESULT: 0x8007007E)
at Microsoft.ML.Internal.CpuMath.Thunk.SumSqU(Single* ps, Int32 c)
at Microsoft.ML.Transforms.LpNormNormalizingTransformer.Mapper.<>c__DisplayClass6_0.<MakeGetter>b__5(VBuffer`1& dst)
at Microsoft.ML.Data.ColumnConcatenatingTransformer.Mapper.BoundColumn.<>c__DisplayClass18_0`1.<MakeGetter>b__0(VBuffer`1& dst)
at Microsoft.ML.Data.ColumnConcatenatingTransformer.Mapper.BoundColumn.<>c__DisplayClass18_0`1.<MakeGetter>b__0(VBuffer`1& dst)
at Microsoft.ML.Data.DataViewUtils.Splitter.InPipe.Impl`1.Fill()
at Microsoft.ML.Data.DataViewUtils.Splitter.<>c__DisplayClass5_1.<ConsolidateCore>b__2()
--- End of inner exception stack trace ---
at Microsoft.ML.Data.DataViewUtils.Splitter.Batch.SetAll(OutPipe[] pipes)
at Microsoft.ML.Data.DataViewUtils.Splitter.Cursor.MoveNextCore()
at Microsoft.ML.Data.RootCursorBase.MoveNext()
at Microsoft.ML.Transforms.RowShufflingTransformer.Cursor.<LoopProducerWorker>d__31.MoveNext()
--- End of inner exception stack trace ---
at Microsoft.ML.Transforms.RowShufflingTransformer.Cursor.MoveNextCore()
at Microsoft.ML.Data.RootCursorBase.MoveNext()
at Microsoft.ML.Trainers.TrainingCursorBase.MoveNext()
at Microsoft.ML.Trainers.SdcaTrainerBase`3.TrainCore(IChannel ch, RoleMappedData data, LinearModelParameters predictor, Int32 weightSetCount)
at Microsoft.ML.Trainers.StochasticTrainerBase`2.TrainModelCore(TrainContext context)
at Microsoft.ML.Trainers.TrainerEstimatorBase`2.TrainTransformer(IDataView trainSet, IDataView validationSet, IPredictor initPredictor)
at Microsoft.ML.Data.EstimatorChain`1.Fit(IDataView input)
I figured out that the file is on my dev box. It is probably not put into the release folder because there is an x86 and an x64 version of it, and with the project set to Any CPU those two would have to become one.
I would therefore like to deploy both DLLs to the target devices, but I don't know how to make the project (a library of its own) look them up in the respective folders without generating two separate projects/installers.
I copied the files into subfolders and tried setting the following in the executable project's app.config, but with no luck. It's still missing the DLL.
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <probing privatePath="dependencies\32bit;dependencies\64bit" />
    ...
@korneliuscode, let me take a look at this. Will keep you posted.
@korneliuscode For the .NET Standard project where you added ML.NET, you can keep it as AnyCPU. However, you need to specify either x86 or x64 in your .NET Framework 4.7.2 project.
@eerhardt
The assembly that is causing an issue is the CpuMathNative.dll assembly - not to be confused with the Microsoft.ML.CpuMath.dll. The Microsoft.ML.CpuMath.dll assembly is a C# assembly that calls into the C++ CpuMathNative.dll assembly.
There is no such thing as "Any CPU" for C/C++: you need to compile it for a specific processor architecture. We compile CpuMathNative.dll for two architectures on Windows: x86 and x64. Both versions have the same name, CpuMathNative.dll; that way we can just P/Invoke from C# into the native DLL directly.
This works great on .NET Core, because .NET Core has the concept of a "portable" application, meaning it can run as either x86 or x64, on Windows, Linux, or macOS. The way native assemblies work on .NET Core is that they all get put into a runtimes\<RID> folder, where <RID> represents the OS and architecture of the application. So we have a runtimes\win-x64\CpuMathNative.dll and a runtimes\win-x86\CpuMathNative.dll. Both get deployed with a .NET Core portable application, and the correct one is picked automatically at runtime based on what the app is running as.
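The runtimes\<RID> convention described above can be sketched in code. This is a simplified illustration only (the real host uses the full RID graph, which is much richer); it just shows how the native-asset folder relates to the OS and the process architecture:

```csharp
using System;
using System.Runtime.InteropServices;

// Simplified sketch of how a portable .NET Core app's native-asset folder
// relates to the OS and process architecture. The real host walks the full
// RID graph; this only illustrates the runtimes/<RID>/native convention.
public static class RidSketch
{
    public static string CurrentNativeFolder()
    {
        string os = RuntimeInformation.IsOSPlatform(OSPlatform.Windows) ? "win"
                  : RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "linux"
                  : "osx";
        // Architecture enum values (X86, X64, Arm64, ...) lower-case to the
        // RID spelling, e.g. "x64".
        string arch = RuntimeInformation.ProcessArchitecture.ToString().ToLowerInvariant();
        return $"runtimes/{os}-{arch}/native";
    }
}
```

So a 64-bit process on Windows would resolve to `runtimes/win-x64/native`, which is where the matching CpuMathNative.dll lives in the deployed app.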
However, on .NET Framework this doesn't work. The .NET Framework doesn't have the same capability to pick a native asset based on the current process. So instead, if you have native dependencies (like ML.NET does) you need to build a .NET Framework application for the specific architecture you expect it to run on. If you need it to run on both 32-bit and 64-bit, then you will need to build 2 versions of it - one with the x86 native binaries and one with the x64 native binaries.
There are ways to pick the native assets yourself on .NET Framework, but I would consider them to be advanced scenarios. If you are curious, one example is how DiaSymReader works with its native assets:
The idea here would be to deploy both the win-x86\CpuMathNative.dll and win-x64\CpuMathNative.dll assemblies in your app (the files are named the same, so you would need to put them in different folders), and then control yourself which one gets loaded based on which architecture the process was.
Like I said, I'd consider that an advanced approach, and I would recommend building your .NET Framework Exe project as x86 or x64 specifically.
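For completeness, the "advanced approach" mentioned above can be sketched roughly like this. This is an illustration, not the DiaSymReader code: the folder layout (`dependencies\32bit`, `dependencies\64bit`) is borrowed from the earlier comment, and the idea is simply to force-load the correct native DLL via `LoadLibrary` at startup, before any ML.NET code triggers the P/Invoke:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

// Sketch of manually preloading the architecture-specific native DLL on
// .NET Framework. Folder names are illustrative; call Preload once at startup.
public static class NativePreloader
{
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    private static extern IntPtr LoadLibrary(string path);

    // Pick the subfolder matching the bitness of the current process.
    public static string SubfolderForCurrentProcess() =>
        Environment.Is64BitProcess ? "64bit" : "32bit";

    public static void Preload(string baseDir)
    {
        string path = Path.Combine(baseDir, "dependencies",
                                   SubfolderForCurrentProcess(), "CpuMathNative.dll");
        if (LoadLibrary(path) == IntPtr.Zero)
            throw new DllNotFoundException(
                $"Failed to load {path} (Win32 error {Marshal.GetLastWin32Error()})");
    }
}
```

Because Windows resolves later P/Invokes against modules already loaded into the process, preloading the right copy of CpuMathNative.dll once is enough for the managed Microsoft.ML.CpuMath.dll calls to succeed.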
@korneliuscode .. hope the above explanations helped and we can close this issue.
Thanks for the explanation. As always, I also have to cater for some legacy users. Therefore it's not easy to introduce two different versions (x86/x64) or a new type of installer for the app, or to switch fully to .NET Core (due to remaining WinForms components). I'll have a look at dynamic loading based on the bitness of the OS. I wonder if this works even though my app does not use the native DLL directly.
For now I introduced a check on the bitness in the app to only expose the ML features if it is 64bit. This way I just need to deploy the 64bit CpuMathNative.dll for now.
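That workaround can be sketched as a small gate. The names here are made up for the example; the point is just that the ML features are only exposed when the process is 64-bit, so only the x64 CpuMathNative.dll has to ship:

```csharp
using System;

// Illustrative sketch of gating ML features on process bitness so that only
// the x64 native DLL needs to be deployed. Names are hypothetical.
public static class MlFeatureGate
{
    // True when the current process can safely load the deployed x64 native DLL.
    public static bool MlFeaturesAvailable => Environment.Is64BitProcess;

    public static string Describe() =>
        MlFeaturesAvailable
            ? "ML features enabled (64-bit process)."
            : "ML features hidden (32-bit process; deployed x64 CpuMathNative.dll is unusable).";
}
```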
Therefore it's not easy to introduce two different versions (x86/x64) or a new type of installer of the app
Are you explicitly using AnyCPU or are you using AnyCPU32bitpreferred (the default)?

If you have "Prefer 32-bit" checked, your app will run as 32-bit on both 32-bit and 64-bit Windows. Which means you would only need to deploy the x86 CpuMathNative assembly, and your app will run everywhere.
Check out https://dzone.com/articles/what-anycpu-really-means-net for more details.
Thanks for the tip. For now I still use the AnyCPU setting. I will look into AnyCPU32bitpreferred.
Closing this issue; it does not require a code fix in ML.NET.
Exception thrown: 'System.DllNotFoundException' in Microsoft.ML.CpuMath.dll
An exception of type 'System.DllNotFoundException' occurred in Microsoft.ML.CpuMath.dll but was not handled in user code
Unable to load DLL 'CpuMathNative': The specified module could not be found. (Exception from HRESULT: 0x8007007E)
The above error is thrown at this line:
ModelOutput result = predEngine.Predict(input);
Trying to implement sentiment analysis on input comments in an ASP.NET Web Application (.NET Framework), but I am getting this error even though the DLL is visible in Solution Explorer.
The solution platform is AnyCPU; I also tried x64 and x86, but nothing changed.
I am having the same issue. Changing to x86 or x64 does not correct the problem.
@pjmelvin - can you try with the latest pre-release version of ML.NET? https://www.nuget.org/packages/Microsoft.ML/1.5.0-preview. If that doesn't fix your issue, please open a new issue to track it with full repro steps - https://github.com/dotnet/machinelearning/issues/new
This issue terrifies my boss and makes him afraid to use this package. We will not be on .net core anytime soon.
@andrewkittredge - I can help calm some fears here. ML.NET does work on .NET Framework, with one caveat: your application needs to specifically target 32-bit or 64-bit. Your app (the project that produces an .exe) can't target "Any CPU" because ML.NET needs to deploy native assemblies and the build needs to know whether it should pick 32-bit or 64-bit native assemblies.
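To make that concrete, pinning the architecture is a single project setting. In an SDK-style project file it looks like the fragment below (in classic projects it is the "Platform target" dropdown on the Build tab); `x64` here is just an example choice:

```xml
<PropertyGroup>
  <!-- Build the exe specifically for x64 so the matching native assets are deployed. -->
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```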
There was a bug shipped in version 1.4.0 that regressed this support, but it has been fixed in the latest versions. Or you could use the version before that, 1.3.1. Or, if you really need to use version 1.4.0, I listed 2 workarounds in https://github.com/dotnet/machinelearning/issues/93#issuecomment-552486679.
Hope this helps.