_From @cbeall on March 22, 2017 20:38_
I'm not sure if this is the correct place to ask this question, but for the last nine months or so I've been using Visual Studio to work on my project.json-based web applications running in Docker.
My Dockerfile adds all required packages to my NuGet packages folder, but instead of copying the code into the image, I mount the volume with the code on my Windows drive. This has allowed me to work with Visual Studio to write code, but also leverage `dotnet watch` so that as soon as I save a file, the application in my Docker container restarts with my changes. My Dockerfile works as follows (`/app` being the mounted directory):
```dockerfile
FROM microsoft/dotnet:1.1.0-sdk-projectjson
ENV DOTNET_USE_POLLING_FILE_WATCHER 1

# Preloads the packages
COPY ./src/myapp/project.json /tmp/project.json
WORKDIR /tmp
RUN dotnet restore

RUN mkdir /app
WORKDIR /app
ENTRYPOINT dotnet watch run
```
I've had a lot of success with this approach. What allowed it to work is that the project.lock.json didn't care whether it was generated in Docker or in Windows. Now that I'm trying to migrate to csproj with MSBuild, I'm unable to make this work. When I run `dotnet restore`, instead of a project.lock.json, two files are created: project.assets.json and *.csproj.nuget.g.props.
Both of these files contain absolute paths, and depending on whether I run the restore in Docker or in Windows, the values of those paths change. This prevents me from running the application in Docker and using Visual Studio at the same time. If I restore on Windows, the Docker container will not start. If I restore in the container, Visual Studio gives me a whole mess of "The type or namespace 'System' cannot be found" errors.
Is there a way I can pass arguments to MSBuild when restoring or configure .csproj to make this not happen? Or is this the new normal?
_Copied from original issue: dotnet/sdk#1033_
@cbeall these files are intended to be intermediate files, generated for the machine you are on. Is there a way you can run the restore from the Docker container to generate the correct paths there?
Yes, I can run the restore from either docker or windows.
The issue is, when I do the restore from the docker container, Visual Studio does not work (there are a bunch of "The type or namespace 'System' cannot be found" errors). And when I do the restore from windows, I can't run the application in the docker container.
Back with the preview tools and project.json, it did not matter where I ran the restore. And after I did that restore, I was able to run the application in docker and use Visual Studio simultaneously.
NuGet uses `BaseIntermediateOutputPath` to determine the location of the project.assets.json file. You could change this based on your environment to create two different files, which would allow you to work in both. There are some known issues where tools don't respect the property yet, but you can try it out.
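As a sketch of that suggestion (the file and the `obj\win` / `obj/linux` subfolders are hypothetical choices, not anything NuGet mandates), a `Directory.Build.props` next to the project could switch `BaseIntermediateOutputPath` on the MSBuild `$(OS)` property, which is `Windows_NT` on Windows and `Unix` in a Linux container:

```xml
<!-- Directory.Build.props (hypothetical): give each OS its own obj/ so the
     Windows restore and the container restore don't overwrite each other. -->
<Project>
  <PropertyGroup>
    <!-- The trailing slash matters for MSBuild path properties. -->
    <BaseIntermediateOutputPath Condition=" '$(OS)' == 'Windows_NT' ">obj\win\</BaseIntermediateOutputPath>
    <BaseIntermediateOutputPath Condition=" '$(OS)' != 'Windows_NT' ">obj/linux/</BaseIntermediateOutputPath>
  </PropertyGroup>
</Project>
```

Per the caveat above, some tooling may still look in `obj/` directly and ignore the property.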
I'm running into the same issue. I was hoping to be able to develop in Visual Studio and have the container running with dotnet watch.
The generated NuGet file *.csproj.nuget.g.props under ./obj references files on Windows using absolute paths, and as soon as the container runs `dotnet watch`, it overwrites the file with absolute paths from inside the container, which then breaks it for Visual Studio.
So I decided to look at how the docker support in VS does it. It seems to mount the following volumes:
```yaml
volumes:
  - ./ExampleProject:/app
  - ~/.nuget/packages:/root/.nuget/packages:ro
  - ~/clrdbg:/clrdbg:ro
```
I'm not sure if this helps with the problem above, but it mounts the .nuget folder from the user's home folder on Windows to /root/.nuget/packages:ro, which matches the path in the generated *.nuget.g.props file (when running `dotnet restore` within the container):
```xml
<NuGetPackageRoot Condition=" '$(NuGetPackageRoot)' == '' ">/root/.nuget/packages/</NuGetPackageRoot>
<NuGetPackageFolders Condition=" '$(NuGetPackageFolders)' == '' ">/root/.nuget/packages/</NuGetPackageFolders>
```
But this won't fix the issue on Windows.
It would be best if we could have separate nuget.g.props file for the container somehow.
I fixed the problem by mounting ./bin and ./obj to an empty folder. Voila, no more conflict! :) But this means my docker run command doesn't look very good.
```shell
docker run -it --rm -p 5000:5000 -v C:\git\ExampleApp\:/app -v C:/empty:/app/src/ExampleProject/obj -v C:/empty:/app/src/ExampleProject/bin tests/example.app
```
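The same empty-folder trick can be written more cleanly in a docker-compose.yml using anonymous volumes, which mask the host's bin/ and obj/ inside the container without needing a `C:/empty` folder (a sketch; the service and image names are placeholders):

```yaml
version: '3'
services:
  exampleapp:
    image: tests/example.app
    ports:
      - "5000:5000"
    volumes:
      - ./:/app
      # Anonymous volumes shadow the host's obj/ and bin/, so the
      # container's restore output never collides with Visual Studio's.
      - /app/src/ExampleProject/obj
      - /app/src/ExampleProject/bin
```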
I was able to solve the problem using BaseIntermediateOutputPath and moving the obj folder to a directory that is a sibling to the directory within the solution folder (so effectively "..\obj").
This solution also works, but both this and @raRaRa's answer feel a bit like a hack. It would be much better if the output of `dotnet restore` didn't depend on absolute paths or operating systems. It worked this way prior to the csproj change, and that made life a lot simpler.
I am also running into this very same issue and would love to have a good solid container development workflow.
I am using @raRaRa's solution for now, which works great, but I still get the "Unresolved" info popup every time I open VS Code, even though `dotnet restore` says the lock file hasn't changed and was skipped.
I'm also experiencing these difficulties. A comprehensive guide on running dockerized development and deployment with this newer dotnet stuff would be amazing. Some of what I have is getting close but I'm sure it's far from optimal.
Thought I would add that @raRaRa's solution works great for Linux-based development workflows as well (in my case, I'm running macOS with VS Code and VS 2017 for Mac; both tools like to spontaneously run `dotnet restore` without input from the user, causing conflicts with Docker-based `dotnet restore` in the same directory).
The "hack" is no big deal since I can stick it in my existing docker-compose.yml file but I agree it's a hack, and not a solution, especially considering it was working for project.json projects in the past. Thank you guys for taking the time to post your solution!
I was able to get this working also, but I had to add obj/bin to .dockerignore to keep from ruining VS Code's IntelliSense on build.
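A minimal `.dockerignore` along those lines (the exact patterns depend on your project layout; these recursive globs are a guess for a typical src/ tree) might be:

```
# .dockerignore: keep host-restored build output out of the build context
**/bin/
**/obj/
```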
So, the fix for now is basically to keep the /bin and /obj folders separate between the two environments (Windows and Linux Docker), i.e. don't let Docker sync those folders. Right?
First:
If you want to run `dotnet restore` in Docker after running `dotnet add package somepackage` in the Windows development environment, we have two ways (correct me if I'm wrong):
1. Stop the container and rebuild it.
2. Run `docker exec -i -t containerId dotnet restore`.
Second:
Running `dotnet watch run` in the container feels very slow; I think it should be faster. I have a docker-compose.yml file with 5 containers: 3 of them are .NET Core console apps, one is RabbitMQ, and one is Elasticsearch. Every time I change a C# file, a chain of log events happens across my containers, such as the RabbitMQ client disconnecting and, of course, `dotnet watch` reacting to the file change.
During this time, my hard drive sits at 100% utilization for more than 5 minutes until `dotnet watch` recompiles the whole project.
I thought this could be a hard drive problem, but it isn't: I moved my Docker .vhd file (24 GB) to another hard drive and the problem continued. Then I reset Docker to factory defaults, shrinking my .vhd and gaining back ~1 GB, but it is still slow, between 3 and 5 minutes.
Is anybody else seeing either of these issues?
I'm trying to use @raRaRa's workaround of mounting an empty folder. It looks like in his case the host is Windows but the container is Linux. My case is a Windows host with a Windows container, and it looks like nested volumes aren't supported in Windows containers.