MSBuild: Please Consider Improving Project Format and Structure (Serialized POCO/Format Agnostic)

Created on 12 May 2016  ·  195 Comments  ·  Source: dotnet/msbuild

Going to try this again, but hopefully with a better defined ask. The request is not to support a particular format, but to improve the MSBuild system overall so that it can support any format, while also making it more accessible/discoverable by tooling and (possible) designers.

Based on the current climate of .NET Core having its project.json merged back into MSBuild, this topic might attract a little more interest and conversation. Also, the Roslyn team is sending developers your way, so, there's that. :)

Use Case: Developers _REALLY_ like their file formats!

Problem(s):

  1. MSBuild format is rooted in an arbitrary, not well-defined/known schema (difficult to explore and discover).
  2. MSBuild project structure can only be defined/edited in XML.
  3. The MSFT development ecosystem (for better or for worse) now consists of two camps: web application developers and native application developers. Each has its own way of doing things, and neither really likes the approach of the other.

Suggestions:

  1. Create a well-defined, well-known .NET object model that defines the Project Model (Project API). Visual Studio then loads (and saves) these POCOs from and to disk. In this design, there are no arbitrary schemas or data files, just 100% serialized POCOs that are read from and saved to disk (see the rough sketch after this list). Another issue on Roslyn's board goes into more detail around this.
  2. Adopt a "Bring Your Own Serializer" strategy. Adding a new format/type is as simple as installing a Visual Studio extension (or even auto-detected and installed for you upon detection/initial file load). To start with, JSON5 and XML should be supported out of the box, but developers should be able to bring on any format they wish, such as Xaml, Yaml, etc.
  3. Allow (optional) naming extensions for projects to help identify the stored format. Examples:

  • MyProject.csproj.json5 <-- C# Project serialized as JSON5
  • MyProject.vbproj.xml <-- VB.NET Project serialized as XML
  • MyProject.fsproj.xaml <-- F# Project serialized as XAML (OH YES I WENT THERE!!!)
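
To make suggestion 1 a little more concrete, here is a rough, purely hypothetical sketch of such a model in C#. None of these type names exist anywhere; they are invented only to illustrate the shape of the idea:

using System.Collections.Generic;
using System.IO;

// Hypothetical project object model -- the "Project API" of suggestion 1.
// Tooling would navigate these types directly instead of an XML schema.
public class ProjectModel
{
    public string Name { get; set; }
    public string TargetFramework { get; set; }
    public List<string> Sources { get; } = new List<string>();
    public List<PackageReference> Packages { get; } = new List<PackageReference>();
}

public class PackageReference
{
    public string Id { get; set; }
    public string Version { get; set; }
}

// Suggestion 2, "Bring Your Own Serializer": the on-disk format becomes an
// implementation detail. A JSON5, XML, or Xaml serializer would each
// implement this interface, perhaps delivered as a Visual Studio extension.
public interface IProjectSerializer
{
    ProjectModel Load(Stream stream);
    void Save(ProjectModel project, Stream stream);
}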

Those are off the top of my head to get the conversation started. All of these are open to feedback of course and I would love to hear the thoughts of developers around this. Thank you for any consideration and/or support!

All 195 comments

I think you're vastly overestimating the demand for different file formats. Who really cares about XML vs JSON? And more importantly, are there enough people who care enough to justify adding complexity to the entire build system?
The current problems with MSBuild files are historical: they're meant to be read/written by VS tooling exclusively (and not humans), they don't support packages as first-class citizens, etc. XML is not a problem.

Thanks for your input @SolalPirelli. I am not sure what you mean by adding complexity. The goal here is to reduce complexity while also satisfying developer (and organizational) preferences. Even if the tooling is meant to be used by VS/IDE exclusively, that does not mean that humans don't get their proverbial hands dirty with it; they do all the time. The process for doing so is very clumsy and awkward (which I believe you allude to via "historical").

I would also challenge you on demand for JSON vs. XML. What forums/tweets have _you_ been reading? :smile:

I am not sure what you mean by adding complexity.

Adding support for multiple file formats necessarily increases complexity. It means adding a public API that third-party providers will use, which will definitely cause headaches when new items are added and it turns out providers were doing crazy things to enable DSLs.

I would also challenge you on demand for JSON vs. XML. What forums/tweets have you been reading?

If this is to be resolved via forums and tweets, we'll end up writing our config file in Go, JavaScript or Rust.
The exact markup format does not matter; what matters is that the resulting format is editable by both humans and programs, adapted to current needs, and prepared for future evolution. There's nothing in JSON that makes it clearly better than XML in that regard. If anything, JSON is worse because it lacks comments.

I would also challenge you on demand for JSON vs. XML. What forums/tweets have you been reading?

I'll challenge you right back; never in a professional environment, by someone who I would consider a professional developer (or Build Master), would I say I've heard a cry for a switch to a JSON based MSBuild.

I will agree with regard to documentation and ask that the current XML format be better documented; but as far as a switch to JSON goes, I see little to no technical gain. Why not take the resources that would be wasted on such a system and instead put them towards improving the MSDN docs?

As per Raymond Chen, every feature starts out at -100; what are the gains that get us to positive 100?

If this is to be resolved via forums and tweets, we'll end up writing our config file in Go, JavaScript or Rust.

LOL!!!

I hear you. As for the API: yes, that is the goal here -- to (ultimately) have a well-defined/documented/accessible project API that we all know and have access to, in case we want to understand the file (project) we're actually describing. :)

When you open a .csproj file now... can you in all honesty say you know each and every element within it? The schema is difficult at best to discover and browse. Whereas if we were using a well-known API (.NET POCOs), this would be a snap.

If anything, JSON is worse because it lacks comments.

Agreed. But not everyone is in agreement with this. And also, I am suggesting JSON5, which allows comments. Finally, the serialization is intended to be an implementation detail, not something that is part of MSBuild per se. It just has to support it.

I'll challenge you right back; never in a professional environment, by someone who I would consider a professional developer (or Build Master), would I say I've heard a cry for a switch to a JSON based MSBuild.

Well dudes... WHERE HAVE YOU BEEN IN MY LIFE FOR THE PAST YEAR?!?! LOL. I guess I have some baaaaaad luck then, because I have been harping against the project.json movement for over a year now, and it has seemed an uphill battle, to say the least!

I personally am very much more in the XML camp (I would actually prefer to see Xaml), but I want to consider and be mindful of the developers who have been enjoying the shiny new toy of project.json for the past year.

And also, I do not want to "switch to JSON" ... but to simply support it as a serialization mechanism. If we're working with well-defined/known POCOs from an API, what does it matter which format they are serialized/deserialized in?

When you open a .csproj file now... can you in all honesty say you know each and every element within it?

Yes; that is part of what being a Build Master and putting on your resume that you speak MSBUILD means. I know the community has had a recent influx of posers from the "DevOps" movement; but there are still the few who actually know what they're doing beyond a drag n' derp interface.

The schema is difficult at best to discover and browse.

This is the primer that anyone coming into MSBuild needs to start with: https://msdn.microsoft.com/en-us/library/0k6kkbsd.aspx. Reading Sayed Ibrahim Hashimi's "Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build" should also be required reading for anyone claiming intimate knowledge.

WHERE HAVE YOU BEEN IN MY LIFE FOR THE PAST YEAR?!?!

Monitoring the progress of MSBuild's open sourcing; trying to assist on StackOverflow under the MSBuild Tag (when Sayed isn't beating me to the punch). #JustBuildMasterThings

I want to consider and be mindful of the developers who have been enjoying the shiny new toy of project.json for the past year.

I see no reason why both projects cannot coexist; each project should be allowed to fail or succeed based on its technical merits. Instead of chasing tail lights, I personally feel that resources are better spent on improving what we have if there is a clear technical gain. That being said, I'm not the Product Owner, nor even a Developer, just a well-informed end user.

but to simply support it as a serialization mechanism

Again, what is accomplished by this?

I personally think what we need is a good way for the two build systems to share data, so that MSBuild can import data from a JSON project file.

Yes; that is part of what being a Build Master and putting on your resume that you speak MSBUILD means

Wellllll, that's all nice and good, and it is great to see all your investment/commitment to becoming/being a Build Master, but for a typical developer who simply wants to add a new task to their project to do x, y, z, their objective is to get those tasks in and get back to developing code, not spending a week in required reading to figure out the simplest of features.

Again, what is accomplished by this?

You're asking me what is accomplished by allowing developers to use the format they most prefer to work in? :smile: Web developers REALLY like their JSON and native developers really like their XML/Xaml. Soooo...

And of course, there are other formats altogether that developers enjoy. By embracing and enabling support for them (which can be done on a 3rd party basis, not from MSBuild itself) you encourage adoption.

Everybody agrees that we need a less arcane format for build files. I can't speak for Microsoft, but it does seem like they want to move in this direction as well.

What you don't seem to understand is that one can't just "enable support for 3rd-party providers" by flipping a switch somewhere. Creating a flexible enough API, documenting it and versioning it is far from trivial, MSBuild or otherwise.

typical developer who simply wants to add a new task to their project to do x, y, z

It would be helpful to understand what you're asking the project to do. Looking at project.json (I had never heard of this project until today) I only see limitations on what I'm able to do. For one, there is no readily apparent ability for me to extend their project system with custom tasks (something anyone who utilizes MSBUILD will eventually do).

The concept of native NuGet Package management seems nice; but is hardly novel.

not spending a week in required reading to figure out the simplest of features.

Again, could you tell us which features you feel are hard to discover and would take a week of reading?

In reality, a significant portion of developers will never find themselves editing the csproj or any other MSBuild-based file by hand. The vast majority of them are interacting with the system via Visual Studio. Those that are editing these files generally are reading the documentation or are using wrappers around the system such as TeamCity, Jenkins, or Team Foundation Server (TFS/Visual Studio Online, whatever you wanna call it these days).

If you're attempting to target an audience that is not using Visual Studio then they should be encouraged to seek out the recommended project system of choice for their environment.

most prefer to work in?

I'm asking what these developers are doing and why they're not using the tooling provided to them?

By embracing and enabling support for them (which can be done on a 3rd party basis, not from MSBuild itself) you encourage adoption.

This is great in theory; however, in practice it results in a contract being formed between Microsoft and these 3rd parties; Raymond Chen speaks at length about such subjects, as they are a recurring theme in the history of Microsoft-developed products. Offering such a system results only in additional technical debt, and unless there is an extremely compelling reason, most teams are wise not to take on such debt.

But back to your original post, as I feel we're far off topic.

MSBuild format is rooted in an arbitrary, not well-defined/known schema (difficult to explore and discover).

You need to qualify what you mean by this; the schema provides an XSD against which the project file can be validated (http://schemas.microsoft.com/developer/msbuild/2003). This doesn't help you if you're using custom tasks, for which you should have provided the XSD when you wrote the tasks. Many community-based projects such as the MSBuild Extension Pack and the MSBuild Community Tasks do this.

The format is well documented on MSDN (https://msdn.microsoft.com/en-us/library/0k6kkbsd.aspx); an exploration of the project syntax is relatively straightforward to me personally.

If you could give a specific example of a common task developers are engaging in that is hindered by the current format, it'd help to understand your request.

Create a well-defined, well-known .NET object model that defines the Project Model (Project API)

Believe it or not, what you ask for already exists for MSBuild; however, officially it is not a sanctioned binary for third-party use and the APIs are not guaranteed to remain stable. Based on widespread usage, though, it has become de facto (probably much to the team's displeasure). The API in question lives in Microsoft.Build; the most familiar part of it to me is Microsoft.Build.Evaluation (https://msdn.microsoft.com/en-us/library/microsoft.build.evaluation.aspx).

Anyone who has done extensive deep dives into creating custom tasks or extending Visual Studio will be familiar with these APIs.
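
For the curious, a minimal sketch of what consuming that API looks like (assuming a reference to the Microsoft.Build assemblies; the project path, property names, and item type used here are placeholders):

using System;
using Microsoft.Build.Evaluation;

class ExploreProject
{
    static void Main()
    {
        // Load and evaluate an existing project file.
        var project = new Project(@"MyProject.csproj");

        // Properties and items come back as evaluated objects, not raw XML.
        Console.WriteLine(project.GetPropertyValue("OutputType"));
        foreach (ProjectItem compile in project.GetItems("Compile"))
        {
            Console.WriteLine(compile.EvaluatedInclude);
        }

        // Edits round-trip back to disk through the same object model.
        project.SetProperty("Description", "Edited via Microsoft.Build.Evaluation");
        project.Save();
    }
}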

Perhaps a better ask of the team is to make these APIs sanctioned such that the third parties you mention can more reliably write to the specification.

this doesn't help you if you're using custom tasks, for which you should have provided the XSD when you wrote the tasks

That sounds like -- and is -- a lot of work. :stuck_out_tongue: The goal/ask here is to move to a POCO-based model where we are working with well-known/well-defined POCOs, and the objects that are serialized/deserialized are those objects. Otherwise, we are asking developers to add "yet another artifact" with an .xsd file (are these even used anymore these days?) to use as a schema when they have already defined the schema with the Task object they have created.

If you could give a specific example of a common task developers are engaging in that is hindered by the current format, it'd help to understand your request.

I am open to admitting that I might be (most probably am) using the wrong words in describing my problem. I am fundamentally lazy and don't like thinking about things until it really matters, like you are pushing me to do! Essentially, there are two scenarios here to consider:

1) Build master (experts)
2) Developers (script kiddies/interested in becoming a build master -- hey, that's _me!_)

I will speak from my perspective (developer -- but I do have a lot of experience with TFS build servers). To start with, I will provide some context using what happens when I open a Xaml file. When I open a Xaml file, every symbol on that file -- regardless of the type -- is easily accessible to me. Using ReSharper, I can CTRL-B any symbol on that file, and I will be taken to its (decompiled) definition. This happens every time, no question.

Now, for MSBuild, I have to open up the project to edit it. First, it produces a very disruptive dialog asking if I want to close all files in the project to view this file. Secondly, once the file opens, discoverability is next to zero. Not to mention the conceptual paradigm in play is very awkward. If I want to get a list of files, I have to work with these PropertyGroup and ItemGroup tags and use a strange syntax to collect my files.

Whereas in Xaml I could see something more well-formed such as:

<msbuild:Project>
    <msbuild:Project.Resources>
        <msbuild:FileList x:Name="FilesToDelete" Files="SomeDirectory/**.*" />
    </msbuild:Project.Resources>
    <msbuild:Project.Tasks>
       <msbuild:DeleteFilesTask Files="{x:Reference FilesToDelete}" />
    </msbuild:Project.Tasks>
</msbuild:Project>

(Note that is a REALLY ROUGH sketch of what I would possibly like to see in an API model. Please don't make too much fun of it. :stuck_out_tongue:) But the point here is that as I am typing, tooling kicks in and I am able to reference tasks as I write them, since they are POCOs resolved to assemblies I have referenced.

I'm asking what these developers are doing and why they're not using the tooling provided to them?

That's just the problem. There is no tooling provided for XML editing of MSBuild files. Well, there is, but it is very prohibitive when compared to, say, the Xaml editing experience.

Perhaps a better ask of the team is to make these APIs sanctioned such that the third parties you mention can more reliably write to the specification.

Like I said, this is just to get the conversation going. Looks like I posted in the right place! :smile: Thank you for providing your input/perspective. I can tell you know what you're talking about! And also, give me your Twitter handle so I can tag you when I get pwned by the JSON crowd. (kidding... sorta :smile: )

What you don't seem to understand is that one can't just "enable support for 3rd-party providers" by flipping a switch somewhere

Isn't that exactly what the ASP.NET Core team did with its configuration provider API? That's pretty much the same idea here.

Isn't that exactly what the ASP.NET Core team did with its configuration provider API? That's pretty much the same idea here.

The ASP.NET team made the choice of accepting third-party configuration models, yes, and I'm sure they had good reasons to make that tradeoff; the fact that they needed to implement multiple configuration providers anyway for things like environment vars and config files probably factored into that discussion. In exchange for that flexibility, they get more complexity.

However, you have not given a good argument as to why MSBuild should become more complex. "It's the current fad in web development" is not a good argument.

This entire thread looks like the XY problem to me: you want to have a better MSBuild format, which is great, but you think it can only be achieved via your idea - to let everybody provide their own format - when there are plenty of other solutions out there.

@Mike-EEE The issue is more that you know the XAML APIs and (admittedly) the editor experience is less friendly for MSBuild. But if you really want to learn, see http://www.amazon.com/Inside-Microsoft-Build-Engine-Foundation/dp/0735645248?ie=UTF8&psc=1&redirect=true&ref_=oh_aui_detailpage_o00_s00

You can count me in the project.json camp, although what I loved about it was the simplicity:

  • here's my project name and description
  • here are the target frameworks I want supported and here are the dependencies for each
  • here are my framework-independent dependencies (more often than not, none)
  • done. Here's your NuGet package.

I don't care what format you use to instruct the toolchain to build my project as long as it is documented, I can edit it by hand (because sometimes NuGet just gets confused) and I can use the exact same thing on a dev box as on a build server and they both work exactly the same way.

Oh, and I don't need to maintain multiple files, one for VS to build my project and another one for NuGet to package it. Hey, VS, if you can figure out how to build it, I'm sure you can handle collecting everything and making a nice package for me.

Oh, and I don't need to maintain multiple files, one for VS to build my project and another one for NuGet to package it. Hey, VS, if you can figure out how to build it, I'm sure you can handle collecting everything and making a nice package for me.

See https://github.com/nuproj/nuproj by @terrajobst

However, you have not given a good argument as to why MSBuild should become more complex. "It's the current fad in web development" is not a good argument

Again, not making it more complex, but reducing its complexity. It's not just a fad in web development but a viable, popular pattern that has been used for quite some time in .NET. The ask would be to utilize this pattern for serializing objects that are used to describe MSBuild projects in a way that allows developers/organizations/teams to use the format they prefer.

I am starting to get the feeling that we should wait until more developers from the pro-JSON camp find their way onto this thread before attempting to provide a better argument. ;)

This entire thread looks like the XY problem to me: you want to have a better MSBuild format, which is great, but you think it can only be achieved via your idea - to let everybody provide their own format - when there are plenty of other solutions out there.

Haha... that's cool. I learned something new today. Thank you for the link :smile: My idea is to provide a better _model_ (which it sounds like everyone agrees with!) which can then be serialized/deserialized in _any_ format, if that helps clarify my position.

with an .xsd file (are these even used anymore these days?) to use as a schema when they have already defined the schema with the Task object they have created.

Yes; they're used all the time. In your Xaml example it's how Intellisense (and other such tools) knows what to present to you and how XML files are validated as "well formed". The JSON Kids haven't grown up enough yet to understand why such systems are required; it looks like they're starting to come around though based on a quick google search.

When I open a Xaml file, every symbol on that file -- regardless of the type -- is easily accessible to me.

You're asking for Intellisense; again, provided by a valid XSD which is automatically loaded per the directive at the top of every well-formed MSBuild project file. Out of the box this is only provided for the included "base" MSBuild tasks.

Does the one provided within Visual Studio not meet your needs? Below is a screen shot from one of our build scripts showing this in action:

[screenshot: IntelliSense for MSBuild in the Visual Studio XML editor]

A reasonable ask, I think, is for more contextual documentation here to improve discoverability; however, that is another subject, one I'd gladly up-vote, as I know when I was starting out it was frustrating to continually reference back to the documentation.

First, it produces a very disruptive dialog asking if I want to close all files in the project to view this file.

This is a limitation of Visual Studio; not of the chosen file format.

no tooling provided for XML editing of MSBuild files

Again, any XML-capable editor can do this; I personally recommend Visual Studio simply because it will parse the XSD and any other included namespace to give you contextual Intellisense/code completion.

And also, give me your Twitter handle

I've been told I need one; but honestly have never bothered to get on there. Feel free to @mention me anywhere on GitHub though.

@SamuelEnglard yeah, I hate to mention Xaml here, because the Xaml that is already associated (very negatively I might add) with MSBuild is actually Windows Workflow, which is really a bear to use (really, EVERYONE hates it and does not want "Xaml" because of it). I personally would like to see Xaml used to describe MSBuild files so it would be more like the current XML approach, but much more well-defined and discoverable.

See https://github.com/nuproj/nuproj by @terrajobst

Yeah, no. That's still requiring me to duplicate information. The only reason I should ever need to specify some bit of info again is because I want it to be different than somewhere else. E.g., if I have AssemblyInfo.cs in my project and I want the assembly versions to be different than the version of the NuGet package, then I would specify each. Otherwise, setting one should flow into the other.

I should be able to describe everything about my project in a single location. I should also not need to tell NuGet that I want it to take the output of my project and use that in the package. Why wouldn't I want to include that? And if I've already specified which frameworks to generate assemblies for, why do I need to explain which targets are being packaged, again? Pick all of them, unless I tell you otherwise.

Sensible defaults and a mechanism to override them...

You're asking for Intellisense

Actually, no, I am asking for more than that (from what I understand). Intellisense completes the symbols and provides tooltips, but to actually dive into the symbol to take you directly to the class file where it exists -- that is an IDE (or tooling, such as ReSharper) implementation detail.

Does the one provided within Visual Studio not meet your needs?

It's OK. But I find the Xaml experience much more expressive and discoverable. And intuitive as well. And I am not entirely sure that Xaml is using .xsd files, unless they are automatically created from the class definitions? That seems inefficient, as the class definitions are already in memory and available to the tools. It doesn't make sense to create a whole new file and then use that for navigation/information.

Also, another aspect we're overlooking here is .SLN files, which are their own animal (beast, more like it!) altogether and should be tamed/consolidated/considered into this new model as well.

@colin-young because it's all MSBuild, you can "embed" it into the existing project and have it pull the information from there. I think I'll fork it to add an example of doing that...

I've added #614 to discuss a better experience editing the XML, since that's really off topic for this issue.

@colin-young

If I understand what you're asking for, you want a workflow in which NuGet package creation is more tightly coupled with the build; this is already possible in MSBuild. It would require that you add a new target to your existing MSBuild file and then call the target at the appropriate time; if you were using MSBuild Community Tasks you'd call the NuGetPack task as appropriate (here's a snippet from one of our projects):

<NuGetPack File="%(NSPSTransformed.Identity)" OutputDirectory="$(BuildOutputNuGet)" Properties="Configuration=Release" IncludeReferencedProjects="true" ToolPath="$(nugetToolPath)" />

Reading between the lines you want a system that does this for you automagically; I'm not sure that specific business needs should be covered by the tool by default. At some point you will need to customize and modify the tools to fulfill your needs.

@Mike-EEE

but to actually dive into the symbol to take you directly to the class file where it exists

It's not clear what you would gain from being shown the source for a task such as "Move" or "Copy". 99% of the time, unless you're debugging a bug within those tasks, you're more interested in what the attributes (arguments) to the task are and what its behavior is; all of this can be embedded in the XSD. The version that they have provided and maintained is very simplistic, covering only the built-in tasks and the various attributes (i.e. "arguments") to be passed into the task.

And intuitive as well. And I am not entirely sure that Xaml is using .xsd files

It's slightly more complex than that; Intellisense will utilize methods such as scraping XML Doc comments to generate this information on the fly, but the end results are the same.

It doesn't make sense to create a whole new file and then use that for navigation/information.

Why not? The file is created in memory if anything.

Also, another aspect we're overlooking here is .SLN files, which are their own animal (beast, more like it!) altogether and should be tamed/consolidated/considered into this new model as well.

If you look at how MSBuild handles SLN files, they are actually transformed by MSBuild into pseudo-MSBuild files prior to execution to avoid the nastiness incurred within them. That being said, I found the format straightforward; if you created another issue page to air your complaints with them I'm sure we can show you how they operate.

They are also editable via the above linked API.

It's not clear what you would gain from being shown the source for a task such as "Move" or "Copy". 99% of the time, unless you're debugging a bug within those tasks, you're more interested in what the attributes (arguments) to the task are and what its behavior is

What you gain is a sense of discoverability and access -- not just for the default items described by the xsd but _any object defined_ in the file. You get a clear connection between the data you are describing and the object that ends up using those values. If you have not spent a lot of time in Xaml then it might not make sense to you, but when you have access to your code and can easily navigate through its properties and definitions, you not only get a better understanding of the elements at play, but also of the system as a whole. This is what is so awesome about .NET in general: being able to explore elements and see how they all connect and how they can be utilized.

Why not? The file is created in memory if anything.

Again, I am not sure if this takes place. Can you provide a resource showing that XSDs are used for intellisense? This is the first I have heard of this. And if a process is creating "yet another file" -- even in memory -- when the data it seeks is already in memory by way of symbols culled from a class definition, then obviously that is a very inefficient approach!

if you created another issue page to air your complaints with them

Truth be told, I have already done that here:
https://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/9347001-improve-reboot-visual-studio-project-system

:)

@SamuelEnglard @aolszowka This thread started from https://github.com/dotnet/roslyn/issues/11235, which was about improving the project definition format. To your point, what I'd like is a declarative project format rather than a prescriptive one. MSBuild is, by necessity, prescriptive. 90% of the time, I don't care how it accomplishes that, because I just want to say, "Hey .Net, see all these files? Can you compile them all into an assembly and then package it for all of these targets? Here's all the details of what to name it and the version."

To me the question is, should the tool that takes a declarative description of a project and produces the requested output be part of MSBuild, or part of something else? But I do feel very strongly that it needs to be standard across all of .Net (i.e. one file format on Windows, Linux, OS X whether you are using the command line or Visual Studio).

@colin-young You can put your version number in an MSBuild file and have your AssemblyInfo.cs (or part of it) generated by an MSBuild task. I've done this for my work projects, because we have ~100 projects that we want to build with the same version. Now, we only have the one place to maintain that version number. This same version property could be fed into the task that builds the nuget packages.

CommonAssemblyInfo.targets:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

    <PropertyGroup>    
        <Year>$([System.DateTime]::Now.Year)</Year>
        <Version>1.2.3</Version>
        <Company>FooBar Technologies</Company>        
        <Copyright>© $(Year) $(Company)</Copyright>
        <CommonAssemblyInfoFileName>CommonAssemblyInfo.cs</CommonAssemblyInfoFileName>
    </PropertyGroup>

    <ItemGroup>
        <Compile Include="$(CommonAssemblyInfoFileName)"/>
    </ItemGroup>

    <Target Name="CleanCommonAssemblyInfo" BeforeTargets="Clean">
        <Delete Files="$(CommonAssemblyInfoFileName)" Condition="Exists('$(CommonAssemblyInfoFileName)')"/>
    </Target>

    <Target Name="CreateCommonAssemblyInfo" BeforeTargets="CoreCompile">
        <ItemGroup>
            <AssemblyAttributes Include="AssemblyVersion">
                <_Parameter1>$(Version).0</_Parameter1>
            </AssemblyAttributes>
            <AssemblyAttributes Include="AssemblyInformationalVersion">
                <_Parameter1>$(Version).0</_Parameter1>
            </AssemblyAttributes>
            <AssemblyAttributes Include="AssemblyFileVersion">
                <_Parameter1>$(Version).$(BuildNumber)</_Parameter1>
            </AssemblyAttributes>
            <AssemblyAttributes Include="AssemblyCompany">
                <_Parameter1>$(Company)</_Parameter1>
            </AssemblyAttributes>
            <AssemblyAttributes Include="AssemblyCopyright">
                <_Parameter1>$(Copyright)</_Parameter1>
            </AssemblyAttributes>
        </ItemGroup>

        <WriteCodeFragment 
            Language="C#"           
            OutputFile="$(CommonAssemblyInfoFileName)"
            AssemblyAttributes="@(AssemblyAttributes)" 
        />
    </Target>
</Project>

Just Import this targets file into your .csproj file (<Import Project="CommonAssemblyInfo.targets" />). (It might need some tweaks; I sanitized this from the one that we use.) This way you also don't need to remember to update your copyright year every time the earth laps the sun.

@colin-young @MarkPflug Excellent; you can obviously chain that to whatever task you need, including packaging the NuGet package and pushing it out as needed.

"Hey .Net, see all these files? Can you compile them all into an assembly and then package it for all of these targets? Here's all the details of what to name it and the version."

I'm not sure what I'm missing here; this is exactly how MSBuild can work. What have you tried?

@Mike-EEE

What you gain is a sense of discoverability and access -- not just for the default items described by the xsd but any object defined in the file.

Again, I'm not sure what you gain by that; 75-80% of the content in an MSBuild Task is simply correctly hooking up the underlying tool into MSBuild and its logging system. You can validate this yourself by browsing the source or by reflecting on the binaries with your IL decompiler of choice. You'll see that the majority extend from the ToolTask helper abstract class, which performs most of the dirty work.
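
To illustrate that "plumbing" point for anyone following along, here is roughly what a minimal custom task looks like; this one uses the simpler Task base class rather than ToolTask, and HelloTask and its parameter are invented for the example:

using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// Nearly everything here is MSBuild wiring: a [Required] input parameter
// coming in from the project file, and output going through the
// engine-provided logger.
public class HelloTask : Task
{
    [Required]
    public string Greeting { get; set; }

    public override bool Execute()
    {
        Log.LogMessage(MessageImportance.High, "Hello, {0}!", Greeting);
        return !Log.HasLoggedErrors;
    }
}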

You gain no additional insight into how the tooling should be used beyond implementation details which should not be relied upon.

I assure you I've spent plenty of time in Visual Studio on plenty of projects, some of which utilize WPF pretty heavily; but I'm still missing where you feel this is any different from what is currently provided (which I will admit needs to be much better fleshed out).

Can you provide a resource showing that XSDs are used for intellisense? This is the first I have heard of this.

This is the first Google result for "Intellisense xsd"; it provides a very good high-level overview: https://msdn.microsoft.com/en-us/library/ms255811.aspx. If there is something more you need, let me know. The documentation claims it only goes back to VS 2005, which seems to jibe, as that's when I started using VS.

And if a process is creating "yet another file" -- even in memory -- when the data it seeks is already in memory by way of symbols culled from a class definition, then obviously that is a very inefficient approach!

How do you think Intellisense is being provided to you right now? I'm not sure you understand what you're arguing here, or maybe I'm completely off in left field as to why you think this is an issue. You realize that in order to gather documentation from XML Doc Comments, Visual Studio is creating a structure (one that can even be written out to an XML file) that contains the documentation for each function, right?

All of that is beside the point; I strongly encourage you to take the time to understand how Intellisense works before continuing this discussion; starting here https://msdn.microsoft.com/en-us/library/hcw1s69b.aspx, here https://msdn.microsoft.com/en-us/library/s0we08bk.aspx, and here https://msdn.microsoft.com/en-us/library/microsoft.visualstudio.language.intellisense.icompletionsourceprovider.aspx

Thank you for that UserVoice link; I'll direct additional input there as needed.

@aolszowka LOL I think we are definitely getting our definitions crossed here (and don't get me wrong, I really appreciate your dialogue!). I understand how Intellisense works from a functional standpoint, but from a _technical_ standpoint, I do not see any evidence of intellisense being dependent upon (or relying on) an xsd to operate. The article you showed me explains how intellisense operates on an XSD, but it does not say that it _requires an xsd_ to function. Does that make sense? Big difference. :smile:

Again, I'm not sure what you gain by that; 75-80% of the content in an MSBuild Task is simply correctly hooking up the underlying tool into MSBuild and its logging system

Right, and the tooling that provides for this is not as effective as, say, what you see in a Xaml-based context. Perhaps this is a matter of approach, but when learning a new API, I and the developers I know spend a lot of time navigating through objects/classes to learn the system. Documentation is really a second thought and rarely used. In fact, words get in the way of the real language: the code! :)

You gain no additional insight into how the tooling should be used beyond implementation details which should not be relied upon

Yikes, this is a very narrow (bordering on close-minded!) view! This might be the case in a "just bolt the task together, get it done and never think about it again" sense, but developers (again, the good ones I know) like to explore. When you impede this process (or make it difficult), it provides a poor experience. Poor developer experience results in poor adoption, and worse yet, poor sentiment for your product.

This is why Xaml developers are really passionate about their Xaml: because of the _experience_ that it offers. That is part of the goal here -- to improve the developer experience of this format and model.

I do not see any evidence of intellisense being dependent upon (or relying on) an xsd to operate. The article you showed me explains how intellisense operates on an XSD, but it does not say that it requires an xsd to function.

Actually it does; right under the first header:

"After a schema is associated with your document, you get a drop-down list of expected elements any time you type "<" or click the Display an Object Member List button on the XML editor toolbar. For information about how to associate schemas with your XML documents, see XML Document Validation."

Please understand that in this context schema is an XSD (XML Schema Document).

Furthermore, reading the in-lined link (see XML Document Validation) clearly states:

"The XML Editor checks XML 1.0 syntax and also performs data validation as you type. The editor can validate using a document type definition (DTD) or a schema. Red wavy underlines highlight any XML 1.0 well-formed errors. Blue wavy underlines show semantic errors based on DTD or schema validation. Each error has an associated entry in the error list. You can also view the error message by pausing the mouse over the wavy underline."

And even further:

"When editing an XML Schema file, the xsdschema.xsd file located in the schema cache is used for validation. Validation errors are shown as blue wavy underlines. Any compilation errors are also shown with red wavy underlines."

In an MSBuild file this is implicitly loaded by the namespace command:

xmlns="http://schemas.microsoft.com/developer/msbuild/2003"

This is exactly the same way it'd be loaded for any Xaml or other XML-based document format. This is not unique to any of these file formats but is part of the well-defined XML document standard.

The article you showed me explains how intellisense operates on an XSD, but it does not say that it requires an xsd to function. Does that make sense? Big difference.

That is a distinction without a difference; however, it is beside the point.

Right, and the tooling that provides for this is not as effective as, say, what you see in a Xaml-based context.

I'm not sure what you're saying here; I'm not sure how looking at the source of a Task (for example https://github.com/Microsoft/msbuild/blob/c1459f5cbd36f2c33eafc8f4ff087f6ee84c3640/src/XMakeTasks/Move.cs) gives you any insight on how the task (Move) is supposed to be used from within MSBuild; there is no requirement (although it is good practice) that a usage comment is associated with the code that defines the task.

Let's for a moment say they implemented what you described: when highlighted on a task name, hitting F12 (Go to Definition) takes you to either the reference source or a decompiled version of the currently referenced assembly. What would your workflow look like if you were attempting to use the task (not debug the task)?

Yikes, this is a very narrow (bordering on close-minded!) view!

This is a view commonly shared by consumers of any API; understanding how an API call is implemented may serve some desire to understand how things work, but it will not explain to you how it should be integrated with other parts of the API to perform useful tasks, nor what the expectation is of the person providing you the API.

Do you need to know how File.Delete(string) is implemented? How would that change how you utilize the function beyond what was documented for it? What about interfaces or header files, which give absolutely no implementation details (as is their purpose)?

When you impede this process (or make it difficult), it provides a poor experience.

For a "Good Developer" this has never been an impediment; even prior to the open sourcing of this project. Microsoft does not obfuscate their code; you could always have looked "under the hood" at the implementation of any of the given classes using any number of the available IL Decompilers.

However, as I have stated before, there is nothing to be gleaned from viewing the source of an MSBuild Task that is not better covered in the documentation, which at the end of the day is the guarantee from Microsoft.

Good developers also recognize at which layer of abstraction to focus their research time. Good product owners know where to invest time on features and how to deliver what is asked for. Good documentation writers produce enough documentation that groveling in the source should be unnecessary.

@aolszowka never in a professional environment, by someone who I would consider a professional developer (or Build Master), would I say I've heard a cry for a switch to a JSON based MSBuild.

Hi. I'm a professional developer. I was lead developer on several major projects at a top-20 stock exchange. I was the lead solution architect. I knew MSBuild/Team Build better than most of the Build Masters there and assisted them with developing and debugging their builds. I admit, I don't want a JSON-based MSBuild. I want MSBuild gone. It'd be nice if its simpler replacement used an easier format than XML for humans.

@aolszowka I know the community has had a recent influx of posers from the "DevOps" movement; but there are still the few who actually know what they're doing beyond a drag n' derp interface.

I'm in the DevOps camp, yes. I increased my current client's delivery pace over 100% in one year (with less staff), by adopting DevOps techniques. You know, delivering more value to the business in a shorter time. That's what "professional" software _engineers_ do. Oh wait, I'm a "poser". And I don't know what I'm doing if I don't have a drag n' drop interface. Hmm, where is that IDE? Oh yeah, I don't have one installed. Tough to drag and drop in a text editor dude.

@aolszowka Reading Sayed Ibrahim Hashimi's "Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build" should also be required reading for anyone claiming intimate knowledge.

Read it. Cover to cover. More than once. Years and years ago. And I still rejected MSBuild. So, wait, by your definition I have intimate knowledge. How can I not like MSBuild? Waiting eagerly for the "well, you obviously didn't understand it" response...

@aolszowka Looking at project.json (I had never heard of this project until today)

Wow. You're dismissing others' viewpoints left and right, denigrating their abilities and making snide comments about them being posers, but you don't know about the biggest shake-up slated for the .NET Core toolchain? Well, until yesterday, that is...

@aolszowka If you're attempting to target an audience that is not using Visual Studio then they should be encouraged to seek out the recommended project system of choice for their environment.

Yeah, I did. It was called project.json. The environment was .NET Core on OSX. But now they're forcing me to use MSBuild, and I don't wanna, 'cause it's rubbish IMO.

@aolszowka The JSON Kids haven't grown up enough yet to understand why such systems are required; it looks like they're starting to come around though based on a quick google search.

"Kids", huh? Haven't grown up enough? Gee whiskers, if you're calling me a kid, I'd love to hear about your professional experience.

Some developers look up from their environment from time to time and evaluate alternatives. They try out Python, or Java, or Maven, or Scala, or F#, or Fake, or SBT, or TypeScript. And they _learn_. And they _improve_. And they judge their primary development environment tools based on what they've seen delivering value elsewhere.

Other developers focus on one toolchain and one stack for their entire career, and never learn anything outside their ambit. They think their primary/only development tools are the One True Way™ and there's no point looking at anything else.

Just some friendly career advice: I've found that the second set of developers are always the ones who get retrenched first, and that they never reach the earning potential of the first set. But hey, that's just my experience after two decades in the field, from just a poser kid.

In fact, here's a screen grab of my Online Kindle Reader:

[screenshot: Kindle library]

@shederman

What caused you to reject MSBuild for your workflows, and how do you feel the suggested change will improve MSBuild such that you'd reevaluate it in the future?

Dude, it was just pain. _I_ could get it working, but very few devs spent the time or cared. So if I set something up in one project, they didn't transfer it to another. If there was a problem, I had to fix it. Even those Build Masters were clueless with a lot of the stuff; from what I could see, they were not in your league by any stretch. I'm a busy guy; I don't have time to chase after build issues.

I just want something that's easily edited by the devs, presented in a familiar and intuitive manner, and doesn't require complex internal knowledge. And I understand property groups, but for imperatively-minded developers they're counter-intuitive, an impedance mismatch.

MSBuild uses an inference system to determine what it will do and how it will do it. project.json used a declarative system to describe properties and dependencies. It didn't try and say what should be done, that was up to the toolchain, which _knew_ how to build stuff.

Is that sufficient for a fully fledged build system? No, it's not. But it is sufficient for microservice based architectures in many cases. We have small, simple services, with simple build requirements. Why do we need this inference system to describe the blazingly obvious? Restore packages, build, test, pack.

For the few occasions I needed something sequenced, a Powershell or Bash script file did the job just fine. Again, it's not complex stuff. We're not building an operating system. The complexity is in the deployment and management systems, not the build.

So sure, we'll need some sort of task-running system at some point. But does it really have to be inference based? The stream-based solution of gulp is very intriguing. Maybe not viable for .NET, but an indication of how you can have a task-based system with imperative-style logic.

@shederman

So if I set something up in one project, they didn't transfer it to another.

Is there a reason common includes or pre-defining projects using Visual Studio Templates did not fulfill this need?

doesn't require complex internal knowledge

The devil is always in the details; there is a balance that needs to be struck between flexibility, complexity, and ease of use for developers who are not focused on such tasks, I agree. However, I thought MSBuild struck a reasonable medium with regards to the requirements placed upon it.

MSBuild uses an inference system to determine what it will do and how it will do it. project.json used a declarative system to describe properties and dependencies. It didn't try and say what should be done, that was up to the toolchain, which knew how to build stuff.

Perhaps, as you say, I've "never learn[ed] anything outside [my] ambit", but I'm confused as to what you expect the system to do here; at some point you need to declare your dependencies. Could you provide an example of this?

Why do we need this inference system to describe the blazingly obvious?

Because this system is used for more than just simple microservices; it's used to build everything from a Hello World application to complex multi-tiered systems with numerous dependencies and build requirements. The extensibility provided by MSBuild strikes the best balance it can to meet competing needs.

For the few occasions I needed something sequenced, a Powershell or Bash script file did the job just fine.

This requires these additional tools to be shipped alongside the proposed build system; with MSBuild these can be embedded within the system itself. If your end goal is to reduce the amount of setup an end developer is doing, requiring Powershell or Bash to be configured correctly on the system is simply another (easily automated) hurdle.

I did not see a response to the last part of the question, which was: _how does this suggestion improve MSBuild such that you'd reevaluate it for your purposes?_

I'm not here to beat the drum to say that MSBuild meets all needs; it's far from a perfect solution for all end users. I am, however, here to beat the drum against trying to modify it into something it is not, and cannot be without significant investment that results in a fundamental redesign of the product.

At some point it is better to abandon such a product and move on to one that is better suited to address the needs of its developers.

Is there a reason common includes or pre-defining projects using Visual Studio Templates did not fulfill this need?

I've used common includes myself, but have always found them a pain as well. You don't often install/update them, so they tend not to be in new machine builds. This causes some hilarity when the devs can't build and are struggling to find out why, because a magic file is needed for it to work.

I've built VS Templates a couple of times, but found it a bit of a waste of time. The templates tend to require a lot of maintenance that doesn't pay itself back.

I think both those solutions _would_ work in a world where I have a team of capable Build Masters doing these things for me. I've had Build Masters at only one client so far, and it'd be a stretch to call them capable.

I thought MSBuild struck a reasonable medium with regards to the requirements placed upon it

Dude, this is the reason I'm engaging with you. I'm not trying to convince you that MSBuild is awful; I just want you to realize that a great many professional and capable development teams find it to be suboptimal for their needs. The middle ground it strikes does not come very close to their needs.

The extensibility provided by MSBuild strikes the best balance it can to meet competing needs

Sure. So it's a spork. I want a fork. Others want a spoon. But it's neither. Sometimes you have to be opinionated, and sometimes you have to realise that meeting all needs is not possible or economic. Additionally, most of my complaints relate to the _way_ the extensibility is implemented. There are other ways to do it.

If your end goal is to reduce the amount of setup an end developer is doing, requiring Powershell or Bash to be configured correctly on the system

Um, brew and chocolatey. Getting those tools installed is faster and easier than editing an MSBuild file. They're imperative, so devs grok them easily without an impedance mismatch. Are they perfect? No. Would a proper build system be better? Maybe. Would MSBuild be better? Not in my opinion, not for our use cases.

how does this suggestion improve MSBuild such that you'd reevaluate it for your purposes?

I'm not _trying_ to improve MSBuild. I don't want to modify it into something it's not.

I want it out of my toolchain entirely.

At some point it is better to abandon such a product and move on to one that is better suited to address the needs of its developers

Agree absolutely! But it was done, it was project.json. And then MS decided to back out of that decision and move back to MSBuild.

I think we're cluttering an issue thread with a lot of philosophical argumentation :-)

Drop me an email to continue the conversation.

I'm not sure what you're saying here;

I think that sums up this conversation to point perfectly. :smile:

Actually it does; right under the first header:

LOL... I appreciate the line-by-line of the article you asked me to read (and I did -- well, glanced over it, because I know most of it already :) ). We are definitely confused here. You are saying that Intellisense uses XSDs to work on an XML file, which is accurate, but not in all cases. In my mind, I was also thinking of Xaml files. Intellisense works just fine there without an XSD. Why? Because of the class definition. In this case (and I would argue it should be for all cases) _the class definition is the schema_. I was under the impression that you were saying that the XSD was generated automatically anyway in such cases, which is where I was asking for a reference.

you could always have looked "under the hood" at the implementation of any of the given classes using any number of the available IL Decompilers

Right, and what is the amount of effort required to do this? I have to go find a decompiler and learn how it works, then navigate to the file and start dickering from there. Contrast this with the experience of pressing CTRL-B (via ReSharper -- maybe F12 also works these days?) on any symbol in a Xaml file and being taken directly to the (decompiled) definition. Here the tooling is working with you, not against you.

Do you need to know how File.Delete(string) is implemented?

Yes, if I can. Seeing how it is implemented gives me knowledge as a developer and improves my skill, as well as providing me access and visibility into how other developers using .NET (and in the example you provided one who works directly for MSFT!) go about solving problems. Having access to this knowledge ultimately improves my skillset as a developer, and again separates the "good" (knowledgeable/craftsperson) developers from those who "just want to get the job done and move on to the next thing someone is paying me for."

All of this again leads to developer experience. The experience that leads to more knowledge/visibility with the least amount of work/friction is the one that developers will gravitate to and adopt.

What would your work flow look like if you were attempting to use the Task (not Debug the task)?

Now we're talking. There isn't a workflow per se, but more of a need for tooling access to the information that I require (or am simply interested in). The objective is to reduce confusion and improve accessibility and developer knowledge with a given library and/or API. Xaml development is the best example I can think of, which is why I keep referring to it, but I am really open to any system that provides its qualities.

@aolszowka some great discussion here that reflects some of the sort of sentiment I have seen (and was alluding to earlier) towards MSBuild: https://github.com/aspnet/Home/issues/1433

@shederman

I've used common includes for myself, but have always found them a pain as well. You don't often install/update them so they tend not to be in the new machine builds.

Is there a reason deployment methods such as NuGet were not used to keep these up to date (you also mention chocolatey and brew, both of which can do this as well)? Or an even easier step of checking them into the root of the project so they're updated at the same time your Developers use Version Control to pull changes?

Is that the only pain point you encountered?

I've built VS Templates a couple of times, but found it a bit of a waste of time. The templates tend to require a lot of maintenance that doesn't pay itself back.

I guess I found them no more work than editing the MSBuild file for your project; again, various deployment/environment management tools allow you to quickly update them to the standard you need your developers on.

I've had Build Masters at only one client so far, and it'd be a stretch to call them capable.

Would you by chance go as far to call them poser kids?

Dude, this is the reason I'm engaging with you. I'm not trying to convince you that MSBuild is awful, I just want you to realize that a great many professional and capable development teams find it to be suboptimal for their needs. The middle ground it strikes does not come very close to meeting their needs.

And the reason I'm engaging with you is that, apparently, I'm in the group of developers that likes to understand the issues other devs have run into (i.e. the limitations of the tooling), understand what their solution to the problem was, and then work out how I'd solve a similar class of issue should I run into it, either in my job or when someone posts a similar question on StackOverflow.

Additionally, most of my complaints relate to the way the extensibility is implemented. There are other ways to do it.

I'd like to understand this; what struggles did you encounter?

Um, brew and chocolatey. Getting those tools installed is faster and easier than editing an msbuild file.

Sure is nice when you cut off the portion of that comment that mentions (easily automatable). By this logic, though, the above solutions should have been easier to deploy; you, however, said it was a pain because you don't often install/update them?

I'm not trying to improve MSBuild. I don't want to modify it into something it's not.

Yet you're here; on this particular issue?

Drop me an email to continue the conversation.

Seeing as you didn't leave your email address, please feel free to hit me up first; it's my git username at a popular mail service from Google.

@SamuelEnglard

I believe you meant to mention @shederman rather than our esteemed @SamuelEnglard. Or did I just step in it again? :smile:

(excellent questions btw!)

@Mike-EEE

In my mind, I was also thinking of Xaml files. Intellisense works just fine there without an XSD. Why?

Because, as I stated in my above post, your Xaml file is including the xmlns and Visual Studio is performing this required task for you; in addition, it is also performing several XML Doc scraping techniques to gather the information it needs.

Right, and what is the amount of effort required to do this?

None; you're already doing it yourself. You yourself mention you use ReSharper; included in that suite of tools is dotPeek, which is one of many IL decompilers commercially available. This is what is occurring when you hit Ctrl+B in your environment. Other such tools would be Reflector, ILSpy, JustDecompile, and good ol' ILDASM.

tooling access to the information that I require (or are simply interested in)

The source to this tool is now openly available (with comments); what you are asking for basically amounts to an improvement to link you directly to the reference source (something done in newer versions of Visual Studio, such as 2015, when browsing to known sources such as the BCL).

While there are merits in looking at code, this does not help someone new to MSBuild understand the system; a design goal for your proposed improved system, if I've understood everything you've said here. I just don't see how it gets you there.

I agree this whole thing reads like the XY problem.

@Mike-EEE

Nope you were spot on thanks for the catch!

@aolszowka My email address is on my public github profile...

Would you by chance go as far to call them poser kids?

I don't believe I called anyone any such thing. Ever.

That would be you.

I am tired of you continually trying to make this argument about software tools personal. Please edit or resubmit the questions, or send them to me on my private email address.

Because as I stated in my above post your Xaml file is including the xmlns and Visual Studio is performing this required task for you

OK... so here we have our issue. I do not believe XSD is used at any time during Xaml file schema resolution. But I am not 100% familiar with intellisense internals, and this is getting outside of the scope here, and I want to keep this on-topic (did I really just say that?!)

This is what is occurring when you hit Ctrl+B in your environment

Right, but this does NOT happen in an MSBuild file. Incidentally, if MSBuild switched to/supported a Xaml format, this would all work as requested with very little effort. :smile:

While there are merits in looking at code this does not help someone new to MSBuild understand the system; a design goal for your proposed improved system if I've understood everything you've said here. I just don't see how it gets you there.

Haha... sounds like a :-1: from @aolszowka here. I think I've done my best here in explaining my position to you. Sounds like I have done a poor job. Maybe some other perspective/conversation will help me get my point across at some later time. :sunglasses:

Two random ideas I'd like to throw out there:

  1. Trivially importing props from another format. A lot of the nice-ness around project.json was being able to edit the configuration trivially. Removing the cruft and doing something like:

MyLibrary.csproj

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project>
  <Import Project="project.json" Type="application/json" />
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
</Project>
```

project.json:

```json
{
    "name": "MyLibrary"
}
```

With things like references, <ItemGroup><Compile Include="**\*.cs" /></ItemGroup>, namespaces, and such included by default. Project.json would dictate packing, naming, includes/excludes, and dependencies, while csproj would actually glue it all together. Easy, human-readable and editable configuration. More convention over configuration. ~~Everyone~~ A lot of people win.

  2. Make it trivial to shell out to something like .csx or other scripting tools for easy extension. Like it or not, writing tasks is not simple currently. FAKE, Cake, and psake for cross-project and deployment orchestration, msbuild for project glue and configuration, and the ability to drop into scripts for extending project builds, all while minimizing the need to interact with the Xml people seem to hate so much.
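To sketch idea (2) -- with the caveat that the script tool and file names here are placeholders, not part of the original proposal -- the shell-out could be as thin as a single target:

```xml
<!-- Hypothetical: delegate a custom build step to a C# script instead of inline XML. -->
<!-- Exec is a stock MSBuild task; "dotnet script" and build.csx are illustrative only. -->
<Target Name="CustomPackaging" AfterTargets="Build">
  <Exec Command="dotnet script build.csx -- $(Configuration)" />
</Target>
```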

Let me know if any of this seems wrong or missing something. I know this is still an oversimplification of what would be required. Lot of inspiration from @davkean, especially for (1).

So basically:

Let's remove pretty much everything about MSBuild, except make MSBuild the coordinator

I mean it's a step in the right direction. But why do I need or want MSBuild involved at all then? Why am I spinning up ancient legacy tech, that a large number of developers have grown to hate, to build?

What's MSBuild's use case in such a scenario?

  • project.json is acting essentially as the project file
  • project.json/nuget.json is the references
  • Scripts for project builds

What does MSBuild do exactly?

  1. We run dotnet build
  2. It spins up MSBuild
  3. Which does nothing other than coordinate a well-known list of steps

So, again, what does MSBuild do? And if it does so little, why are we using it?

No, no, how about this:

  1. Build on the project.json project system
  2. Have dotnet build able to restore, build, test, and pack
  3. Allow an extension point to shell out to MSBuild for people who want it
  4. Also allow said extension point to shell out to other build systems

Essentially, we have a "basic process" built into the tools, which can be extended by custom builds if required. The only downside I see about that is that it probably keeps commands in the project.json file, which theoretically should be a pure project definition.

"build": "msbuild MyProject.csproj"; // Ta-daa!

OR, my original proposal, have both. If you have a csproj delivered project, build it using MSBuild, and if you have a project.json delivered project, build it using the built in simple tools.

And the syntax I suggested there is only one option.

We could, e.g. run msbuild if there's a project.msbuild file in the dir. There just needs to be something that tells the "default" system to use a "proper" build system.

A lot of the root cause of MSBuild's over-involvement in the VS system is that it tries to do everything. It does references, and build, and project definition? Why? Because it's the root of the build system.

It shouldn't be. The root of the build system should build a raw project definition.

MSBuild should be an _option_ for those needing its power and extensibility.

MSBuild should be an option for those needing its power and extensibility.

Really liking this approach! At the very least we need to have more of an options-based solution where you pick your own builder. We have this for pretty much everything else in the CI/CD toolchain... but build is something that is assumed to be a particular tech, precisely how you describe here.

Also digging @RichiCoder1's suggestion:

Make it trivial to shell out to something like .csx or other scripting tools for easy extension

Is that PowerShell? There definitely needs to be better integration with that as well. (and also, PowerShell should be more like C#, but that's another discussion. #shamelessplug :stuck_out_tongue: ).

"Ancient legacy tech" like msbuild is usually well tested, covers many cases you are blissfully unaware of (thorough battle tested scenarios), has good integration with many other components and tools.

In a world where .NET is truly a universal platform: all devices, all application types, you need msbuild.

Here are some "ancient and legacy tech" that is as old or older than msbuild that you might be familiar with: UNIX, Linux (powers the world), C (another one), C++ (another one), C#, Visual Studio, MacOS.

Like those, msbuild has not stopped evolving.

Like those, msbuild has not stopped evolving.

Easier statement to accept if it were possible to write MSBuild definitions in JSON. :smile:

No one doubts its power and capability. I am in agreement that MSBuild should tag along for the ride ( @shederman wants off the bus though LOL!). The challenge (or one of them, at least) that this issue is attempting to address is that how you go about interfacing with MSBuild is still very much 2002/2003. I mentioned this in another thread, but it's much like creating a VSIX in many ways: it feels arcane and unlike anything "new" that you see today.

Yeah yeah, I know it's a fool's errand to go chasing after the latest shiny toy, but the _best_ aspects that make these toys appealing should at least influence a good product in some way. I think proper evolution accounts for this, C# probably being the best example.

Nobody is saying we should deprecate MSBuild. Nobody is saying MSBuild doesn't have a future.

All we're saying is that right now, there are two build systems, we like the "new" build system, and we want to keep it. We're saying that MSBuild doesn't work well for a lot of our simpler scenarios. So, just let us keep the "simple build" alongside MSBuild support. Why take the choice away? Why force a complex, counter-intuitive build system on people who just want to compile the stuff in the directory? If we want to run a non-MSBuild complex build system, why force us to run it _through_ MSBuild too? I mean, that's ridiculous.

I'm even offering to do the work. Put up or shut up, right? I will do whatever refactoring is required in order to keep the "new" build system working side by side with the old. As long as you don't make it impossible for me, of course 😉.

And it's easy to say MSBuild has been evolving, but I've seen more offered "evolution" of MSBuild in the past week than in the past 5 years! Why? Because there's a prospect of competition. But even significant evolution won't change the fact that it's really more ideal for dedicated build masters running big CI pipelines and complex builds. We don't all have that these days. Not even in many of the corporates. We have simple builds, and simple CI pipelines, and complex deployments. We tend to treat our builds like cattle, not pets - just like our servers.

I personally believe one of the selling points of .NET Core and ASP.NET vNext was to attract a new generation of developers to the platform, to help counter the slow slide in .NET adoption. Is removing one of the selling points of .NET Core, one of the core promises - and moving back to a system despised by a large fraction of your community (especially those working in the side of programming where the new generation cuts its teeth) really going to help?

This was a bad decision IMO and made without considering all your users or their opinions, so please stop doubling down on it. Admit it, look at alternatives that allow a side by side solution, and move on.

Solutions have already been proffered for a smooth side by side experience.

Saying MSBuild will be shiny and new just because it (say) can now use JSON is so naïve. project.json might look simple for simple things, like defining dependencies, runtimes and stuff. If you have done any serious build work, you know that covers about 10% of what you can do with MSBuild. And, if you need to do those, I'm positive the JSON editing experience will be even worse than XML (as of today's "support" in VS for editing either).

Now, if you were proposing yaml.... ;).

I think there are ways to make MSBuild more approachable (better intellisense, better factoring/split of a .csproj for easier tweaking - .props/.targets alongside .csproj by default), but its core concepts and features aren't all that complicated to grasp.

Its power and flexibility are only evident once you go beyond "it's a simple coordinator", which is what project.json might suggest is all there is to a complicated build.

Now, if you were proposing yaml.... ;).

Well yes... suggesting that and much more, actually. :smile:

I will say that I think we do a disservice to MSBuild and to the problem at hand by bringing in formats, or rather to describe the problem by using/comparing formats. It's unfortunate that the .NET Core build system only supported JSON, and that MSBuild only supports XML. At the end of the day these formats are (or should be) representing serialized models that are deserialized into memory at some point for the system to execute. But because these systems only support a certain format, that complicates the issue as developers turn it into a religious battle on who has the superior format (never mind the technology!). Additionally, the locked-in format tends to lend itself as the lightning rod for overall user/developer experience and product association -- for better or for worse.

Its power and flexibility are only evident once you go beyond "it's a simple coordinator", which is what project.json might suggest is all there is to a complicated build.

You sort of touch on this, but one thing that is being made apparent to me through this is that there is definitely a conceptual separation between _project definition_ (what .NET Core build is "good" at) and _process definition_ (what MSBuild is "good" at). Having better logistical separation of files (as you suggest!) in the format that developers prefer would go a long way, I feel.

(Btw, I appreciate the heavyweights dropping by to grace this topic with their thoughts. @kzu and @migueldeicaza your opinions are _always_ welcomed! Thank you for taking the time to share. :+1: )

@kzu As @Mike-EEE said, I don't believe anyone is saying we want MSBuild to support JSON. What's kind of evolved out of some of the discussion is that the project definition should probably be json based, and that the build process definition should be based in whatever language the "big" build tool uses (i.e. XML for MSBuild).

But a lot of builds won't _need_ a big build tool, their build is just to build the project definition, without any complexity. If so, why would MSBuild even be required in such a case?

Not arguing it shouldn't be _available_, we all occasionally need a powerful build system. But I don't believe it should be the default for all project types.

Especially given that this lightweight build exists, and is working. It's only the attempts to turn it into an MSBuild analogue that cause a problem. It shouldn't be that.

I don't believe anyone is saying we want MSBuild to support JSON.

Welllllll... technically I am. :smile: (And Yaml, and Toml, and Xaml and ...)

But this issue (to kindly remind, it is for a Format Agnostic MSBuild, as well as an improved model/representation via POCO) is an "eventual" or future ask/goal, not meant to be considered in the current raging debate. But if it can influence it, all the better. :+1:

Wellllllll, MSBuild already has an in-memory POCO representation of the files, so it shouldn't be too hard for anyone to write another serialization of that. It's all in Microsoft.Build.Construction ;)

And from that model you can see that the underlying concepts aren't complicated at all, IMO

Exactly what we're aiming for here, @kzu!

But to be fair, I think you meant the Construction namespace. :smile:

The goal being that any developer can "bring their own serializer" and MSBuild "just works" nicely with it.

Currently, if you look in the files (sample) you can see that they are all completely entrenched with XML-centric modeling/entities, which is a primary challenge at present.

It also looks like they do not have public parameterless constructors, which will not work out-of-the-box with most serializers.

That is true. Maybe it's "just" a matter of POCO-izing a bit more that set of classes?


Maybe it's "just" a matter of POCO-izing a bit more that set of classes?

Well-and-wisely-said. It's a slippery (or challenging?) slope. If we POCO-ize them, then the logic that is currently within the objects will most likely move into other classes/services.

And also... while we are in there... are we now defining "elements" or "objects"? :) ProjectItemElement describes or alludes to xml document nomenclature, whereas ProjectItem seems much more like a POCO. Well, to me, at least. :sunglasses:

No small task, but this is exactly the path I am thinking of.

@Mike-EEE I'm not sure what "format agnostic msbuild" will solve. You would still need to understand the concept of properties, items, tasks, targets, order of evaluation, dependencies etc. And how is it supposed to work? Everyone writes in whatever format they want and msbuild can read/write all of those - based on a file extensions or some header in the files?

Won't it make it more difficult for "casual" users who look for a solution on StackOverflow etc.? Now they need to map it to whatever format the project they are working on happens to be using. It would make it more cumbersome to help people and to read+debug targets files (context switching between different formats, ensuring that escaping was correctly done in the files).

What is the real problem that you are trying to solve here?

Like @kzu said, moving the defaults to props files should mostly reduce the csproj file to the most relevant bits - list of files, references, etc. - which should make it easier to make sense of and edit.
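As a rough sketch of what that reduced csproj could look like (illustrative only; the props file name is invented):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Shared defaults moved out of the way into a props file -->
  <Import Project="MyLibrary.props" />
  <!-- The bits a human actually cares about: files and references -->
  <ItemGroup>
    <Compile Include="src\**\*.cs" />
    <Reference Include="System.Net.Http" />
  </ItemGroup>
  <!-- Standard build logic -->
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
</Project>
```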

Also, I think it would be pretty straightforward to create an msbuild task and accompanying targets that turn an arbitrary file into an MSBuild file that can be imported. With proper inputs/outputs, you wouldn't even notice ;)

For that to work flawlessly though, we'd need to allow Import inside a Target (the one after we generate the MSBuild from the YAML/JSON/Whatever).... ;)
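Shaped roughly like this -- not legal MSBuild today, since Import must be a direct child of Project, which is exactly the relaxation being suggested (and the ConvertToMSBuild task here is invented):

```xml
<Target Name="ConvertProjectDefinition"
        Inputs="project.yaml" Outputs="project.yaml.targets">
  <!-- Hypothetical task that serializes YAML/JSON into an MSBuild file -->
  <ConvertToMSBuild Source="project.yaml" Destination="project.yaml.targets" />
  <!-- Disallowed today: Import cannot appear inside a Target -->
  <Import Project="project.yaml.targets" />
</Target>
```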

But a lot of builds won't need a big build tool, their build is just to build the project definition, without any complexity. If so, why would MSBuild even be required in such a case?

Even regular C# projects depend on the logic in the Microsoft.*.targets files, which do a lot of things: figuring out assembly references, building resources and satellite assemblies, doing all that for project references, making sure those are available for the compiler to use, bookkeeping for future cleans, incremental builds, incremental cleans - all while providing hooks to allow the user to tweak the build process as little or as much as they need to. There is a lot of stuff that happens "behind the scenes" in the msbuild targets files and tasks.

And if you are using products like Xamarin.Android/iOS, which have more custom requirements to work with other tools to build, package, deploy, have non-.net target frameworks etc, then all that logic goes in the msbuild targets and tasks files. The user still sees the regular looking csproj with the source/content files, references etc. All the complexity is still hidden in the targets/tasks files.
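For what it's worth, those hooks are ordinary targets; a project can already attach its own step to the standard build without touching the shipped targets files, for example:

```xml
<!-- AfterTargets wires a custom step onto the end of the standard Build target -->
<Target Name="ReportOutput" AfterTargets="Build">
  <Message Text="Built $(TargetPath) ($(Configuration))" Importance="high" />
</Target>
```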

And how is it supposed to work? Everyone writes in whatever format they want and msbuild can read/write all of those - based on a file extensions or some header in the files?

Correct, it could be a combination of solutions, and I have given my thoughts in the OP. Totally open to discussion/consideration! This is just to get the ball rolling. And it appears at this point it is rolling up a very, very steep hill. :smile:

Won't it make more difficult for "casual" users who look for a solution on stackoverflow etc, now they need to map it to whatever format the project, they are working on, happens to be using?

Same could be said w/ C# vs. VB.NET. :smile: I am sure you have seen the differences of projects in different formats. As long as the model is the same, the format won't introduce variance in how the model behaves once deserialized, correct?

What is the real problem that you are trying to solve here?

There are three (that I can think of at the moment):

  1. One of them, as I (tried to :smile: ) state in the OP, is that developers/organizations really love their formats, especially after the project.json movement (I invite you to read/participate in this thread to see some of the sentiment towards JSON). Me, I personally am very much partial to Xaml. By restricting to a particular format you are precluding developers from using their format (and tooling) of choice. Making project definition format agnostic allows developers/organizations to define their builds in the format they prefer, along with whatever tooling has been developed around those formats.
  2. The second is that of tooling. Here I bring up Xaml again because I think it has the most mature tooling built around it. Here is an article that I wrote that sort of sums up why I prefer it when defining application entities. If we are free to use any format we like, then we can take advantage around any tooling that has already been created around that format to work as productively as possible.
  3. Consistency. I don't know about you, but I prefer to work in C# for my code. If I can help it, I want to work in Xaml for all my data. Having one consistent format within solutions reduces complexity, context-switching, and overall confusion/hassle. Developers/organizations have their own recommended formats because of this.

Hopefully that helps clarify my perspective. Based on the feedback so far I am/have been doing a terrible job! :stuck_out_tongue:

For that to work flawlessly though, we'd need to allow Import inside a Target (the one after we generate the MSBuild from the YAML/JSON/Whatever).... ;)

Appreciating the ideas here, @kzu. Thank you for brainstorming and providing constructive ideas and feedback. :+1:

Just to be sure here, I'm old skool here and have over a decade's (on and off!) experience going through MSBuild "scripts." I am "ok" with the XML experience, but I would definitely like to see it improved if it can be helped. My goal/intention here is also to consider/onboard the JSON camp and help improve perception/adoption towards MSBuild and its perceived (poor) product experience.

@Mike-EEE

help improve perception/adoption towards MSBuild and its perceived (poor) product experience.

For me, that's a simple solution: stop making it the core of all .NET build experiences. MSBuild is a very powerful system, but is complete overkill for 90% of use cases.

So here's a simple question. I get why

msbuild project.xproj

Would use MSBuild. I get it, I'm happy with it.

Why would

dotnet build project.json

Use MSBuild exactly? I mean, you've got msbuild, right? What does dotnet build add? Nothing other than confusion. If you're going to deprecate dotnet build in favor of msbuild, have the honesty to actually pull the trigger on it.

@aolszowka No offense, but you know less about how IntelliSense works than I or @Mike-EEE does. XML schemas are used when editing XML, sure. I highly doubt they are used anywhere else though. Like @Mike-EEE says, it would make no sense to do so. You mention XML documentation, but code that doesn't compile XML documentation (the default) and doesn't even include triple-slash comments still produces IntelliSense. Yes, XAML has an XSD, but if you look into that, it includes very little of what your XAML files actually contain or that IntelliSense displays. Now, there may be some internal implementation detail that means XSD is produced from other sources internally, in memory, but you'll have to prove that before I believe it.

In any event, MOST IntelliSense comes directly from sources other than XSD. The fact that you have to manually produce an XSD to get IntelliSense within an MSBuild file is a non-starter for most. That said, that's obviously a tooling issue that can be fixed, and not a complete argument against MSBuild.

I'm not taking sides in that argument, but it's not helpful to have you ignoring other people's input solely based on "IntelliSense uses XSD" when in fact, it doesn't (beyond some hypothetical internal requirement of the implementation that hasn't been shown to be true).

Haha... @wekempf thank you for that. I think it's safe to say that I have found the code/technical equivalent of my anger translator. :smile: You are welcomed to explain my position any time, haha. :grin: @aolszowka is clearly a super smart cat. We just got caught up in the weeds in trying to articulate my ideas/position, which can be absolutely exhausting I might add. :stuck_out_tongue: I am hoping that we can continue constructive dialogue towards improving MSBuild and its current marketplace perception while maintaining all the power/functionality that those who have invested into it already enjoy without reservation. :+1:

Here's the thing: if you aren't a build/release engineer or otherwise intimately acquainted with msbuild, it's a miserable experience.

Without getting into specific file formats (I don't want to debate XML vs JSON vs YAML vs XAML) all I want (and I suspect many others do also) is the ability to quickly create a new standard project type (assembly, console app) that can immediately build and package on _any_ supported .Net platform that has _any_ .Net toolchain installed. i.e. if I have a full luxury edition Visual Studio install on Windows, the Core SDK on OS X or the Core SDK installed on my Windows build server, I can just do:

  1. git clone {whatever}
  2. {installation specific build command}, e.g. dotnet build or Build - Build Solution

There's no "if you're using the SDK you need to copy these targets from a machine that has Visual Studio installed" and I don't need to worry about magic csharp.targets or any of that nonsense.

It's the difference between saying:

  1. here's a list of ingredients for a cake, go bake me a cake

and

  1. go out to the garage and find this list of parts
  2. take those parts and assemble them into a car
  3. drive to the store, following this route
  4. for each ingredient, find it in
  5. take the following route home
  6. in the following order, do with each ingredient
  7. ...

That's all well and good if you're trying to do something weird, but when it's something simple like building an assembly and packaging it into NuGet package or creating a simple one-assembly program, msbuild is overkill. I don't need or want all that ceremony.

The mistake was trying to make project.json do everything instead of recognizing that it was really good at one set of tasks, msbuild is really good at another (much larger) set of tasks and figuring out how to make them work together. It's like progressive enhancement. You can start with a simple declarative project description, and when you need the full power of msbuild you can add it.

Some tools to aid in managing build files would be nice too...

@Mike-EEE Yep. I don't intend to tell @aolszowka to stop arguing his position, on topic. But his technical argument about IntelliSense is wrong, not only in theory, but in practice as well. When half this thread is taken up with the back and forth between the two of you on that specific point, we're all wasting time here.

If MSBuild files were to acquire the same IntelliSense support that other file types have, even if it did so through the generation of XSD files (provided that generation was friction free), then another MSBuild pain point would be ticked off the list. There's a lot of them to tick off, though; some more important than others (while I don't care for XML in this context, I'm not convinced any other format is a big enough "win" to matter, for instance). The best idea I've actually heard was to separate the build definition from the project definition. Do that and it becomes the project's choice to use MSBuild, Make, PSake or any other solution they care to.

I actually mostly like MSBuild, to be honest, though comments about "posers" (never constructive) and "build masters" sure indicate that the tool is far too complex, at least for the simple tasks, if not in general. A good tool makes it easy to accomplish the simple tasks, and possible to accomplish the complex ones. I'm not convinced MSBuild, today, does that.

@wekempf
I have no problem being called out on anything I've said here; but please take the time to test and prove it to yourself before calling me out.

You could have easily proved this to yourself; as could have anyone in this thread that was unfamiliar with this; but instead I'll do it for you and others in the hopes of moving this conversation forward.

If you open up a new MSBuild File in Visual Studio and start with the default Project Tag you'll notice that there is no Intellisense being provided to you (See screenshot):

[screenshot: no MSBuild IntelliSense in a bare project file]

Now if we add the xmlns (basically importing the schema/xsd) you'll immediately start to get Intellisense:

[screenshot: MSBuild IntelliSense after adding the namespace reference]

The specification says that editors should be free to download from the specified location; but in Visual Studio's case (and I'm speculating on this point) it looks like they made the decision to cache the XSD locally to avoid the web call. You can find the XSD for Visual Studio 2012 located in C:\Program Files (x86)\Microsoft Visual Studio 12.0\Xml\Schemas\1033\MSBuild
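Concretely, the difference between the two screenshots is nothing more than the namespace declaration on the root element:

```xml
<!-- No IntelliSense: -->
<Project>
</Project>

<!-- IntelliSense, resolved via the (locally cached) MSBuild XSD: -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
</Project>
```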

The XSD standard allows for comments; if you look at that file you'll see there are several comments already available. So I could quickly show this in action, I simply edited the documentation provided for ToolsVersion:

[screenshot: the edited XSD documentation surfacing in IntelliSense]

I hope that settles both your and @Mike-EEE's concerns: the Visual Studio editor does indeed use XSD to provide IntelliSense, as has been documented by Microsoft in several places, since the MSBuild file is nothing but an XML document.

@colin-young

That's all well and good if you're trying to do something weird, but when it's something simple like building an assembly and packaging it into NuGet package or creating a simple one-assembly program, msbuild is overkill. I don't need or want all that ceremony.

I get, and agree with, the sentiment here. But "if all you're doing is creating a simple blog, using C# is overkill" can be said as well. The power of MSBuild isn't really the issue. The complexity certainly is, but there may well be some things that can be done to reduce that complexity, at least in the "simple" cases. Maybe not. But dismissing the discussion about that is akin to the claim that Microsoft is dismissing the discussion of throwing MSBuild off the bus.

I don't want to debate XML vs JSON vs YAML vs XAML

while I don't care for XML in this context, I'm not convinced any other format is a big enough "win" to matter, for instance

I would like to kindly remind everyone that the purpose of this thread is to remove the dependency on format altogether, so that there is no longer a debate/battle at all :) If anyone can choose the format they wish to bring to the table (or "Bring your own serializer", as I saw in another issue) then they not only have the freedom to work with MSBuild in a manner that suits them best, but also avoid having to argue about the most optimal format (as they see fit) to use in the first place.

The best idea I've actually heard was to separate the build definition from the project definition, for instance.

This is probably the biggest/most beneficial yield from all of @shederman's ranting and raving :smile: , so I credit him: the notion that there is a project definition and then a build (process) definition. I agree this is a major "aha" for me personally, and I would love to see this adopted in some way going forward.

@aolszowka You're simply wrong. I'll prove it to you. Create a C# project add a class and see you get IntelliSense.

See, you're arguing from a very narrow POV. You are 100% correct about how the XML editor provides IntelliSense. You're also 100% wrong on how Visual Studio provides IntelliSense.

@wekempf
We're focused on how documentation could be provided to MSBuild in this conversation, not how Intellisense operates for other source types or editors. I think if you read up above, I mention that XSDs are but one of many patterns Intellisense uses to gather input for its purposes.

Hahaha OMG @aolszowka at this point I am happier that at least someone else (@wekempf) understands what I was saying rather than trying to correct your understanding of what we are trying to convey. :smile:

@aolszowka Yes, and your POV is still too narrow. XAML is "just" XML. The XAML editor in Visual Studio, however, is NOT the XML editor, and it provides IntelliSense for everything without using XSD files. Create a WPF project and prove that to yourself, if you insist on such non-"proofs" on this topic. What @Mike-EEE is suggesting is that MSBuild files could have a much better editing experience if we don't treat them as "just XML".

@aolszowka BTW, I've never considered MSBuild's XSD files to be well thought out. You add properties to a property set with arbitrary element names that the editor flags as errors, for instance. That's not a good experience, especially for those new to/learning MSBuild.

@colin-young with everything becoming/being nugets, even down to MSBuild itself and even the compilers, you get your git clone && build for free and without changing a bit of anything ;)

@kzu Maybe eventually, not today. And we've a long way to go before that's true in the future. There are still pain points in bundling .targets up in NuGet today, for instance (chicken-and-egg issues that require you to bootstrap before you can really build). These things can certainly be fixed, and there are workarounds now, but "without changing a bit of anything" is overselling it.

@wekempf I'm not trying to dismiss any of the discussion here. I was merely trying to point out that:

  • there will likely be a lot of input from people in the "I never want to touch msbuild" camp if we do indeed end up with msbuild as the default project format, and they are going to have very different motivations and requirements than the typical msbuild expert/power user
  • we could really use some better tools for editing msbuild files (I'm not touching the XML/XAML/Intellisense discussion myself as I have no dog in that hunt)
  • just maybe we're (I'm?) trying to attack 2 different problems at once

    • a declarative description of the purpose and outputs of a project

    • instructions about how to achieve that

Any discussion about ways to improve msbuild is important, IMHO.

@kzu it's a bit difficult to comment without knowing what the final file formats are going to look like. I really just wanted to express my experience with what I liked about project.json and how it eliminated a class of issues I've historically encountered with trying to build stuff on headless build servers. I will continue to follow the latest changes in the .Net ecosystem, and will continue to express my opinion when I feel that some of my favorite features might be under threat.

I think the problem is that right now an MSBuild file (.csproj, .vbproj, etc) is just treated as an XML file. This means that it only gets the XML intellisense provider, which is entirely based upon the XSD provided, as @aolszowka has repeatedly pointed out.

Rather than suggest that MSBuild be able to consume arbitrary file types (json, yaml, XAML, etc), which I think is an unrealistic idea (as well as a bad idea), I think it would make sense to suggest that there be a custom intellisense provider for MSBuild files. For example, when I type "$(", the intellisense provider could be smart enough to provide a completion list of all the properties that are currently in scope. The same for "@(" and items. F12 navigation for items and properties would also make MSBuild files a lot easier to author. I think this is what people are really asking for: they want it to be easier to author. XAML is easier to author because it has really nice intellisense support. That intellisense comes from the rich object model that drives XAML. There is a lot of "reflective metadata" that could be made available to an MSBuild file without completely re-imagining its fundamental nature (XML).
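For reference, these are the kinds of expressions such a provider would need to understand (plain MSBuild syntax, trivial example):

```xml
<PropertyGroup>
  <OutDir>bin\$(Configuration)\</OutDir>  <!-- $(...) expands a property -->
</PropertyGroup>
<ItemGroup>
  <Compile Include="src\**\*.cs" />       <!-- declares items -->
</ItemGroup>
<Target Name="ListSources">
  <Message Text="@(Compile)" />           <!-- @(...) expands an item list -->
</Target>
```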

Why do I think that arbitrary formats is a bad idea? I'm glad you asked. Because it means that when I join a new project, I will likely have some new flavor of build file to learn. I disagree with the opinion that json is somehow magically easier to merge than XML. Both can be easy to merge, or a disaster to merge depending on the formatting and structure. XAML is just XML, so it clearly cannot be easier (or harder) to merge.

I think a lot can be done to simplify the default project templates, and make them easier to maintain, but that is a completely different discussion (that is currently happening in other github repos).

That intellisense comes from the rich object model that drives XAML

To be sure, that rich object model is actually the class definition(s) of the .NET POCO(s) you work with, which is what we're aiming for here. It is powerful (and dynamic) because if you change a property to your POCO, you've automatically changed the schema to the document in which you are authoring. This is a key dynamic that I have yet to find in any serialization tech and it's (one of the many reasons) why I prefer Xaml.

One of the (latent) problems we're encountering is that the "$(" and "@(" aren't really based on POCOs or anything .NET, but are ultimately magic strings (for lack of a better word) that are native to MSBuild's (XML) DSL, and not necessarily anything in .NET. If it's not tied to a .NET (POCO) model, then there is a large amount of (unnecessary) work required to build the model to make it tool-friendly (XSD coming to mind here), and as a result it is a design approach that is ideally avoided.

Because it means that when I join a new project, I will likely have some new flavor of build file to learn.

Now you know how team JSON are feeling these days. :smile: Hopefully if you join a (functional) team, one of the discussions that they (should) have up front are things such as preferred formats and code formatting/styling rules. Meaning you should have a voice (vote) on the type of format you would prefer. Having options allows for more voices to be heard, and hopefully lending towards a better team experience.

@MarkPflug Yeah, that's mostly what @Mike-EEE was arguing for. You have to go a bit deeper, because MSBuild is an "extensible" system, and currently the only way for IntelliSense to know how to work with those extensions is through XSD. Either that XSD needs to be generated from the source (for instance, the C# code that implements a new task) automatically, or better yet, we drop the need for XSD entirely, as the XAML editor did. No more squiggles for things like SccAuxPath. IntelliSense for $() references. IntelliSense for arbitrary arguments to new task elements without the need to hand generate an XSD.

Other things that would help:

  1. Full support for NuGet. Rather than assume .targets files are somewhere in the file system, specify NuGet references for them, solving the "bootstrap" issue in the process (see the sketch after this list). Also, adding a NuGet reference shouldn't require both the project file and the packages.config file to be modified. Far too often one gets modified correctly and the other doesn't, leaving you with a mess to clean up.
  2. "Proper" support for wild cards (yes, this is mostly a Visual Studio issue, not an MSBuild issue).
  3. Not requiring unload/edit/reload. Yes, that's entirely a Visual Studio issue, but this is the only reasonable place to bring up that discussion.
  4. Some consideration of how to reduce merge conflicts. (2) may go a long way towards that. We don't complain as much about merge conflicts in XAML files, likely due to them being edited less frequently and in a fashion that generally maintains the structure, while simply adding a class file today can result in multiple conflicts.
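On (1), the sketch promised above: the pattern NuGet writes into projects today illustrates the bootstrap problem, because the import is a hard-coded, version-stamped relative path (the package name here is invented):

```xml
<!-- Silently skipped until the package is restored - the chicken-and-egg issue -->
<Import Project="..\packages\SomePackage.1.2.3\build\SomePackage.targets"
        Condition="Exists('..\packages\SomePackage.1.2.3\build\SomePackage.targets')" />
```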

Those are the biggies, but there's lots of other pain points with MSBuild. Like I said, the need to have a "build master" is telling here.

@wekempf

  1. So then MSBuild has a dependency on NuGet. Now I need to bootstrap NuGet. I'm trading one bootstrap problem for another. There needs to be "something" on the machine to enable the build. Right now it is MSBuild, and a well-known set of targets files.
  2. That is NOT an MSBuild issue at all. It is entirely a Visual Studio project system issue. There are discussions going on over in the AspNet/Home repo around this issue right now I believe.
  3. Again a VS issue, not an MSBuild issue. If you use VSCode, this issue doesn't exist.
  4. I agree here to some extent. The difficulty in merging the current .csproj flavor is due to the way that VS updates it. If it was more rigorous about the ordering of the files (alpha), or used globbing includes (which I think is the current proposal) **/*.cs, that would alleviate this problem as well.

Please list the "other pain points" with MSBuild so that we have something to work with. This argument that we need an expert to use MSBuild is true, in the same way that you need an "expert" in C# to write C# code, or an expert in Node to write node.js stuff. Yes, complex things require expertise to master, that is a fact.

@wekempf @Mike-EEE
So to rephrase your ask in another way: You'd like an MSBuild-centric Editor Interface to Visual Studio? (Similar to the designer surfaces for other file types such as Resource Files, XSD Explorer, and the Xaml Editor).

That's a much different ask in my opinion than what was being posted here.

While it is something that I think would be useful, if I were asked to pit it against other needs in the Visual Studio ecosystem I'd personally place it low. However, I'd still throw a vote its way.

For what it's worth, the undocumented debugger understood enough about the $() and @() syntax to give a somewhat usable view of these properties and ItemGroups; could that be leveraged?

You'd like an MSBuild-centric Editor Interface to Visual Studio?

Well, close. You mention the Xaml editor, so I will use that, because that is precisely the paradigm that I am striving for here. However, why make a whole new editor/interface when the Xaml editor in VS that you mention already works perfectly well for defining/describing _any_ POCO you can think of?

Today. No installs or modifications or magic of the sort.

Accordingly, what is being asked here is a POCO-based model to describe MSBuild documents. Once that is done, it can take advantage of the _tooling that already exists in Visual Studio_ to describe/define those documents. Additionally, opening up the format to any format/serializer that is available only opens the options even more (to other IDEs/tools other than VS, or even editors within VS).

So I guess I should say that... the tooling/editor/interface is already available in Visual Studio, it's just that MSBuild needs to evolve/update itself so that it can be compatible with it and take advantage of it, if that makes sense.

@Mike-EEE That makes much more sense; I'm not sure I agree with a change to the format. However it is much clearer that your end goal is a better designer; using the Xaml Designer as an example.

@Mike-EEE I'm trying to imagine what a XAML based build solution would even look like.

Imagine I want to customize my build process. What is involved? I need to define a POCO? Okay, I write a new C# file. Presumably, we'd want to support any .NET language: VB, F#? Once I've got my task type defined, that lives in a new project somewhere, right? I need to build it into an assembly? Well, we've got a recursive build issue to think about now, but we can surely solve that. That assembly needs to be referenced from the XAML. Now I have intellisense in my XAML file, so I can have an easier time writing my original build script.

That doesn't actually sound like it is making the "overall" experience easier to me. It is making one small part of the build authoring process easier.

If I am completely misinterpreting your suggestion, please clarify.

Hey @aolszowka as long as it supports Xaml that is the only format I care about. :wink: Again for me supporting additional formats is a value/adoption proposition. I think it would be a cool way to extend an olive branch (for lack of a better analogy) to the community to say "hey look, we support your favorite format, get over here!" This sort of emulates the strategy that MSFT is implementing as a whole with Azure and Visual Studio, too. :)

@Mike-EEE I get that you are in love with XAML. But honestly, XAML is going to do nothing to attract non-MS developers to the ecosystem. MSBuild is obviously not going to be rewritten to ONLY support XAML; I think we can agree that that is never going to happen. So, you are asking for XAML to be supported in addition to the existing XML format that we have today? Why do you want XAML? My understanding is that you want the XAML intellisense provider when editing your build files. The XAML intellisense provider isn't going to give the same experience that project.json provided anyway. The autocompletion on nuget packages requires querying the nuget repositories; that isn't something that XAML is going to magically provide.

People have espoused the simplicity of project.json, but I think it is only true when compared to the current complexity of the default .csproj file format. There is no reason that the default template needs to contain everything that it does today. If you have a simple build, your .csproj file could be as simple as importing the "Microsoft.CSharpDefaults.targets" (this doesn't exist today, I'm imagining what things could look like). Additionally, the .csproj file could be split up into separate pieces. Perhaps all your NuGet dependencies live in a separate msbuild file (similar to packages.config) that gets included for you, and that particular file would get intellisense (and UI) support from VS; it also wouldn't require unload/reload.
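To make that concrete (both imports below are imaginary, as noted above), the whole project file might collapse to:

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- Imaginary convention-over-configuration defaults -->
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharpDefaults.targets" />
  <!-- Dependencies split into their own file so tooling can manage it separately -->
  <Import Project="packages.props" Condition="Exists('packages.props')" />
</Project>
```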

I actually, really like XAML too. It is a great tool for what it was designed for. I just think you need to really consider what, and more importantly why, it is that you are asking for.

@wekempf this can be done _today_. Here's a trivial batch file that downloads nuget.exe and restores build-script packages from a packages.config (which could include MSBuild or MS's version once it ships, plus the MS compilers, plus the NuGet tasks and targets - someone needs to package those as another nuget ;)).

My point is that we're way closer with _existing_ tools to achieve that seamless experience than many think. @MarkPflug bootstrapping MSBuild, NuGet and the .NET compilers is getting easier by the day now.

That doesn't actually sound like it is making the "overall" experience easier to me. It is making one small part of the build authoring process easier.

Definitely. If you know anything about the TFS Build process, then you know this was/is a nightmare: getting all the assemblies resolving correctly, and then setting up the environment. A total hassle, and a terrible experience (and, I would say, name) for Xaml.

So, after bragging that there aren't any installs or magic (naturally :smile:), the point you make is quite salient, in that currently the Xaml designer works perfectly in the current csproj scheme. Its references/assemblies/projects are resolved from such. This of course would have to change with the new format. The csproj file (keep in mind, a serialized POCO) would get all of its references/data from... itself? Another file? This is a tooling consideration/concern that would certainly need to be ironed out.

But honestly, XAML is going to do nothing to attract non-MS developers to the ecosystem

Sooooooo that's a "yes" to supporting other formats, then? :smile:

Why do you want XAML? My understanding is that you want the XAML intellisense provider when editing your build files.

Well that's a part of it. It's the overall tooling experience, and also design. I explain here why Xaml is better than the app/web.config and project.json model, and I also explain here why it offers a better IDE experience. The key with the latter is that it lends itself very well for potential visual design. Which is great for supplementing the text design approach.

The autocompletion on nuget packages requires querying the nuget repositories; that isn't something that XAML is going to magically provide.

This is true, but in my opinion we shouldn't be using a text editor to define such things. :) Rather, we should be making use of the wonderful UI that has been built around the NuGet Package Manager, and have better integration with the files they interact with and/or create (IMO of course -- I'm sure there's at least _one_ person out there who disagrees with me :wink:). Which leads to your next point:

(this doesn't exist today, I'm imagining what things could look like). Additionally, the .csproj file could be split up into separate pieces.

This is great! No, I don't have all the answers; I am just getting the conversation started. And I agree with the separate files. This is something the TFS Build system could have benefited from _tremendously_. I am definitely pro-Xaml, but I am not locked in to it being the one and only format; I do think it should be _one_ of the formats supported.

Basically, in my world "all roads lead to MSBuild." Whether that road is built on XML, JSON, XAML, YAML, TOML, is up to the adventurer who takes the journey. How is that for some potential cheesy marketing mumbo jumbo? Ha ha. :smile:

Basically, in my world "all roads lead to MSBuild."

Just to be very, very clear: for a great many people that statement does not hold. I certainly see no reason why MSBuild must be executed by dotnet build unless I explicitly tell something (in project.json, or by presence of a xproj file or whatever) to use msbuild.

Should you choose to type msbuild I see every reason why _that_ should run MSBuild 😉

Discussion of file formats should be fit for purpose. XML has a place for complex, powerful build systems with great customisability. So does XAML. I agree that in such a system JSON may not be ideal, although my preference would be for Roslyn-provided build definitions à la sbt or gulp, but that's a different argument.

XML, however, is overkill for a _project definition_ with a static, never changing schema. It's unnecessarily verbose, and adds a level of self-documentation which is just 100% unnecessary for a static schema.

So, why does my simple build that just builds the project definition require MSBuild?

I have heard not one valid reason for that. Just hand wavy rubbish from pro-MSBuild people talking about how powerful it is. Yes, it is, but that's a knife that cuts both directions.

I understand why Build Masters like @aolszowka have a vested interest in companies like mine using Build Masters for even small, simple builds; but it's just a waste of time and money, and makes zero economic sense to me.

Is .NET Core a jobs program for MSBuild Build Masters? Or is it a _framework_ for effective software development?

Discussion of file formats should be fit for purpose.

Ah but again in my world there is no discussion of formats. You pick the format you prefer and enjoy the most, with the best of tooling to support it. Look how ideal my world is, so perfect, so pristine, so fantastic and wonderful... so completely non-existent and devoid of reality. :laughing:

XML, however, is overkill for a project definition with a static, never changing schema. It's unnecessarily verbose, and adds a level of self-documentation which is just 100% unnecessary for a static schema

Correct, according to _your_ preference and way of approaching problems, which might be more ideal, proficient, and efficient. But that might not be the case for others, for whom that verbosity is _familiar_, and as such (ironically) makes them more efficient and proficient. :) Why would we want to deny them their preference? Again, I go back to the original spirit of .NET: build your application in the language you prefer. Those who prefer C# can look at all the negative qualities of VB.NET and say why it is an inferior language. And those who (increasingly, I might add -- I am turning into one of them!) prefer F# look at all the negative qualities of C# (OO) and explain why building applications with its feature set is more expensive.

But even still, there are experts in C# who can actually build solutions cheaper in that language than in F#. So which is "better?" This foolish argument/discussion is the same one I see whenever data formats come up. The "problem" isn't the format, but the technology that dictates/enforces one and only one. This is what I was driving at with the road analogy (seems like that one didn't stick, either, LOL!).

So to (hopefully) sum it up: options -- like greed -- are good. :smile:

options -- like greed -- are good.
The "problem" isn't the format, but the technology that dictates/enforces one and only one
build your application in the language you prefer

👍

Dude, as long as I don't have to put my project definition in XML, and I don't have MSBuild running when I dotnet build, I really don't care what other formats are supported.

@shederman that's like saying that as long as whenever you write C#, it's compiled with such and such compiler and it's AOT'ed rather than JIT'ed and at runtime the GC uses that algorithm instead of that other one and what-not, "you really don't care".

why would you even care what happens behind the scenes as long as you write C# or "not-XML" as in this case?

Dude, as long as I don't have to put my project definition in XML, and I don't have MSBuild running when I dotnet build, I really don't care what other formats are supported.

Sounds like I hear a vote for .INI format!!! :smiling_imp: :smiling_imp: :smiling_imp:

@kzu Not really what I'm saying at all really.

More like, I'm writing a Console application and it's been working fine, and now you're proposing to put an OWIN layer into Console apps. I get that it's powerful and flexible. I get that it works well for big and complex use cases.

But I don't get why I need it in my Console app.

@kzu Oh, and by the way, I _can_ choose between JIT and native, and I _can_ choose the GC algo, and I _can_ enter low latency sections, and I _can_ choose between C# and F#.

But I _can't_ choose between simple build and complex build apparently.

THAT'S a step too far. 😕

@kzu Throwing in a batch file is cheating. First, you're no longer an MSBuild based build. You're a batch file based build that uses MSBuild internally :P.

However, I could accomplish the same thing using another MSBuild file (in fact, I have, many times). It's still cheating. The IDE doesn't know how to use this bootstrapping concept. Clone the project and open the solution in the IDE, and the first build is going to fail. This shouldn't be the case.

@wekempf What about declaring the restore as part of the BeforeBuild target? It's still contained within MSBuild.

If your goal is to get Visual Studio to bootstrap as necessary, that would be the place to do it. Ideally, throw the logic into a common include and then have each of the projects reference that target. MSBuild will automatically detect that the target has been run at least once, so you shouldn't have to worry about keeping track of whether it's been initialized or not.

@wekempf I dare you to clone that repo and have it fail the first build ;).

If you're opening the source in the IDE, you already have MSBuild. Everything else is nuget packages, which the IDE, the batch file, and MSBuild all know how to handle. That's the magic of MSBuild (this time for my "ultimate cross-platform nuget package restore"), once more.

@kzu That's the magic of MSBuild

And yet, so many developers seem to dislike it, despite this "magic".

I don't really want to polarize this discussion any further, but I think @shederman's side of this discussion needs more support. I'm in the camp of not being content at all with MSBuild in its current incarnation. While plastering it with a better editor experience helps, it is only a remedy for a symptom of a deeper underlying problem.

To me, the problem with MSBuild is that it is too complex. It is built in a way that makes any serialization (XML, YAML, JSON, you name it) of its object model a nightmare to work with. To ensure a beautiful format, the serialization needs to be more than just a textual dump of an object model. It needs to be hand tailored to ensure that it looks and feels elegant and intuitive to a developer in any editor he chooses.

The textual representation used to drive the build engine needs to be a simplified abstraction of the object model. It needs to be true to the chosen format and should be perceived as elegant and easy to author. While this is wholly possible to do in XML (I think XSLT is a rather elegant XML-based language, for instance), it is not the case with MSBuild.

For inspiration on what an elegant and simple build system for .NET can look like, please check out Cake, which with little effort can be made just as extensible as MSBuild, but without all the complexity of having both XML and precompiled MSIL (Cake is all just C# in textual form; Cake does the compilation for you with Roslyn) and with a much, much simpler object model.

Cake, coupled with what project.json did, would cover well over 99% of what everyone uses MSBuild for today. Only most people wouldn't hate it, which is my experience from being a software architect and lead developer for large development teams on the .NET platform since the first beta version of .NET 1.0 shipped in 2000.
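
To give a flavour, a complete Cake build script can look something like this (a from-memory sketch, so double-check the exact aliases against the Cake documentation; the solution path is made up):

// build.cake -- the entire build definition is plain C# script.
var target = Argument("target", "Default");

Task("Restore")
    .Does(() => NuGetRestore("./src/MySolution.sln"));

Task("Build")
    .IsDependentOn("Restore")
    .Does(() => MSBuild("./src/MySolution.sln",
        settings => settings.SetConfiguration("Release")));

Task("Default")
    .IsDependentOn("Build");

RunTarget(target);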

Now, I'm not advocating for replacing MSBuild with Cake (although that would have my applauding, cheering and eye-watering support), but I'm trying to get those involved in this thread who believe the problems of MSBuild are possible to fix with different serialization or a better editor experience to understand that they are wrong, and perhaps Cake can work as an eye-opener.

A little interesting nugget out there to help support the case for POCO. The mighty Rick Strahl understands: https://weblog.west-wind.com/posts/2016/May/23/Strongly-Typed-Configuration-Settings-in-ASPNET-Core

This of course is using ASP.NET Core, so it's in JSON. But again once you are working directly with an object model (and not a document object model), you can utilize any serialization mechanism you wish (while also using the corresponding tooling along with it). It's certainly a different way of thinking, but I am hoping this sort of example helps demonstrate what we're after here.
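
To illustrate the pattern from that post, here is a minimal sketch (MySettings and its members are invented for the example, not taken from the article):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

// The POCO. Note there is no JSON anywhere in sight here.
public class MySettings
{
    public string ApplicationName { get; set; }
    public int MaxItems { get; set; }
}

public partial class Startup
{
    public IConfigurationRoot Configuration { get; set; }

    public void ConfigureServices(IServiceCollection services)
    {
        // The appsettings.json section is deserialized onto the POCO;
        // consumers never touch the underlying format.
        services.Configure<MySettings>(Configuration.GetSection("MySettings"));
    }
}

public class HomeController : Controller
{
    private readonly MySettings _settings;

    // The strongly typed settings arrive via dependency injection.
    public HomeController(IOptions<MySettings> settings)
    {
        _settings = settings.Value;
    }
}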

Given that this issue is probably NEVER going to happen, I believe that a sane tool that generates and outputs msbuild XML files IS possible as a community project. For those developers who hate MSBUILD, and who have devoted considerable time and energy to understanding the crazy that is in there, we HOPE that Microsoft will improve the story, but we do not BELIEVE it.

Here are the alternatives I am thinking of:

  1. CAKE. A C#-based DSL for elegant build systems. If the MS-in-house solution doesn't help me, this is the direction I'm going for. Extend cake to generate and invoke msbuild, and move my build "orchestration" out into CAKE.
  2. An "msbuild on rails" editor, this I think, must be what Microsoft themselves are doing, and I am thinking it's what 90% of developers who are mad about going away from Project JSON will accept.

For those developers who hate MSBUILD, and who have devoted considerable time and energy to understanding the crazy that is in there

Excellently said @wpostma! <3

I think what gets lost in all of these requests is that many of us have spent over a decade in this API. This is not something where we are coming off the street and making requests out of thin air. We are loyal, long-time users and want to see MSBuild evolve and improve to reflect the current marketplace expectations. At least, that's how I feel. 😄

cake, bau, bounce, nake, psake, fake, fstoml, ...

All these build automation tools for .NET show there is a lot of room and desire for alternatives to classic msbuild. I suppose they could all benefit from the POCO approach proposed in this issue one way or the other.

I think something else getting lost here is the unfortunate conflation of what I consider build script vs project definition.

In MSBuild, my project definition is the same as my build instructions. There isn't really another language that I can think of in which this is the case. The JSON format is exceptional as a project definition, but terrible as a set of build instructions. I think the people who have the most issue with project.json are coming at it from having already built a robust set of build instructions in the existing XML format, whereas the people who really dig project.json are using more or less build defaults.

As someone who has wrangled rather extensively with MSBuild's horridly documented XML structure, with its loads of gotchas and totally wonky schema, I'd honestly like to question the sanity of anyone suggesting that it is easier to deal with than build scripts. However, I also understand the ecosystem and the investment hand-crafting some csproj takes, so I'll stop short of that.

I think the ideal is in between: a project definition file that is easily hand-editable, succinct, and merges well, and separating the build definition itself for maximum flexibility. That's why I really liked project.json, because it satisfied the simplicity angle while allowing for choosing whatever build system that you found most convenient. Now, if only VS would let you hook in your own build commands, then the disentanglement would be complete. Then you could use cake, fake, psake, etc with VS "Build my solution" support, and those who still would like to use MSBuild due to legacy issues, time investment, or just plain preference would be able to do the same.

So to sort of reiterate the purpose of this issue: it's not a request to support a particular format (JSON/XML/etc) in the traditional sense, but to move to _a POCO-based model_ that will allow you to serialize it in _any_ format you wish (with XML and/or JSON ideally being supported out of the box). Once it's capable of being serialized/deserialized in any particular format, then you as a developer/team are free to use the tooling you feel most comfortable with to describe your build files, thereby (in theory 😄) maximizing your development efficiency.

The idea is not to constrict to a particular format (which seems to have been the dangerously divisive strategy employed to date), but to allow developers and their teams the _choice_ to develop their build files in the format in which they feel they are the most comfortable and productive.

In short, supporting only JSON or only XML is about as bad as creating a solution where it supports only C# or only VB.NET. Same difference. We obviously have the ability to create .NET projects in any supported .NET language. In the same spirit, the analogous ask here is the ability to create MSBuild files in any supported data format (as they would all be based off the same POCO model).

BTW/FWIW, I am very happy to see the upvotes for this item currently sitting at 40, which makes it far and away the most-voted issue in this repo in that regard:
https://github.com/Microsoft/msbuild/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc

Thank you to all who have supported this idea!

And how would project to project reference work in this ideal world? How can an arbitrary build script coordinate artifacts with its project references that can themselves use whatever?

How would incremental build work inside the IDE? (FWIW, Cake has _no_ support for incremental builds).

"Anyone" could write an awesome editor extension for editing MSBuild. No need for core product extensions, I THINK

And how would project to project reference work in this ideal world? How can an arbitrary build script coordinate artifacts with its project references that can themselves use whatever?

I am afraid you are going to have to provide a little more context around this one. If one build file has some resources that are generally/generically available -- as in a resource dictionary -- and it includes another build file somehow (which may or may not have its own resource dictionary), they both should be able to access each other's resources in a generic way. This could and should be done through the use of both strongly typed and basic string keys. WPF uses a model like this, and it has been very successful and popular.
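
Roughly the lookup model I have in mind, borrowed from WPF (a purely hypothetical sketch, not a proposal for an actual API):

using System.Collections.Generic;

// Each build file owns a resource dictionary; lookups fall back to the
// file that included it, much like WPF walks up the element tree.
public class BuildFile
{
    public BuildFile Parent { get; set; }

    public Dictionary<string, object> Resources { get; } =
        new Dictionary<string, object>();

    public object FindResource(string key)
    {
        for (var file = this; file != null; file = file.Parent)
        {
            object value;
            if (file.Resources.TryGetValue(key, out value))
                return value;
        }
        return null;
    }
}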

In any case, these seem like very rudimentary problems/issues in which much more difficult ones have been answered by the talented engineers behind the walls of MSFT there. But, I am honored that you asked me thinking that I might have a better answer. :)

How would incremental build work inside the IDE? (FWIW, Cake has no support for incremental builds).

With incremental builds you are referring to building only the artifacts/code that has changed since the last build? If so, I might be a little lost here as this does not have much to do with this request/ask. This issue here is asking for a POCO-based model to describe the tasks and items which would then in turn provide these sorts of functionality and features.

"Anyone" could write an awesome editor extension for editing MSBuild. No need for core product extensions, I THINK

I am not sure if you are still in the ideal world here or the real one. If in the real world, I will say that it is pretty telling/interesting/foreboding that no one has done this to date. 🙁 Please correct me if I am wrong. No one seems to be a fan of the current MSBuild DOM, and what is worse, no one seems to want to -- or _has wanted_ to -- improve its experience through tools/extensions.

Regardless of real or ideal world, I am also unsure how this would be done without an extension? Every editor that extends a format that I know of (example 😄) is in the Visual Studio marketplace as a download/install/extension. Unless there is some magic you are aware of, an extension/installation of some sort would be required.

I meant that just like there are myriad intellisense providers for json formats/schemas (which took quite some time to show up too, btw, not the format's fault), without requiring any changes to the json format or data model, there is no need for such a change in MSBuild core in order to provide better editing experience on top.

Including viewing it as something different inside VS, if you wanted to. The fact that nobody has done it only speaks to how little push there is for that. Just like there wasn't any need to provide super awesome editing experience for .net config files for years and years.

I mean, editing that thing isn't rocket science :p. I do it just fine and love it, even if I sometimes wished there was something a bit more integrated.

Including viewing it as something different inside VS, if you wanted to. The fact that nobody has done it only speaks to how little push there is for that. Just like there wasn't any need to provide super awesome editing experience for .net config files for years and years.

I would point out that this was a symptom of the state of the .Net Ecosystem until recently. It was largely expected that the only solution was Microsoft's, and pain points were either just dealt with or patched over behind closed doors. There was, and is, a huge mass of "dark matter" developers that simply just took whatever Microsoft delivered and used it (or were told to use it by execs), whether or not they were happy with it. That this issue exists and has been as active as it has is somewhat testament to that.

I mean, editing that thing isn't rocket science :p. I do it just fine and love it, even if I sometimes wished there was something a bit more integrated.

No, but it's pretty close. I know a great amount of people who have been completely stumped by MSBuild features like batching and transforms, and those are the ones who tried. Most developers I know don't even touch .csproj and treat simple changes as black magic. Had one review nearly get rejected because a nuget package added a target and props :P. Admittedly this is all anecdotal though.

I'd rather say that the .NET ecosystem hasn't matured yet to the point where it takes matters in its own hands, rather than always waiting for MS to deliver everything to satisfy everyone.

Like I said, it's been implied in this issue that improving MSBuild's editing experience is something that only MS can/should do. This is hardly the case.

If people cared that much, they should probably get on to "fixing" that already. There are more than enough open source VS extensions in the wild (by Microsoft and others) to kick-start any such effort by a committed individual.

But I do applaud the renewed effort we're seeing to improve MSBuild by implementing most of the ideas that devs enjoyed about xproj/project.json.

Like WPF, I do believe MSBuild is a great piece of technology that just happened to languish for some time (in WPF's case, it still is, unfortunately).

Like I said, it's been implied in this issue that improving MSBuild's editing experience is something that only MS can/should do. This is hardly the case.

I'd normally agree, except as we speak Microsoft is implementing a pretty darn massive overhaul of .Net's build, MSBuild's defaults, and how VS interacts with MSBuild. If there was ever a time that this was on Microsoft's plate, it's now.

I've been developing with .NET since it was in beta, and I've been the lead at work who had to get into the MSBuild nitty gritty setting up CI servers and such, and I've always had a hard time convincing others to follow. As somebody who is primarily doing development work and only once in a while modifying and tweaking build files, my problem has always been that there is just too much time between each occasion I need to work with MSBuild specifics, so I always have to keep relearning the same things -- especially because I find the majority of it very unintuitive and verbose, with a lack of well-established documentation and convention. Even look at things like community tasks, which was once promising and now seems very outdated. As a recent example of one of my efforts with MSBuild, which took over 2 days to get working the way that I wanted: I needed a simple way to capture TFS version info in a file and package it along with a web app. The same task would have taken me minutes in NodeJS/Git and would have felt much less like a hack. I could have come up with a solution quicker and more elegantly by coding it up in C# as well, if the result were easy enough to plug into the build pipeline. For other process-oriented stuff involved in packaging and deployment I've learned to lean more heavily on scripts such as PowerShell, since they merge better and are far more capable than MSBuild in this regard.

After spending time in the recent years working more in JavaScript and TypeScript both client side and with Node I've developed a deep appreciation for the simplicity of the model where the same language can be used for builds and application code. In the Node world it takes no effort to convince developers to tinker and explore the build process and I believe that's why there is such an explosion of mature and highly ergonomic process related projects on NPM.

Ultimately, what I think is needed is for Visual Studio, and perhaps MSBuild, to make it easier to accept alternative build commands. I think VS Code is doing a great job on this front, but there are still pain points that I think are mostly related to the gravity of Visual Studio and MSBuild. An example would be the current state of F# development. The editing experience for F# in VS is alright, but Go to Definition and features like that don't work correctly between csproj and fsproj based projects, and even in VS Code with Ionide, an fsproj file is needed to get intellisense. So far I can't find any reference to how the latest announced tooling changes affect F# projects with regard to wildcard file includes and PackageReference elements. Meanwhile, I'd be happy to completely ditch fsproj for a solution like FAKE, as long as fellow Visual Studio developers could still develop on the project with the proper intellisense support that they would expect.

My main gripes with MSBuild and csproj files in general are just that it feels like a technology like SQL, where it's just stuck in the past and it seems there is no hope of any real meaningful change beyond its original aspirations. Maybe this will change now, but after years of making suggestions to improve things like project-relative pathing for dependencies, nuget package reference redundancy, non-alpha item ordering, complex file pairing (ex. xsd, cs, code behind, designer.cs), lack of wildcard support, and an overly complex extensibility story, I've gotten the same feeling that I ended up getting with WebForms, then Silverlight, and then WPF: that nobody on the other side really cared about moving things forward anymore, just about maintaining what was already there.

I think the suggestion of creating a clean object model around projects, or a set of APIs of sorts that Visual Studio would use to read and understand a project, modify it, and ultimately build it, would be great. If it could be kept in a way that is extensible to allow for other project formats, then great. If those formats were not restricted to different serialization flavors of the same object model, but instead allowed for alternative build implementations such as CAKE, FAKE, or something homegrown to be plugged in, then that would be excellent. For those who have worked in Node.js: I'd like something as ubiquitous, simple, and powerful as npm scripts, so that any project can be built not necessarily by calling msbuild, but by invoking the default build for that project, which may or may not be MSBuild.

Just like there wasn't any need to provide super awesome editing experience for .net config files for years and years.

Yeah... speaking of which. That is another tech that continues to suffer from the same fundamental problem we are discussing here: XML-based mapped DOM (weak/no schema) vs. POCO-based serialization (strong schema).

I mean, editing that thing isn't rocket science :p

Haha... wanna bet? 😛 Especially compared with modern experiences and ESPECIALLY compared with existing editing experiences in Visual Studio with Xaml (which is what I keep going back to as the ideal model here).

I know you don't think this is a big deal @kzu, as you obviously have a great handle on this, having created what I feel are some of the best formatted MSBuild files I have ever personally seen (look at those beauties! Just you look at them!!!). Wouldn't it be great if everyone were so great and fluent with MSBuild? That's ultimately what we're after here.

If people cared that much, they should probably get on to "fixing" that already.

That's what we're trying to do here. :) However, as discussed above the current model is entrenched in the XML API and is nearly impossible to modify without mass, sweeping (and certainly breaking) changes. So until we get buy-in from the project holders here to reflect the will/desire of the community at large, the best course of action we have here is to talk about it in hopes that they do. 👼

Oh god. I just finished reading this.

Need more coffee.

@MaximRouiller

Everything was awesome in .NET Core and now we're back to this crap. Very sad!
-- on behalf of Trump.

Some people just want to keep things complex. I've been using MSBuild extensively in the past years, but try to convince someone else from my office -- you'll utterly fail, because:

  1. It's way more complex than needed. It's a camel with a giant laser tied to its back - very extensible but who needs that?!
  2. Not readable.
  3. You really need a build engineer to get advanced jobs done properly. Don't you guys think this should be doable by devs without investing too much time?
  4. You're expecting people NOT to touch .csproj files (look at VS tutorials for editing .csproj). Hence the camel thing. It's supposed to carry you on, but when it doesn't - use this laser to help yourself throughout the desert.
  5. I mentored a couple of interns straight into .NET Core, since we're heavily using it in our organization. Yesterday I had to ask them to migrate to and take a look at MSBuild (consider they were taught to add/remove packages and make changes straight in project.json -- no UI stuff). You don't really want to know their reactions.

Tl;Dr: I'm not just for json syntax. I'm all-in for making things simple. This philosophy of "it just works, but when it doesn't -- buy yourself a book" is not working well.

+1

Haha @YehudahA you can upvote the issue up top at the root comment. That is what matters here. Otherwise, you are bound to get the downvote as @MartinJohns has so aptly and deftly demonstrated. 😄

Speaking of upvotes, this issue now stands at 103, plus 9 hearts now. For context, the closest issues for either category are 9 upvotes, and 1 heart, respectively.

I also want to send a shoutout to @gulshan's great suggestion at https://github.com/Microsoft/msbuild/issues/1289. Please take a second and provide feedback and/or an upvote there, too.

Thank you everyone for your continued support and dialogue towards improving MSBuild, and making it the authoritative build solution we all would like to see and use. 👍

Please... no more.

I've only spent a year in the GitHub repos and I can't stand it anymore (until my next coffee probably).

No technology is perfect.

Having a badly formatted csproj with MSBuild is just... horrible! What was wrong with having project.json that was used by the dotnet tools like publish?? Using XML to begin with is absurd for something that should be human readable (yes, project files SHOULD be human readable!).

@Grinderofl Let me pour some good diesel on the flames.

MSBuild is the standard project system for the WHOLE ecosystem right now. Microsoft has a finite number of developers working on each technology/stack. If they split the project system in two, they now have 2 teams driving efforts to improve the build system in two different ways.

But that's not all! Now your customers are pissed because now you have two build systems and they have to learn another one.

You end up with a less focused effort and pissed off customers. So unless they are creating the future of build systems, going back to MSBuild is actually a pretty sensible idea. project.json was created when DNX was invented. The focus was not the build system but rather... throw ideas at the wall and see what sticks.

If they kept project.json, they would have ended up with the situation I described above. Pissed off customers and unfocused efforts on two systems instead of one.

They don't have unlimited money, time and customer patience any more than the rest of us do. This is open source though -- if you really want to make this happen, it would be better to present a comprehensive design plan that includes a unified build system between all .NET project types (heck, C++ projects too, or it won't be useful for my solutions), and actually write a proof-of-concept showing why this idea has only advantages over the msbuild system.
I would love that, because you guys do have some cool concepts. However, doing that work is different than sitting around and sending the message, "Microsoft, give us stuff, figure out the hard parts for us, do all the implementation, and change the way all your customers work."

@jnm2 You forgot about the "but even after you did all this, we'll reject your solution anyway." 😉

Having a badly formatted csproj with MSBuild is just... horrible! What was wrong with having project.json that was used by the dotnet tools like publish?? Using XML to begin with is [..] for something that should be human readable (yes, project files SHOULD be human readable!).

Having a badly formatted project.json is just as horrible. Did you take a look at the new cleaned up csproj file? While there are still things I'd like changed (e.g. Version as an attribute instead of an element), I think it's a lot more clean and lean.

And honestly, XML is just as human readable and human editable as JSON is. I agree that project files should be human readable - and that's what the MSBuild project files are.

And the project.json had other drawbacks too (no way to comment, no trailing commas).

While the "Format Agnostic" approach is probably the most flexible solution, I don't think it's very feasible. It would lead to fracturing the eco-system even more with doubtful benefit and it would increase the complexity of the whole system enormously. "How do you do XYZ?" - "Well, that depends on which of the 50 build systems you use!"

Staying unified on the MSBuild system is the best choice they can make. And frankly, it's really not as awful as a lot of vocal voices want to make it appear to. Especially with the clean-up process Microsoft is doing.

@MartinJohns I hope you can appreciate the irony as I do of having you declare that there will be 50 flavors of build systems and then state it's never as bad as people make it out to be. 🎆

To be sure here, the idea is to have one CLR-based model that can be described in numerous formats.

Also, I am starting to edge up to @jnm2's challenge here of writing a very simple proof of concept to demonstrate the idea. With as much energy as I have already spent in this repo, I feel like I could have done it in spades times ten now. 😛 I do not want to spend too much time with it for the reason @MaximRouiller states (and more). Let's see where the afternoon gets me (what did I just get myself into now???).

@MartinJohns

While there are still things I'd like changed (e.g. Version as an attribute instead of an element)

This is happening. You can see here where we are updating the VS project templates to take advantage of this: dotnet/sdk#394 We are also working on further cleanup of the .csproj format.

To @Mike-EEE and others who want to experiment with supporting an entirely different project file format in MSBuild, the place to start looking is probably in the MSBuild Construction APIs. These abstract the underlying XML representation to some degree. They do still reflect the underlying structure of the XML though, so you still have properties inside of PropertyGroups and items inside of ItemGroups, for example.
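
For anyone who wants to poke at them, a minimal sketch of driving the Construction APIs looks something like this (assumes MSBuild 15; the property and item values are just examples):

using Microsoft.Build.Construction;

class ConstructionDemo
{
    static void Main()
    {
        // Build a project in memory. Notice how the object model still
        // mirrors the XML shape: PropertyGroups and ItemGroups.
        ProjectRootElement root = ProjectRootElement.Create();
        root.Sdk = "Microsoft.NET.Sdk";

        ProjectPropertyGroupElement props = root.AddPropertyGroup();
        props.AddProperty("TargetFramework", "netstandard1.6");

        ProjectItemGroupElement items = root.AddItemGroup();
        ProjectItemElement package = items.AddItem("PackageReference", "Newtonsoft.Json");
        package.AddMetadata("Version", "9.0.1");

        // The coupling discussed in this thread shows up here: the only
        // on-disk representation this model knows is MSBuild XML.
        root.Save("Example.csproj");
    }
}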

Thank you @dsplaisted for the pointer. We actually found that out earlier in this (now rather lengthy!) thread, when discussing the feasibility of using POCOs vs. XML. XML is indeed tightly coupled to classes throughout the MSBuild libraries, even being used as full properties such as the Project.Xml.

(Can you imagine a Project.Json property? An object should not be tightly coupled to the serialized data format that describes it. That is what we are aiming to solve.)

To underline the ask here, the idea would be to _not_ introduce a new format, but simply to make it so that properties such as Project.Xml go away, and the XML that is read in simply deserializes into the Project entity in memory, along with all of its child instances. Hopefully that makes sense.

Once you have this working for one format (XML), then you can make it work for other formats as well. In essence, the project file becomes a serialized Project entity and it can be described in any supported format, and in doing so will yield any tooling magic that is already built for said format.
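
Here is a trivial sketch of the shape of the idea (nothing like the real MSBuild model, and Newtonsoft.Json merely stands in for whichever serializer you bring): one POCO, two serializers, two on-disk formats, same object.

using System.IO;
using System.Xml.Serialization;
using Newtonsoft.Json;

// A deliberately tiny stand-in for a Project entity.
public class Project
{
    public string AssemblyName { get; set; }
    public string TargetFramework { get; set; }
    public string[] Sources { get; set; }
}

class SerializationDemo
{
    static void Main()
    {
        var project = new Project
        {
            AssemblyName = "MyCorp.Lib",
            TargetFramework = "netstandard1.6",
            Sources = new[] { "**/*.cs" }
        };

        // Same in-memory object, two textual representations.
        using (var writer = File.CreateText("MyCorp.Lib.csproj.xml"))
            new XmlSerializer(typeof(Project)).Serialize(writer, project);

        File.WriteAllText("MyCorp.Lib.csproj.json",
            JsonConvert.SerializeObject(project, Formatting.Indented));
    }
}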

Part of this whole thing is that the knowledge of how to build a C# app is in a .targets file which absolutely needs to work with property groups and item groups. Even if you want to serialize the msbuild project 'POCO' as JSON, you'd need JSON versions of item groups and property groups, or else you'd have to abandon msbuild and the C# .targets and write the entire process from scratch.

OK, I took some time to flesh out a SUUUUUUOOOOOPER rough model of what I am thinking here. You can find that here:
https://github.com/Mike-EEE/Stash/tree/master/MsftBuild.Model

I've placed the project POCOs here and the processing POCOs here.

I've described a possible project file here and a possible processing file here. I used @gulshan's template from #1289 as a guide. That's not to say that I followed it completely, however. 👼

The design that I landed on (and am in no way suggesting, just rough-sketching here) makes the Project file a ServiceLocator, so that its "properties" (or services) are not strongly defined, which makes it super flexible. It is up to the processor to pull in the data that it needs (as demonstrated here).

_(Again, this is not in any way making a final suggestion in any sense. Just doing some sketching.)_
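
In code, the service-locator shape is something like this (again purely hypothetical, just mirroring the sketch in the repo):

using System;
using System.Collections.Generic;

// A Project is just a bag of typed "services"; nothing is hard-wired.
public class Project
{
    private readonly Dictionary<Type, object> services =
        new Dictionary<Type, object>();

    public void Add<T>(T service)
    {
        services[typeof(T)] = service;
    }

    public T Get<T>()
    {
        object found;
        return services.TryGetValue(typeof(T), out found) ? (T)found : default(T);
    }
}

// One of the "services" a processor might ask for.
public class CompileSources
{
    public string[] Patterns { get; set; }
}

// The processor pulls in only the data it understands.
public class BuildProcessor
{
    public void Process(Project project)
    {
        CompileSources sources = project.Get<CompileSources>();
        // ...hand the patterns off to the compiler, etc.
    }
}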

Design discussions/decisions aside, in the end what is important is:

  1. A Project POCO element is defined in a serialized format. (Example)
  2. That Project is then read into memory. (Not defined in this sample)
  3. That Project is then sent to a Processor of some sort. (Example)

I've also described this in Xaml (as you probably know, I am a fan) to also showcase some neat things you could do in this particular format, such as provide a version from an external file, and also use one to query the files needed to compile.

Finally, to showcase the "tooling magic" I keep talking about and the reason I am so Xaml/POCO-crazy, here is a screenshot of the mouse cursor hovering over the minimum logging level of one of the elements:

You can see there's a drop down list, checkboxes, and everything, without having to do anything but simply define a POCO in the Xaml file. _This is truly a "lit-up" experience_, IMO. Furthermore, in Xaml, there is plugin support for the editors used for the property pane, so we could make it even more customizable/cool if we wanted to. It's this sort of power/paradigm that I think we should strive to achieve -- but in any format possible and not just Xaml.

Hopefully that will clear up my side/position here. But I am sure this will create more questions than not. 😄

@Mike-EEE Too much noise :) The cleared .csproj is much more readable and human friendly. XAML is not the way to go.

Haha @djanosik yes I should have put a disclaimer in there regarding that, as Xaml is definitely more chatty than JSON (or XML). As I mentioned in #1289, there are new flavors of Xaml emerging that address this issue.

The cleared .csproj is much more readable and human friendly.

This is a very subjective statement and really at the heart of the problem we're trying to solve here. Whereas you see Xaml as "less" human friendly, myself and many others love its expressiveness and power.

However, in the end, what is important here is not the verbosity and/or chattiness, but the _features and tooling integration_ that are possible with a POCO-based approach, which Xaml does a great job of providing, as I hope the screenshot and use of MarkupExtensions capture. You simply cannot achieve such power with XML or JSON.

Again, the idea is to eventually capture such concepts and make them available in _any_ format. I use Xaml as of course it does this out of the box without a lot of effort on my part to demonstrate. I'm lazy, what can I say. :)

@Mike-EEE my guess is that you'll interest more people with a project.json-like format reader first, then a csproj-format reader, after that write a XAML one. :-D

Haha... OMG I am dying here. OK. I went ahead and made you a Project.json file, @jnm2 and @djanosik:
https://github.com/Mike-EEE/Stash/blob/master/MsftBuild.Model/MsftBuild.SampleProject/Project.json

(And I did this by simply newing up the Xaml-described POCO and serializing it in a JSON-based format, which is a piece of cake since we are dealing with a POCO.)

So again: It's not the chattiness or (theoretical) schema that is used here that is important. What IS important is that this file and this file _both_ describe this POCO. Make sense?

@Mike-EEE It makes sense. How would you create custom build tasks (preferably without referencing other assemblies)? Or do you want to completely separate project metadata and build script?

Ah now @djanosik you're proving that the reward for hard work is more hard work. :) In this case, we are tackling a few items here. This actually started out over @ #1289 when @jnm2 rightly reminded me from a discussion in https://github.com/aspnet/Home/issues/1433 that processing should be separate from data. So this POC is really showing off two things:

  1. POCO-based modeling
  2. Separation of concerns between processing and data/definition.

Now, to answer your question, creating a new custom build task would be similar to implementing ITask as I have done for the BuildProcessor here. In this case, the BuildProcessor doesn't have any properties, but if it did, it would be described/defined in the Processor definition file.

Again, this is simply a sketched idea, and nothing definitive/authoritative, to help flesh out the idea of what I am interested in solving.
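
For comparison, here is roughly what a custom task looks like against today's MSBuild API (a minimal sketch using Microsoft.Build.Utilities.Task; the task name and properties are invented):

using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// A made-up task that stamps a version string into a file.
public class StampVersion : Task
{
    [Required]
    public string OutputFile { get; set; }

    public string Version { get; set; }

    public override bool Execute()
    {
        System.IO.File.WriteAllText(OutputFile, Version ?? "0.0.0");
        Log.LogMessage(MessageImportance.Normal, "Stamped {0}", OutputFile);
        return true; // returning false fails the build
    }
}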

Reading this thread, it's as if Gradle and HOCON don't exist ... I don't mean that as trolling, just wondering why there is not even a mention of those since truthfully, they are a joy to use. HOCON is already there: http://getakka.net/docs/concepts/hocon. Now waiting for a Gradle plugin for .NET :)

Indeed @slorion, Gradle is very much a driving force behind #1289. Check it out and make sure you upvote it and/or share your feedback. I was not familiar with HOCON until you mentioned it here. Thank you for sharing. The idea here is to be able to support _any_ format as long as there is a serializer for it. So if HOCON has serializers, then it will be possible to tie into it as well. :)

Ah thanks, glad there is some push for that idea.

I was mentioning HOCON because I saw some comments about web/app.config which are also a pain to deal with, especially when deploying to multiple targets with some specific configuration.

I have written C#/.NET for 6+ years, and I hate the .csproj (XML format) file. I have used .NET Core from DNX RC1 to .NET Core 1.1. project.json was a big step forward. Were you thinking of our feedback? And if you (the .NET Core MS dev team) insist on giving up project.json, I will give up .NET (Core).

Just as a simple idea: if .NET Core needs to switch across to using .csproj files for the IDE,
and the only way to make changes to them without the IDE is via a tool (which sounds kind of horrible),

why not have something as part of the dotnet tool that reads in a project.json file and outputs the .csproj file as part of the build process? So instead of project.lock.json, which is sort of a post-build version of project.json, it would output directly to a .csproj file instead.

It's a bit of extra work, but it's probably worthwhile.

All I see is people arguing against the json format just because they never used it and because they know msbuild from top to bottom.

<NuGetPack File="%(NSPSTransformed.Identity)" OutputDirectory="$(BuildOutputNuGet)" Properties="Configuration=Release" IncludeReferencedProjects="true" ToolPath="$(nugetToolPath)" />

Is anyone really comparing that 💩 to project.json?

Every csproj file suggested, posted, and hailed for its fabulous intellisense is full of magic strings, even for stuff like True and False.

Ironically, it feels like CSPROJ is not typed while JSON is typed.

On the project.json side:
some smart people realized that this would be a great format for asp.net core. Even Fowler said it was his favorite feature.
It's not about the format or the intellisense or feature A and B. It's about a whole experience.
It's typed, got intellisense, got http request options, was very fast, very understandable, super ultra duper hyper discoverable -- and if you think csproj is too, you haven't USED it, and are just commenting based on looking at the file in some github repo.

If you are going against it, let us know that you used it and tried it for at least a week.

but
apparently, it has some limitations that aren't present in csproj (and this might be a very good/valid reason to drop its support right now)

I'm fully supportive of csproj and I trust the people behind the product.
Just give some credit and ask yourselves: "why, if project.json is no better than csproj, did the smart people behind asp.net invest in and build a system that everyone (that used it) fell in love with at first sight?"

Imports? Targets? Properties? ToolPath? Those words are all meaningless, and that's why you don't see them mentioned in project.json at all, while target framework, dependencies, and includes (with tooltips suggesting support for globbing patterns, a string, or an array of strings) inside a publishOptions are very self-explanatory.

conclusion: project.json is A-MA-ZING. Period. If you don't think it is, you haven't used it. It's not about the file format, but about a lot of subtle things that come with the "experience". Stop comparing them via file diff. Just pick a project and start developing.

True fact: when project.json was first introduced, it had no documentation, no answers on stack overflow, no real help from the IDE -- still, all web developers were editing it by hand and were very happy.
Now with csproj we have docs, lots of history, a 1.1 stable product, experts around the globe -- and yet web developers are using a less-productive GUI to make changes to the project file and aren't any happier with csproj.

To all project.json haters: try it, and point out exactly what's flawed or bad about it.
To all msbuild lovers: sorry, there is better stuff out there, even if it means you need to change/adapt.
To all project.json lovers: it has limitations and associated costs. Deal with it; having one thing is best overall, and you can still dream of a better future.
To all msbuild haters: be vocal about your pain points. Chances are (at least have some faith) it will evolve into something as useful as project.json was -- not today... but it might happen.

Proposed a .net object notation format based on object, collection and index initializers here- dotnet/roslyn#16648
Please have a look. I deliberately used a project description for example. 😄

I think @jnm2 is right. As much as I complained about the move back to XML, I can now see that it was a communication problem. The CLI/MsBuild team told us "trust us it will be great" but they didn't provide examples or ever update the docs.

IMHO the new csproj is GREAT. I'm having an easier time with it than with the json syntax honestly.

Here are two (names masked) csproj files i am compiling today (in VS2017 RC3).

  • I am developing in VSCode and VS2017 interchangeably and at the same time
  • I am running via F5 (debugging) and from dotnet run.
  • I am debugging and using Edit+Continue
  • I am hand editing the project file and VS2017 is updating in realtime.
  • I am generating platform specific executables via dotnet publish -r win7-x86, among others
  • Dozens of .cs files, thousands of lines.
  • No GUIDS
  • No crazy conditions

Library (MyCorp.Lib.csproj)

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.6</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Newtonsoft.Json" Version="9.0.2-beta2" />
    <PackageReference Include="protobuf-net" Version="2.1.0" />
    <PackageReference Include="System.Diagnostics.Process" Version="4.3.0" />
    <PackageReference Include="System.IO.MemoryMappedFiles" Version="4.3.0" />
    <PackageReference Include="System.IO.Pipes" Version="4.3.0" />
    <PackageReference Include="System.Runtime.Loader" Version="4.3.0" />
    <PackageReference Include="System.ServiceProcess.ServiceController" Version="4.3.0" />
    <PackageReference Include="System.Threading.Tasks.Dataflow" Version="4.7.0" />
    <PackageReference Include="System.Threading.Thread" Version="4.3.0" />
  </ItemGroup>
</Project>

Executable (MyCorp.Connector.Runner.csproj)

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp1.0</TargetFramework>
    <AssemblyName>MyCorp.Connector.Runner</AssemblyName>
    <OutputType>Exe</OutputType>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\Lib\MyCorp.Lib.csproj" />
    <ProjectReference Include="..\..\WebStuff\MyCorp.WebStuff.csproj" />
    <ProjectReference Include="..\..\Connector.Common\MyCorp.Connector.Common.csproj" />
  </ItemGroup>
  <ItemGroup>
    <PackageReference Include="protobuf-net" Version="2.1.0" />
    <PackageReference Include="Microsoft.Extensions.CommandLineUtils" Version="1.1.0" />
    <PackageReference Include="System.ValueTuple" Version="4.3.0" />
  </ItemGroup>
</Project>

I think we should all take a step back and look at the current progress of the simplification effort and decide if it is good enough now, because I think that it is.

It's working people.

@AlgorithmsAreCool Thanks a lot for the feedback, I'm glad you are happy with the progress we've made. If I look back from when we announced that we were moving from project.json -> csproj to now, we've added _a lot_ over the old csproj:

Project:

  • Smart defaults. AssemblyName, Default Namespace, etc now have sensible defaults and don't need to be specified by default.
  • No more GUIDs. :)
  • Implicit language-specific/target-specific props/targets via the Sdk attribute
  • Metadata as attributes, so you can say <PackageReference Include="Newtonsoft.Json" Version="9.0.1"/> instead of <PackageReference Include="Newtonsoft.Json"><Version>9.0.1</Version></PackageReference>. This works with all items, not just package references.
  • Implicit configurations - by default all projects have two configurations: Debug|AnyCPU and Release|AnyCPU, with sensible defaults. (Note: we currently have 6 configs in the RC, but will have 2 by the time we ship.)
  • Implicit globbing support, with a default set of common globs so you don't need to specify them. Projects can override if they'd like
  • Simplified project references, no need to specify the Name or GUID metadata
  • Auto-generate AssemblyInfo, including <AssemblyVersion> and <AssemblyFileVersion> so that these can now be based on MSBuild properties.
  • Transitive project references

Targeting:

  • Friendly TFM names in <TargetFramework>, so you don't need to specify <TargetFrameworkIdentifier> or <TargetFrameworkVersion>.
  • Multi-targeting support via the <TargetFrameworks> property, so that a single project can produce multiple outputs targeting multiple frameworks/platforms
  • Implicit conditional compilation symbols based on the target framework you target
  • Support for conditional items such as references, packages, and compile items _and_ have VS respect them, so you can differ them based on target framework.

NuGet:

  • <PackageReference /> which enables you specify (just) your top-level package dependencies inside csproj along with your other dependencies
  • Transitive package references
  • NuGet Restore via msbuild
  • Produce a NuGet package via msbuild
  • Background restore, so packages are restored in the background while the project is open.

Visual Studio:

  • Real support for globbing (**\*.cs, etc) - no longer does VS expand the glob when you delete a file, or add an include when you add a file already included in the glob. VS also automatically picks up addition of new files (or deletions of existing files) automatically.
  • Auto refresh. VS will automatically pick up changes from disk when you switch branches or modify it another editor and just apply the differences without reloading the project
  • Edit while open. You can now edit the project file while the project is open, and have VS pick up the changes on Save.
  • New Dependency node (replacement for references) that shows packages, references, project references, SDKs and their dependencies.

Above is just the list that I can remember off the top of my head, I'm sure I've missed a bunch. Anyway, we've been on a journey that will continue long past this release. Please keep the feedback coming!

@davkean That is indeed a very big improvement syntax wise! That said, the real problems arise when a build requires more than simply declaring dependencies and conforming to standard .NET toolset. This is where Gradle shines and I still wonder why it is not used at least as a source of inspiration/lessons learned. This project format looks like Maven pom.xml with a much better syntax, which is nice, but there is a reason why Gradle or SBT were created.

That said, the real problems arise when a build requires more than simply declaring dependencies and conforming to standard .NET toolset. This is where Gradle shines

@slorion What do you mean? The new clean csproj is built on msbuild which is a full build system. I don't know much about Gradle, but in terms of build system, you can do pretty much whatever you want with msbuild and develop complex builds. That's one of the reasons they switched back from project.json to (clean) msbuild projects.

[Edit]The real next step is to replace *.sln with a proper clean msbuild solution file to coordinate the build of multiple projects (and this is actually what is done under the hood, as a sln is already converted to an msbuild file), but I believe they will bring this change at some point after the project.json migration[/Edit]

@xoofx What I meant is that once you get past simple builds, you need to get into the innards of msbuild. For having done that in the past, I cannot say it is a pleasing experience to say the least. At least now, merging csproj files will not be such a pain.

@slorion I see your concern. So yes, the msbuild syntax sucks and I would not mind switching to a simpler DSL + curly-braced syntax (e.g. à la Gradle)... but one step at a time... 😉

👍 to the great work on the new .csproj format. @davkean and team have been impressively busy and productive over in the CPS repo and they have been a marvel to watch. The new format is remarkable and nothing less than impressive. (But as @xoofx suggests, it's high time to apply that magic to the arcane and cryptic .sln file. 😉)

Along such lines, that is not to say that we still cannot make great improvements to the IDE to further the goal here. In my mind, the goal of the issue is to create an experience that allows:

  • Viewable in a desired data format (much like how .NET developers can choose to work in C#/VB/F#)
  • Can hand-edit like a champ (seems like this is closer to reality now ala above)
  • Visual/designer-tooling friendly (sort of the initial driver here)

There are two camps here: those that enjoy the CLI and ... those who don't. :) Both of these camps should be considered going forward to create a successful development paradigm.

The purpose of using a POCO-based development scheme is that the schema is automatically generated for you from the class definitions, and designer tooling automatically "lights up" when directed towards it. It makes it easier for both developer and tooling to use for development.

However, I am definitely open to any other innovative ideas and directions to accomplish this goal. I really enjoy reading @gulshan's thoughts and efforts in the different threads for a different object notation. Gradle seems to be the super hot cake right now, so maybe there might be something worth learning and integrating there. Maybe some cross between its format and a visual tooling integration of @ionoy's great work with AmmyUI (imagine working with a VS/MSBuild file but with monster intellisense on steroids) is the answer? To be continued. :)

I'm new to C# and .NET in general. Hearing about Core made me excited to learn C#. I booted up VS2015 and created a solution. I immediately felt at home, and things like project.json definitely contributed to that feeling.

I'm not the kind of developer who stays in their comfort zone. You know the type: too lazy/afraid to learn something new. Still, glancing at .csproj files makes my brain cells commit seppuku. Looking through the patch notes though, I must say the team has made some nice improvements.

If MSBuild support entices people to migrate legacy projects thus contributing to the future success of the platform, then I can get on board with it. But what if the cost of that is scaring away new developers? It's just one of those things that makes me think, "There must be a better way". As a padawan I can't say I know the way, but even a padawan can share their perspective and insight.

Luke: _Master, moving stones around is one thing. This is totally different._
Yoda: _No! No different! Only different in your mind. You must unlearn what you have learned._
Luke: _All right, I'll give it a try._

I think the downside is that, on the whole, Visual Studio is too opinionated either way. I have been a .NET developer since .NET 1.0 back in 2003. I have always hated MSBuild, probably mostly because in my mind it is so closely aligned with the worst of Microsoft, known as TFS. I have always preferred what was known as the ALT.NET stack of the day. That's not to say that there haven't been great products in the .NET stack as a whole. Probably most devs have a love/hate relationship with Visual Studio, just like most Java devs have a love/hate relationship with Eclipse or IntelliJ.

Personally I think this is very clearly portrayed in the comments above, there certainly isn't the ONE right way. Honestly whichever way the decision went, there were going to be losers. Yet another Brexit/Trump choice!

I did really like the project.json format. Found it very simple to grok, and it was easy to hand edit.
But that said, on the whole I do like the new csproj format.

What I don't like, and specifically the issue I have now, is that I am working in VS 2017 RC, just a couple of days away from release, and some basic things just aren't working. They are a direct result of the json/csproj shift, and have had a direct impact on my productivity.

I'm used to working with OSS products, and understand the risks of developing new software on release-candidate stuff. But a notification went out a few weeks ago warning of the change, with steps to take to prepare for it -- and yet things just don't work:

dotnet ef
System.IO.FileNotFoundException: Could not find file 'C:\Users\gary\code\portal\src\DBDummyUI\project.json'.
File name: 'C:\Users\gary\code\portal\src\DBDummyUI\project.json'

I have had to basically read the internet cover to cover only to find this post:
https://github.com/aspnet/Tooling/blob/master/known-issues-vs2017.md

Which kinda informs me that EF Core tooling will not work with the new csproj format. Yet we're still only days away from a release.

It's not a moan at the team or Microsoft. I know that at the end of the day it was always going to be a Hobson's choice. However, I do think there is a lot of scope in the argument for giving the community the choice of format, instead of giving them an opinionated implementation based on the fact that it works with MSBuild -- which is fine if you're going to use MSBuild, but what if you're not?

Apologies for the long ramble!

@garywoodfine you might be missing a reference to the msbuild version of the ef tools in your csproj (note the version number):

<ItemGroup>
  <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.0.0-msbuild3-final" />
</ItemGroup>

@dasMulli Thanks
I had
<DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.1.0-preview4-final" />

But now when I try
<DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.0.0-msbuild3-final" />

I just get "ef is not recognized as an internal or external command"

I have tried with both packages together and I am back to the original error

--Edit
So I finally have it working. The steps I went through are below, just in case anyone else hits this issue.

1. I removed the reference to
   <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.1.0-preview4-final" />
2. Then did a dotnet restore on the terminal.
3. Then added the reference to
   <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.0.0-msbuild3-final" />
4. Then did another dotnet restore on the terminal.
5. Then tried dotnet ef, and everything started working.

Thanks @dasMulli

@garywoodfine, @dasMulli: Can you please move this support discussion out into a relevant issue? 😄

I'm late to the game but wanted to chime in here:

Getting rid of project.json in favor of an MSBuild-based alternative will pretty much kill my present company's ambitions of supporting .NET at scale. I've recently inherited some .NET-based work that needs to be brought current and was pretty excited to see what has changed in the world of .NET over the past decade (I haven't actively used .NET since 2010 and stopped using Windows altogether in 2013). With a move back to Visual Studio, even if there is support for Mac, it's looking unlikely that there will be interest in supporting Microsoft platforms moving forward.

@udev MSBuild is now cross-platform, as it is built with .NET Core. Moving back to MSBuild doesn't mean going back to Visual Studio. The command line tools and VS Code with OmniSharp can be used to develop .NET everywhere.

I'm doing work on a Mac with VS Code and Visual Studio Mac, and the command line, and while I'm not a huge fan of the old MSBuild csproj, even I have to admit the new experience is kind of awesome.

I think Type Providers in C# will help a lot to fulfill the goal of this proposal.

In 20 years I have yet to work with a developer who cares one bit about proj files being in xml or some other format.

Only a handful even understand what the file actually does or how to alter it to make life easier or fix problems. This seems like work for work's sake.

We believe SDK style projects address this issue by making the project files concise. We have no plans to add new extension points to the project formats at this point.

Thanks for all the comments and discussion here.

@livarcocc - fine; like the NuGet team, you aren't interested in maintaining feature compat or even parity. That seems like a move that only serves to punish the userbase that's actually _interested_ in things like proj formats. I really think the feedback for this feature was not segregated between the "OMGPREVIEW fanboys", the project-file casuals that make up most of the .NET developer populace (to whom this sounds like a good idea on the surface; simpler always sounds better, until you need a feature that got chopped), and the power users for whom an implicit-content, still-undocumented project file is an echo of web site projects and a looming compilation and automation disaster.

Can you at least guarantee or promise we won't be forced to use these problematic and inferior SDK style project formats?

Can you at least guarantee or promise we won't be forced to use these problematic and inferior SDK style project formats?

In what way do you believe SDK-style projects are inferior?

Please point out anything non-factual. This is what it looks like from the outside.

  • They require package-reference-style NuGet. This means losing useful package features like adding files to a project as part of a (private) package, providing simple XDT config transforms, or running a PowerShell script to cover all the other possible gaps. It also means not knowing what dependencies are actually going to be included until the project is built and the output is inspected. I can't make a private package that is as useful to my team with PackageReference as is possible with packages.config.

  • there isn't an explicit list of things to be included in the compilation.

    • Stray files were a headache with web site projects; web app projects fixed that, but SDK brings back that indeterminism for all types of projects. Tools that create extra files or folders in the project filesystem have to be cleaned up after, sloppy development/filesystem habits create more problems than before, and one of my personal favorites, the vestigial Visual SourceSafe .vssscc and .vspscc files, get even more of a foothold in version control.

    • inspecting projects across codebases can't be done by looking at the proj file; now the constituent files to include (which ones?) need to be downloaded. Hopefully VCS or FS or the user doesn't have a whoops and miss a file, because the project won't know unless it compiles.

    • automating some changes to projects is significantly more difficult for the same reasons as inspection, but the breadth of defaults, and that so few of them are documented, makes reworking any existing tooling a time-sink SWAG session.

The history of this format reads like it's the recovery from the A-ha moment when you all (mostly) realized that JSON is great for data exchange but horrible for configuration. Instead of taking that lesson, you forged ahead when it wasn't necessary in the first place. It didn't need to be done; there are other things needing modernization (sln files, easier WCF REST support, etc.)

Hopefully the impetus was not inspired by statements like "my company will not adopt/will abandon netcore hinges on a project format" because they are empty and dumb statements. Anyone holding true to that deserves what they get from that and other dumb choices.

This is what I can muster before breakfast on my phone, and may get a few typographical or formatting edits when I can see more than a paragraph at a time.

@StingyJack Parts of that don't ring true to my experience. Packages that add files to projects can be done using a .props file, for example. It never seemed useful to me to keep the dependency graph in source control, but you can do that with a lock file: https://docs.microsoft.com/en-us/nuget/consume-packages/package-references-in-project-files#enabling-lock-file
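
For what it's worth, here's a minimal sketch of the .props route (the package id MyCompany.Shared and the file names are hypothetical); NuGet automatically imports build/<PackageId>.props from a PackageReference package into every consuming project:

<!-- build/MyCompany.Shared.props, shipped inside the .nupkg -->
<Project>
  <ItemGroup>
    <!-- $(MSBuildThisFileDirectory) resolves to the package's build/ folder -->
    <Compile Include="$(MSBuildThisFileDirectory)..\content\SharedAssemblyInfo.cs" />
  </ItemGroup>
</Project>

It can't run a script at install time the way install.ps1 could, but it does get items into the consuming project.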

there isn't an explicit list of things to be included in the compilation.

From the production or consumption side? When creating packages, it's explicit except for the single built assembly.

inspecting projects across codebases can't be done by looking at the proj file; now the constituent files to include (which ones?) need to be downloaded

This is a relief to those with frequent merge conflicts who can't wait to get the rest of their projects on the newer csproj format. Having to be explicit always felt like duplication because of how basic the includes are.

automating some changes to projects is significantly more difficult for the same reasons as inspection, but the breadth of defaults, and that so few of them are documented, makes reworking any existing tooling a time-sink SWAG session.

Many things were always implicit and idiosyncratic. The cleaner SDK pushes that further along the same path. I think this is the thing https://github.com/daveaglick/Buildalyzer solves.

Just to jump in and echo some of the comments:

there isn't an explicit list of things to be included in the compilation.

From the production or consumption side? When creating packages, it's explicit except for the single built assembly.

It's been a while since I've looked at the new SDK format, but I believe it's from the consumption side; consider a project layout similar to the following:

└───Project
       A.cs
       B.cs
       Project.csproj

Based on my recollection, Project.csproj will helpfully attempt to include everything under itself as part of the project. This is bad for the reasons that @StingyJack mentions above.

The explicit nature ensures that the build system is doing what the developer asked for, not what was "helpfully found". It is very possible that a developer has forgotten to check in a source file, and unless you are explicit, this could very well result in a run-time failure as opposed to a build-time one. Consider a class library that utilizes Dependency Injection/IoC in which the classes contained within B.cs are consumed. If a developer forgets to commit B.cs under an implicit model, this is not discovered until runtime (hopefully in internal testing, but as per Murphy, at a customer site for sure).

There are other reasons to have extraneous files in a sub-folder that you explicitly do not want included. I know for a fact that in our large code base there are places where this is actually by design. Consider this pattern:

└───MyClassLibrary
        A.cs
        ATests.cs
        Implementation.csproj
        UnitTests.csproj

In this case you want A.cs to be included ONLY in Implementation.csproj, whereas you want ATests.cs to be included ONLY in UnitTests.csproj. You can argue the merits of shoving this all into a single source folder (I know I have tried), but this is the reality for a lot of large development shops. It is difficult to get buy-in from stakeholders to refactor projects which previously "worked".
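
For what it's worth, a sketch of how that layout has to be expressed with SDK-style globbing: each project must remove the other's files, since both default globs pick up the whole folder.

<!-- Implementation.csproj: exclude the test sources picked up by the default glob -->
<ItemGroup>
  <Compile Remove="ATests.cs" />
</ItemGroup>

<!-- UnitTests.csproj: exclude the implementation sources -->
<ItemGroup>
  <Compile Remove="A.cs" />
</ItemGroup>

(In practice two projects sharing a folder would also need distinct intermediate output paths, but that is beside the point here.)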

inspecting projects across codebases can't be done by looking at the proj file; now the constituent files to include (which ones?) need to be downloaded

This is a relief to those with frequent merge conflicts who can't wait to get the rest of their projects on the newer csproj format. Having to be explicit always felt like duplication because of how basic the includes are.

I would take the opposite side of this argument. We maintain 16 branches of our code base (yeah, 16...). While on the surface I agree with you due to the number of merge conflicts, there are things that can and should be done to try and minimize this. For starters, ensuring that the project file is sorted in a deterministic manner (we settled on quasi-alphabetization) is helpful. I agree that having to write tooling to support this is not fun, and you can see that my GitHub is littered with tooling to do so, but the reality for us (and other Build Masters I've talked to in similar industries) is that it's just par for the course.

The ability to quickly diff project files between branches is critical in understanding what is and what is not making it into the final binaries. Anything that can mux the output (by adding "indeterminism") is considered a defect from the DevOps world.

If you do like the manual maintenance of all .cs files, you can put <EnableDefaultCompileItems>false</EnableDefaultCompileItems> in Directory.Build.props or a csproj. <EnableDefaultItems>false</EnableDefaultItems> might also be interesting to you. See https://docs.microsoft.com/en-us/dotnet/core/tools/csproj#default-compilation-includes-in-net-core-projects. The defaults are right for the majority of projects, and you can override them if they aren't right for your project.
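
A sketch of what that opt-out looks like end to end:

<!-- Directory.Build.props at the repo root; applies to every project beneath it -->
<Project>
  <PropertyGroup>
    <EnableDefaultCompileItems>false</EnableDefaultCompileItems>
  </PropertyGroup>
</Project>

<!-- each .csproj then goes back to listing its sources explicitly -->
<ItemGroup>
  <Compile Include="A.cs" />
  <Compile Include="B.cs" />
</ItemGroup>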

How common is the problem of forgetting to add a source file to source control (and without failing CI)? New files show up prominently using Team Explorer (Git or TFVC) or VS Code or git from the CLI (posh-git).

The ability to quickly diff folders between branches using source control is powerful. It's more complete than diffing csproj files in my experience, and the diff UI lets you drop right down to the changes within any of the files. Files are the starting point, the ultimate source of truth.

How common is the problem of forgetting to add a source file to source control (and without failing CI)? New files show up prominently using Team Explorer (Git or TFVC) or VS Code or git from the CLI (posh-git).

At least once a week internally. We have ~70 developers averaging 200 commits a week; the average commit size is 3 CSPROJ files +/- 20 CS files, most are changes to existing files with a few additions sprinkled in. FWIW we're using Subversion, but I doubt changing VCS would really help (we already use Ankh, which provides the overlay). Even your absolute best developers (your 10x devs) will occasionally make a mistake. It's inevitable at this scale and rate of change.

YMMV; apparently it doesn't happen for you. I wish I were there (more than you can know).

The ability to quickly diff folders between branches using source control is powerful, more complete than diffing csproj files in my experience. Files are the starting point, the ultimate source of truth.

We don't disagree; however, the scale at which this occurs is much larger than most diffing tools respond to reasonably quickly. Our branches are ~670,000 files in 60,400 folders.

We have ~5,800 CSPROJ files, so quickly performing diffs at the CSPROJ level is much more efficient. We also have massive amounts of automated tooling (both commit hooks and a tool similar to SapFix, which we call Tattler), and having the structured, deterministic format of the CSPROJ files is a huge boon: we capture most of these failures very early in the process.

As @StingyJack mentions, there are other places where the functionality would be very useful. For example, this hits right in the feels for me:

It didn't need to be done; there are other things needing modernization (sln files, easier WCF REST support, etc.)

And FWIW, you might want to take a look at this closed idea: https://github.com/dotnet/cli/issues/12858. I am working on open-sourcing our tooling that does something similar. @StingyJack, it sounds like you would benefit from it (same boat, you and I, on the River Styx).

If you do like the manual maintenance of all .cs files, you can put <EnableDefaultCompileItems>false</EnableDefaultCompileItems> in Directory.Build.props or a csproj.

Thank you; we will be enabling this shortly. Today we do not use the new-style project format (mostly because we do not have much .NET Core), but if all goes as planned I am sure we will need it sooner rather than later.

@aolszowka We have ~60 devs plus a lot of consultants, and I have _never_ seen a case where a missing source file made it through CI (not to mention compilation). The setup you describe sounds absolutely horrifying :) How in the world can a _missing_ source file pass CI? I understand that theoretically, in some cases, reflection over types in an assembly could compile without error (though even that should be extremely rare), but what does it say about the CI setup if it cannot catch basic things like missing source code?

For us (having almost 200 repos in our main GitHub org), the new project format, and especially the implicit includes and package references (death to packages.config! :), have been a _huge_ advantage. Previously we actually developed our own hacky little tool to deal with merge conflicts because we wasted so much time on doing it manually, but since we started using the new project format we almost never need it.

How in the world can a missing source file pass CI?

It doesn't today, because you have to explicitly include the file (that is the point of the entire conversation and the example above). Therefore it gets kicked out in CI.

The proposal is to remove this functionality.

CI setup if it cannot catch basic things like missing source code

We completely agree, which is why we will utilize the explicit <EnableDefaultCompileItems>false</EnableDefaultCompileItems> mentioned above. 👍

It was probably added to support a similar scenario.

Previously we actually developed our own hacky little tool to deal with merge conflicts because we wasted so much time on doing it manually, but since we started using the new project format we almost never need it.

This looks like it was obviated by the use of PackageReference; we previously had a similar tool to yours to sort packages.config in a deterministic manner, as you did. That being said, you still encounter the same issue today, which is why our new tooling simply enforces that PackageReference entries are sorted in a deterministic manner (alphabetized by the Include attribute).

The same issue exists today for just about any element. For our devs it's most painful when dealing with <Compile>, <ProjectReference>, and <PackageReference> tags, as these have high rates of change for us; hence we enforce deterministic ordering to give the merging tools a fighting chance at the correct solution.

I assume this is what you mean when you say:

we almost never need it.

I would be interested to hear what other corner cases you encounter; it seems like we're not alone.

@petertiedemann - keep in mind it's not just missing files, it's extra files.

This is a relief to those with frequent merge conflicts who can't wait to get the rest of their projects on the newer csproj format. Having to be explicit always felt like duplication because of how basic the includes are.

@jnm2 - Perhaps the solution to this would have been to reliably order most of the contents of the project file, as should be done with every other computer-generated non-data file, so changes were not appearing haphazardly in the file. That would probably have eliminated much of the angst by allowing most merge tools to handle it automatically, and it would have been a simpler fix. However, it alone would not have helped the merge and compare tools built into the world's premier IDE, which have reached a new all-time low in quality.

Every day, I need to be able to do more with less. Finding and fixing a bad pattern that's been repeated in hundreds of projects is harder without the file manifest and with so many still undocumented defaults.

Re: packages, I mean that now I can make a dll-only package, but I can no longer make a package that co-workers can add that will also take care of configuring the consuming project, or add CS or TT files to the project. Those all now have to be done manually.

@aolszowka Sorry if I wasn't clear. We are already using the implicit includes, and have never encountered similar problems (as I mentioned, we see the _implicit_ includes as a huge productivity improvement over _explicit_ ones). My point was that, _given_ implicit includes, how can you have your CI miss something like this except in some very exotic scenarios? It would seem that in the 90+% case you will simply have a missing type error somewhere and fail to compile. If you are using reflection / automated DI then I guess it could compile, but then I would be very worried if a test did not catch the problem.

This looks like it was obviated by the use of PackageReference; we previously had a similar tool to yours to sort packages.config in a deterministic manner, as you did. That being said, you still encounter the same issue today, which is why our new tooling simply enforces that PackageReference entries are sorted in a deterministic manner (alphabetized by the Include attribute).

Actually, we rarely have the issue today, because you now only have to deal with the package reference, and not the duplicated entries in the csproj file. I do agree on maintaining it sorted though.

Our project files are almost empty, actually. We have a shared project file that we import that has things like copyrights and namespace setup, and the rest we just leave to the implicit includes / default behavior. A typical project file is ~20-30 lines.
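
Roughly like this, to give an idea (names and values made up):

<!-- Common.props, shared and imported by every project -->
<Project>
  <PropertyGroup>
    <Company>Example Corp</Company>
    <Copyright>Copyright (c) Example Corp</Copyright>
    <RootNamespace>Example</RootNamespace>
  </PropertyGroup>
</Project>

<!-- a typical .csproj -->
<Project Sdk="Microsoft.NET.Sdk">
  <Import Project="..\Common.props" />
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>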

Perhaps the solution to this would have been to reliably order most of the contents of the project file, as should be done with every other computer-generated non-data file

For most projects, folks want to leave behind the idea that a csproj is a computer-generated file.

and with so many still undocumented defaults.

Are there any undocumented defaults? The link I gave earlier seemed pretty thorough in the area I read.

Doesn't having explicit includes just move the problem? Back before SDK projects, I often saw commits that included a new source file but forgot to add the project file change. Either way, these were usually caught by CI due to being referenced by other files, but it was pretty easy to make this mistake.

how can you have your CI miss something like this except in some very exotic scenarios? It would seem that in the 90+% case you will simply have a missing type error somewhere and fail to compile.

If you are using reflection / automated DI then I guess it could compile

It seems like you answered your own question? I am not sure why you feel the need to debate this point when you know the answer?

As @StingyJack mentions, you don't account for the extra files (the unit test scenario I listed above for you).

Regardless, we have a documented workaround; I am not sure why there is any need to continue discussion on it?

because you now only have to deal with the package reference, and not the duplicated entries in the csproj file. I do agree on maintaining it sorted though.

And this new package format does nothing to solve these problems (again, what @StingyJack is pointing out here). For a trivial number of PackageReferences/ProjectReferences this is manageable, but when you get upwards of 50 of them it gets out of hand without any type of sorting enforced.

@owenneil

Doesn't have explicit includes just move the problem?

Yes, but it moves it back to the developer (earliest in the process), and even better, it gives your gated check-in a fighting chance to interrogate the CSPROJ files on add; we have a commit hook that does exactly this (if a CS file is added or removed, ensure that the appropriate changes were made to the CSPROJ file).

@petertiedemann - keep in mind it's not just missing files, it's extra files.

But in a "normal" setup this really wouldn't be a problem, would it? In all of our ~200 repos (everything from web services, constraint engines to domain specific language parsens and compilers) we do not have source files in the project folder that we do not want to include (in either the ones using the new and the old project format). And even if you do have some exotic projects with this problem, you simply disable the implicit includes or explicitly exclude them.

It seems like you answered your own question? I am not sure why you feel the need to debate this point when you know the answer?
...
Regardless, we have a documented workaround; I am not sure why there is any need to continue discussion on it?

I guess because the opinion voiced by you and @StingyJack seemed to be that implicit includes as default was poor design, but the cases brought up were pretty exotic, meaning that implicit includes indeed does seem to be an excellent design choice. The unit test scenario that was described basically goes against common practice for how to structure projects, and it is not one I have ever seen before. The runtime reflection CI/CD scenario is just scary design by itself, and it seems almost unthinkable that no tests would exist to fail in that case (you have defined a type that is never used in any test or other project?).

And this new package format does nothing to solve these problems (again, what @StingyJack is pointing out here). For a trivial number of PackageReferences/ProjectReferences this is manageable, but when you get upwards of 50 of them it gets out of hand without any type of sorting enforced.

Well, the PackageReference certainly helps a lot. Before, we had people giving up on rebasing branches because of conflicts in dll references; now we rarely need to spend time on it.

But your projects have ~50 references? That's quite a bit I must say, but I am still surprised that they would change often enough to cause significant problems. We have considered expanding our tool to deal with PackageReferences as well, but it would take far more time to implement than we spent on manually dealing with it.

@jnm2

Are there any undocumented defaults? The link I gave earlier seemed pretty thorough in the area I read.

Depends on the context; I read that to indicate that the project file is still a black box in some respects. Microsoft has made HUGE strides in documenting the behavior and intent of the project format in the last few years with Microsoft Docs (and we thank them for this!), but gaps remain.

The first one that comes to my mind is the behavior of references in the newish system; there are some very subtle bugs around the restore logic. See this NuGet issue: https://github.com/NuGet/Home/issues/8272. Many of these are not noticed until they are attempted at scale, and only after careful evaluation.

I am sure we could continue to cherry-pick behaviors, but it's most likely irrelevant to the discussion at hand.

@petertiedemann

I guess because the opinion voiced by you and @StingyJack seemed to be that implicit includes as default was poor design,

I don't believe I ever used the words "poor design". I noted that:

Anything that can mux the output (by adding "indeterminism") is considered a defect from the DevOps world.

Which, again, can be worked around simply by using the posted workaround: <EnableDefaultCompileItems>false</EnableDefaultCompileItems>

but the cases brought up were pretty exotic, meaning that implicit includes indeed does seem to be an excellent design choice.

As to your point of exotic or not: one man's exotic is another man's common. It's the world we live in, and the tools were/are written such that you are free to suit them to your needs. I think StingyJack's frustration (along with mine) is that these scenarios are written off as corner cases (as you do below).

The unit test scenario that was described basically goes against common practice for how to structure projects, and it is not one I have ever seen before.

The runtime reflection CI/CD scenario is just scary design by itself, and it seems almost unthinkable that no tests would exist to fail in that case (you have defined a type that is never used in any test or other project?).

Hey, neither had I; but we're here today, aren't we? (See the above.) I think your inability to accept that the system is being used in perverse ways is causing you a lot of frustration, and I apologize. I am not here to advocate that this is by any means good or acceptable, nor to question the design per se. I am simply trying to be inclusive of current use cases that are affecting consumers of the product.

But your projects have ~50 references? That's quite a bit I must say, but I am still surprised that they would change often enough to cause significant problems.

Oh yes! Easily. The problem is so bad I wrote a tool to help visualize it! (Shameless plug: https://github.com/aolszowka/MsBuildProjectReferenceDependencyGraph) In fact, there is a CI process around automatically generating these graphs on every commit, just so we can keep developers apprised of the next low-level change that is bound to burn you on your next svn update! Those graphs are committed in their own repository so people can keep updating to get the latest.

Here's an anonymized version of a solution file very commonly used by most developers. Throw it in your favorite program that supports rendering dot graphs (GraphViz, for example; it's so complex WebGraphViz will not work): https://gist.github.com/aolszowka/a93ea5545d54344c61f66830fae90c4e It takes about 15-20 minutes to render on my personal workstation. Oh, by the way, that is just ProjectReferences; the tool makes no attempt to graph PackageReferences (yet; I accept pull requests!).

It's been a real joy to watch the Roslyn team experience large solutions in Visual Studio (Roslyn.sln would be considered on the small side internally). On the plus side, they have made HUGE strides towards fixing it (because it burns them day to day), so at least it's now possible to load these in VS 2017+. I am still hopeful they reach the end of the road soon and realize they need to make Visual Studio 64-bit, so then we aren't crashing 3-5 times a day due to OOM errors.

but it would take far more time to implement than we spent on manually dealing with it.

Oh, how I envy you, but remember, much like puppies and kittens: they start out small, but they grow up! At one time these branches were maintainable (probably why the attitude was "well, what's one more?").

At least once a week internally. We have ~70 developers averaging 200 commits a week; the average commit size is 3 CSPROJ files +/- 20 CS files, most are changes to existing files with a few additions sprinkled in. FWIW we're using Subversion, but I doubt changing VCS would really help (we already use Ankh, which provides the overlay). Even your absolute best developers (your 10x devs) will occasionally make a mistake. It's inevitable at this scale and rate of change.

@aolszowka
FWIW, my team had this problem all the time years ago when we were on SVN. After the migration to Git it never happened anymore. Especially since we started gating checkins on CI Passing.

@AlgorithmsAreCool

Especially since we started gating checkins on CI Passing.

For sure; I think there is some confusion here. We gate these check-ins today via a Subversion commit hook (which we keep metrics on, which is why I can tell you how often it happens); however, the Subversion commit hook needs to utilize the CSPROJ to determine what is "correct". See this comment:

Yes, but it moves it back to the developer (earliest in the process), and even better, it gives your gated check-in a fighting chance to interrogate the CSPROJ files on add; we have a commit hook that does exactly this (if a CS file is added or removed, ensure that the appropriate changes were made to the CSPROJ file).

I guess because the opinion voiced by you and @StingyJack seemed to be that implicit includes as default was poor design, but the cases brought up were pretty exotic, meaning that implicit includes indeed does seem to be an excellent design choice

If I didn't say that, I mean it, and I don't say that just to be combative or flippant. There is nothing exotic about having a manifest of what is expected to be included into the final result. Lack of that manifest opens the door for unexpected ("exotic?") things to be included into the final result. Every other profession that creates something has an explicit list like this. To put it differently...

Would you eat a meal when you knew the chef was not in control of the ingredients and preparation of that meal?

Would you permit a renovation of your home when you knew the general contractor was not checking the gauge of electrical wiring used or if it was copper or aluminum before installing it?

Would you take a medication if you knew the producer was not in total control of the manufacturing and packaging of that medicine?

These other "creation" industries may be regulated and thus required to use an explicit list, but the ones that aren't will use one because its necessary for planning and because it is an easy way to have a reasonable expectation of similar quality of output between different efforts. The second part is the key for us; we cant incrementally improve upon something if that something can change in quality without us even knowing it. I recently discovered something like this had been happening in a python project I work on. Some builds would just be _weird_, some unusable. I found that early on, someone had added a bunch of packages into the setup.py file and missed the comma between an upper and lower version range. It was an easy thing for at least 5 skilled developers to miss for almost a year. Pip (python's nuget.exe) read the whole thing as the lower version with some prerelease tag and started always including the latest version even if it was an alpha package.

Implicit code inclusions, wildcard/floating version dependencies, and automatically included transitive references are a convenience, but this convenience comes with very real risks to your project's success, with the latter putting the project at the mercy of the release schedule and release quality of any package author in the dependency chain (or un-release schedule; see "left-pad"). Assuming this risk should be an opt-in, not the default. It's not setting programmers up for the "pit of success" by any stretch.
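
To make the floating-version part concrete (the package name here is hypothetical), the difference is a single attribute, but only one of these gives a repeatable restore:

<ItemGroup>
  <!-- floating: restore takes the highest 1.x available on the feed that day -->
  <PackageReference Include="Some.Library" Version="1.*" />

  <!-- pinned: restore resolves the same package every time -->
  <PackageReference Include="Some.Library" Version="1.2.3" />
</ItemGroup>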

Also, we aren't usually involved in making software that has immediate life-or-death consequences, but we are all involved in making software that has quality-of-life consequences for users, ourselves, and our fellow programmers. The .NET community tried this implicit file experiment in the .NET 1-2 era with Web Site Projects, and while it was great when you had the one exotic use case where you needed to update compilable files on the fly, it sucked to have to manage everything in the project based on the presence or absence of files (or phantom files: .dll.refresh) or specific file extensions (.excluded). But what really sucked was when some kind of pollution (like an unintended file, or the wrong version of a file) crept in and you had to troubleshoot the result. Anyone wanting to try that out is still able to do so, but most of us use Web Application Projects instead, where we can control the inputs and outputs to achieve a predictable result.

@StingyJack:

There is nothing exotic about having a manifest of what is expected to be included into the final result. Lack of that manifest opens the door for unexpected ("exotic?") things to be included into the final result. Every other profession that creates something has an explicit list like this.

Why isn't Git enough of a "manifest" to you? If you are adding heaploads of files to Git that aren't supposed to be there or as a part of the final product, perhaps you should revise your development process?

@asbjornu - I think you mean "version control system" and not specifically Git, correct? This isn't about version control in the first place, and AFAIK MSBuild still requires the files to be present on a filesystem, not in the VCS of our choosing. The abridged comment I made above regarding project files, with all these implicit inclusions and defaults, is about the project file. It was never a heap of files that caused problems with web site projects; it was always that one file that resulted in several hours of troubleshooting.

I'm curious to know how you would entertain those three questions I posed about other professions, because the answer from me is going to be No, No, and No. If you also would answer "No" to those three, then consider the point of view of a business in our profession. Would they trust a programmer to make something important for their business who does not control what goes into the program?

If you want to take a shortcut with implicit includes, and you can manage the associated risk, I've got no complaints. But making all of these implicit includes the default behavior, so that I now have to do work to mitigate a risk that did not exist with the prior format, is something I am going to complain about.

@StingyJack:

I think you mean "version control system" and not specifically git, correct?

Nope, I mean Git, since most other version control systems don't have a cryptographically verifiable history, signed commits and signed tags. With these in place, you have a pretty strong source of truth for what should and should not be considered a part of the resulting application.

If you currently allow developers to add any strange file to your VCS without any code review, verification, sign-off, or other editorial process in place, I say your development process is to blame here, not the project system. Most development platforms take the same route of implicit inclusion: Node.js, Ruby, Python, PHP, Go, Rust, Docker, etc.

Would they trust a programmer to make something important for their business who does not control what goes into the program?

Why do you claim the programmer doesn't have control over their VCS (preferably Git)?

Nope, I mean Git, since most other version control systems don't have a cryptographically verifiable history, signed commits and signed tags. With these in place, you have a pretty strong source of truth for what should and should not be considered a part of the resulting application.

A "cryptographically verifiable history, signed commits and signed tags" does not describe intent of the developer. You can cryptographically sign anything you want; that property does not mean that its contents should be trusted (and is the jist of @StingyJack 's argument).

Why do you claim the programmer doesn't have control over their VCS (preferably Git)?

This assumes that the developers in the system are competent or at least not malicious. Ask event-stream how that worked out for them (and everyone upstream that got burned; "I don't know what to say" was, for me, the quote of 2018).

Hi folks,

We think for most people, automatically including all files with the right extension makes sense. For those who don't want that behavior, you can turn it off by setting the EnableDefaultItems property to false.

This is in line with the philosophy we had when designing the updated project files, which was to have sensible defaults that could be overridden when necessary.

There's been a ton of discussion on this issue and it seems like it's been a catch-all for any comments about the project file format. That makes it less likely that we will pick up on and address feedback. If you have concrete things causing you trouble, I recommend creating new issues for those.

Thanks!
Daniel

@aolszowka:

A "cryptographically verifiable history, signed commits and signed tags" does not describe intent of the developer.

The commit log should reflect the developer's intent. Every single commit can be signed to identify the developer of said code. A code review signed off by someone else can double-verify the intent. A signed and tagged merge-commit of the reviewed commits can triple-verify the intent. How many verifications do you need?

You can cryptographically sign anything you want; that property does not mean that its contents should be trusted (and that is the gist of @StingyJack's argument).

Can no one in your team be trusted, not even a chain of command that is able to sign off on code reviews or tagged merge-commits?

Why do you claim the programmer doesn't have control over their VCS (preferably Git)?

This assumes that the developers in the system are competent or at least not malicious.

If all your developers are incompetent and/or malicious, having explicit includes in project files won't make a difference. If having three different people at different levels in the chain of command cryptographically sign and thumbs-up a range of commits isn't sufficient for you (it is for code submitted under a PCI-DSS review process), then nothing will ever be.

Ask event-stream how that worked out for them (and everyone upstream that got burned "I don't know what to say" for me was the quote of 2018).

Irrelevant and incomparable to the process I'm describing. If a hostile take-over of a Git repository is possible in your process and VCS, your process and VCS is broken, not your project system.
