Vcpkg: "Why not Conan?" FAQ section confuses and misrepresents

Created on 1 Jan 2017 · 4 comments · Source: microsoft/vcpkg

All 4 portions of the "Why not Conan?" section of the FAQ either lack logical coherence or are completely orthogonal to technical merit and tool suitability.

The "Public federation vs private federation" section argues against permitting individuals to publish packages, stating

"we believe there should be a single, collaboratively maintained version which works for the vast majority of cases and allow users to hack freely on their private versions. "

You don't need extreme (and unfortunate) technical limitations in order to create a walled garden. Just run your own Conan server, and only authorize uploads from the microsoft account. You can keep the ports directory - just replace 'CONTROL' with conanfile.py (there's an automated tool for it). Only permit CI to upload with that account - and voilà! You've perfectly mirrored the existing access permissions and model for vcpkg, and done so without crippling user freedom.
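
For concreteness, a minimal sketch of that lock-down in conan_server's server.conf (section names as in the Conan server docs; the microsoft account and its credential are hypothetical):

[read_permissions]
# anyone may download any package
*/*@*/*: *

[write_permissions]
# only the 'microsoft' account may upload
*/*@*/*: microsoft

[users]
microsoft: some-ci-only-credential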

You can argue for curation, sure, but baking that into the design for build dependencies doesn't end well. I love Homebrew, but we're lucky to get non-core packages like libvips updated within 2 months. Reliance on curated listings doesn't scale. If you do curation, strictly limit it to what you can commit to manage and respond to 0days for. Encourage and support the community in doing the rest.

If you want to improve discoverability of 'official' packages in Conan, submit a PR for the package sorting algorithm. A convention of prioritizing packages where the username and package match would be a nice, zero-maintenance way to allow authoritative versions through username reservation. Or just only allow 1 instance per package name. We're talking about trivially tweaking the package server. Your first reason lacks substance.

Per-dll vs Per-application. "When dependencies are independently versioned on a library level, it encourages every build environment to be completely unique, unable to take advantage of, or contribute to, a solid, well tested ecosystem [emphasis mine]. In contrast, by versioning all libraries together as a platform (similar to a system package manager), we hope to congregate testing and effort on very common sets of library versions to maximize the quality and stability of the ecosystem. This also completely designs out the ability for a library to ask for versions that conflict with the application's choices (I want openssl Z and boost X but X only works with openssl Y)."

This contradicts decades of evidence. All other development package managers (that I know of) allow libraries to specify dependencies. This isn't something the entire world got wrong. ABI fragility isn't unique to C/C++, just accentuated. We'll talk about system package managers (and their role in C/C++ software development) later.

Conan supports per-application versioning, and dependency version overrides. Your application just uses another conanfile.py to specify dependency versions. That said, I've been just fine (and override free) with Conan's ecosystem.
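
As a sketch of what that looks like (using the override marker from Conan's requires syntax; the package references are illustrative):

from conans import ConanFile

class MyAppConan(ConanFile):
    requires = (
        "Poco/1.7.3@lasote/stable",
        # The application gets the final word, even if a transitive
        # dependency asked for a different OpenSSL:
        ("OpenSSL/1.0.2j@lasote/stable", "override"),
    )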

The maintainers of a library are usually in the best position to know which versions of its dependencies should be the default. You use the word ecosystem. I do not think that word means what you think it means. For in the previous paragraph, you said:

we believe there should be a single, collaboratively maintained version

That is... not an ecosystem. But we will discuss ecosystems later. You continue:

we hope to congregate testing and effort on very common sets of library versions to maximize the quality and stability of the ecosystem.

So... "versions". Okay. I like the concept of opt-in platform versions. I.e., "here's a set of library versions, all of which are compatible." And here's a new set of versions, etc.

That's great. That could also come in the form of:

  • A git repository with a single plain text file of conan version numbers (a sketch follows this list).
  • A trivial pull request to add said feature to Cargo.
  • Something not claiming to replace a package manager. I like the name "Port Tree". It doesn't imply anything more than it is - a collection of build scripts for various curated packages.
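
Such a file barely needs a format at all. A minimal sketch, assuming hypothetical package references pinned as one known-good set:

# platform-2017.01.txt - one known-good set of Conan package references
Boost/1.62.0@lasote/stable
OpenSSL/1.0.2j@lasote/stable
Poco/1.7.3@lasote/stable
zlib/1.2.8@lasote/stable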

Doesn't vcpkg only support 1 (user-wide) set, though?

An enforced system-wide version set is exactly the pain we've been trying to escape from for decades on linux/mac. We've invested tremendous effort in escaping this situation. It completely prevents cross-platform development, because no two operating systems will contain the same version set.

Don't you think designing out the ability for libraries to specify dependency versions is a bit overkill? That's one way to ensure an ecosystem can never develop, because packages can only have good usability if they live in an official platform set.

Your choice of OpenSSL as a conflict example is a good one. Across my projects, breaking OpenSSL API changes have been painful. But none of that pain was present for Conan-managed libraries. I can clearly see what versions libraries support, and if I need to update something, it only takes a few minutes. How would a Windows-system-wide copy of OpenSSL be any less painful with vcpkg than it was with homebrew or apt-get? Please describe - this would be an excellent addition to the FAQ. I would describe the balance and boundary between curated and non-curated software, and how the size of that surface affects the suitability of this tool.

Reason 2 is hyperbolic doublespeak.

Cross-platform vs single-platform. "While being hosted on many platforms is an excellent north star, we believe the level of system integration and stability provided by apt-get, yum, and homebrew is well worth needing to exchange apt-get install libboost-all-dev with brew install boost in automated scripts."

I don't use homebrew or apt-get or Chocolatey to manage my dependencies. I could, in theory, develop within Docker containers, and use apt-get or homebrew inside them. But I'm not forced to, because Conan exists. I have full package isolation, and can support as many different sets of versions as I want. Today, I'm simultaneously running Conan on OS X 10.11, Windows 10, and Ubuntu 14.04. I have several different projects with different needs, and everything JUST WORKS.

This is nice. It's what I'm used to experiencing with RVM+Bundler, Virtualenv+pip, Rustup+Cargo, npm, and even NuGet (modulo a few bugs).

Let's clarify some terms:

  • System package managers: apt-get, homebrew, Chocolatey. Only one version of a package can exist on the system. One can have gcc and gcc5.3, but they are separately named packages. You can't run gcc5.3 by invoking gcc. The other limitation of apt-get, homebrew, and Chocolatey is that they only install with root access.
  • Dependency package managers: Bundler, npm, Cargo, pip, NuGet. Usually associated with a certain language or set of languages, where the package manager's design often takes specific needs of the language's compiler/runtime into account. These usually rely on the system compiler/runtime for their language, and are therefore combined with environment managers so that developers can work on more than 1 project at a time (assuming different projects need different versions). There are usually multiple instances installed if you're using a version manager. Many offer primitive runtime/compiler switching, or are intentionally environment/version-manager friendly.
  • Environment managers: rvm, rbenv, nvm, virtualenv, dnx, rustup. These allow you to use different compilers/runtimes - and different package manager versions - for a given language at the same time. They may shuffle storage data around so that package managers (if using user-local storage) aren't reusing the same caches against multiple runtime versions.

In developing C/C++, we're constrained by our system package manager's curation as to what compilers we can have simultaneously installed. We do have the CXX and CC env vars, however, and Conan helpfully provides compiler switching. It doesn't redefine any behavior outside of its process duration, however, so in this respect it is more like rbenv than rvm.
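
For illustration, a minimal sketch of that per-invocation switching (compiler names and exact flag spelling vary across Conan releases; treat this as an assumption, not a verbatim recipe):

# Only this process sees the alternate compiler; nothing system-wide changes.
CC=clang-3.9 CXX=clang++-3.9 conan install . -s compiler=clang -s compiler.version=3.9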

System-wide build dependencies preclude reasonable cross-platform development

System-wide package management works smoothly when you're not adding any custom code to the mix, and you're not at all picky about what version of a tool you end up with. This constraint is often unacceptable, so modern tools often ship in docker containers. A new generation of languages appeared just to solve this problem through pervasive static linking (which has its own servicing concerns).

Developing software against system-wide dependencies is a nightmare. We have no recourse against breaking changes in security updates, no method for compiling two (completely separate!) tools that need different libpng ABIs. This is unnecessary pain.

This is a very strange point in time (given Microsoft's new strategy) to begin (deceptively, or through ignorance) promoting approaches that tie developers to Windows and ensure future pain.

"We chose to make our system as easy as possible to integrate into a world with these very successful system managers -- one more line for vcpkg install boost -- instead of attempting to replace them where they are already so successful and well-loved."

There are successful system managers in this world, yes. I'd also say OneGet might deserve the designation. Your design does not make vcpkg easier to integrate. This paragraph tries to frame vcpkg as filling the role of a systems package manager, and is very cleverly worded.

But the goal of improving developer experience across operating systems is not well served by vcpkg. Linux package managers came first, and distributions offered floating, but usually compatible, versions of libraries and tools in curated repositories. One could switch 'stable/beta' channels by changing repositories. But compatibility between distributions was still poor, and we often, through time constraints, had to pick one distribution at a time to support. Backports might happen. So when we started bringing those apps to Mac, we needed to mimic that set of dependencies. Today, Homebrew recipes often use precompiled binaries, and often fetch specific dependency versions. Homebrew also makes it possible to install specific recipe versions.
We didn't, and can't, improve developer experience by increasing the number of permutations developers must troubleshoot.

It should be telling how unified and focused the entire industry has been on eliminating system package management. Dependency hell is painfully expensive in development, but catastrophic in production. Thus we got Docker, and Ubuntu Snap Packages, and a hundred iterations of containerized packages, and a dozen new cloud operating systems, all with the goal of NOT sharing dependencies with anyone else, and sometimes even eliminating system package management altogether.

Reason 3 ignores modern development and devops practices, and uses some of the trickiest wording I've seen outside diplomatic cables.

C++/CMake vs python. "While Python is an excellent language loved by many, we believe that transparency and familiarity are the most important factors when choosing a tool as important to your workflow as a package manager. Consequently, we chose to make the implementation languages be as universally accepted as possible: C++ should be used in a C++ package manager for C++ programmers. You should not be required to learn another language just to understand your package manager."

Arguments: 1. Transparency, familiarity of interaction with tool. 2. Implementation language as universally accepted as possible. 3. You shouldn't need to learn another language to understand your package manager.

I'm going to assume that if you've successfully edited a .ini file, you can use the declarative conanfile.txt syntax. If you're keeping logic in CMake, it's likely sufficient. Conan passes CMake all the variables you need. You'd probably switch to conanfile.py if you need different dependencies depending on the operating system (say you use SChannel on Windows, but OpenSSL on linux).
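
For reference, a minimal conanfile.txt is a sketch this small (using the same Poco reference as the example below):

[requires]
Poco/1.7.3@lasote/stable

[generators]
cmake

[options]
Poco:shared=True

And the OS-dependent case is roughly this in conanfile.py (a sketch using Conan's requirements() hook; the package reference is illustrative):

from conans import ConanFile

class MyAppConan(ConanFile):
    settings = "os", "compiler", "build_type", "arch"

    def requirements(self):
        # SChannel ships with Windows; everywhere else, pull in OpenSSL.
        if self.settings.os != "Windows":
            self.requires("OpenSSL/1.0.2j@lasote/stable")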

For the sake of argument, let's start with one of the most complex examples listed for conanfile.py:

from conans import ConanFile, CMake

class PocoTimerConan(ConanFile):
    # Pivots this package is built against
    settings = "os", "compiler", "build_type", "arch"
    # Dependencies, by package reference
    requires = "Poco/1.7.3@lasote/stable"
    # Build-system integration files to generate
    generators = "cmake", "gcc", "txt"
    default_options = "Poco:shared=True", "OpenSSL:shared=True"

    def imports(self):
        # Copy runtime libraries next to the build output
        self.copy("*.dll", dst="bin", src="bin")    # From bin to bin
        self.copy("*.dylib*", dst="bin", src="lib") # From lib to bin

    def build(self):
        # Configure and build with CMake, forwarding the settings above
        cmake = CMake(self.settings)
        self.run('cmake "%s" %s' % (self.conanfile_directory, cmake.command_line))
        self.run('cmake --build . %s' % cmake.build_config)

Is anyone capable of writing C or C++ going to have trouble understanding this?

The most magical bit is that CMake(self.settings) has .command_line and .build_config properties, and that this generator-specific logic makes more variables available to CMake than you would expect.

I can't compare this to vcpkg, because I don't see where it even offers a dependency list. I assume that some kind of script is used to invoke it for each package? Looks like you're learning another language!

Argument 2. Implementation language as universally accepted as possible.

Accepted by whom? The operating system? Python is the most ubiquitously installed scripting language on earth, as far as I know. It's usually present on even the most minimal docker images. You distribute it with Visual Studio. It's bundled with pretty much every distribution of Linux and OS X.

If you're talking about comprehension by the C++ development community, that's an unusual argument to make. Package managers aren't usually inspected; we just expect them to work. If they don't, we complain to specialists who understand the domain. It's a large domain, with many pivots and complexities.

Let's look at some examples. NuGet has been around for 5+ years, but only acquired 30 significant contributors. I didn't spot any that weren't MSFT employees. Perhaps the project isn't managed as a true open-source project, and that had an effect. Let's look at a community-developed project - Chocolatey. Wow. Just one significant contributor.

Domains like package management, codecs, image processing libraries - they require a lot of domain knowledge to contribute to. As maintainers, we want to choose the technologies that we think are most likely to encourage contribution. We're better off choosing the best tool for the task. If there's already a tool that does what we need, we use it.

Rejecting a tool because it isn't written in your favorite language seems like a Microsoft culture phenomenon. In other communities, we aggressively blend tools written in a range of languages. Take a rails app. There are usually 1-2 tools written in an ML-style language, several C and C++ components, embedded lua runtimes, some form of npm for asset management (at least), Perl (using git?), a bit of Crystal, Rust, various Java packages, and a variety of Python apps that you probably don't know are built with Python, because they haven't spewed a stacktrace yet.

We're hoping to see NuGet reach 60% feature parity with Bundler by 2019. I'm told that, before NuPack, in the early stages, Bundler was being used. But it needed to be C#, not Ruby, of all things. So we resorted to ... a single-platform, single-tool set of scripts claiming to be a package manager. There's history here, and it seems to be repeating.

Argument 3. You shouldn't need to learn another language to understand your package manager.

My response: documentation.

vcpkg should not brand itself as a package manager

There's already an open-source package manager for C and C++. It's well designed, the result of many real-world iterations and deployments, and backed by developers with an incredible amount of experience in the space.

  • It handles pivots extremely well across operating systems, C runtimes, and compiler compatibilities. The design is excellent.
  • It is build-system agnostic - able to integrate with everything - yet is incredibly simple to understand and use.
  • It can work directly with Visual Studio, through CMake, or nmake, or any tool you like, yet has minimal boilerplate.
  • You have conanfile.txt or conanfile.py to declare your package, dependencies, extra configuration/pivots, and build invocation through which the pivots are sent. Providers for the top 12 build systems make this automatic - unless you want more control. At no level of abstraction is there pain. conaninfo.txt, the file Conan creates in a target build directory, provides visibility into the compiler settings and package versions in use (a sketch follows this list).
  • Essentially cruft-free, yet handles extremely complex situations.
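
A hedged sketch of what one of those conaninfo.txt files records (section names per the Conan of this era; the values and package-id hash are illustrative):

[settings]
    arch=x86_64
    build_type=Release
    compiler=Visual Studio
    compiler.version=14
    os=Windows

[full_requires]
    Poco/1.7.3@lasote/stable:<package-id-hash>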

It is trivial to use with Visual Studio and CMake.

Ecosystem

I said I'd get back to ecosystems. vcpkg's FAQ describes itself as the champion of a package ecosystem, and specifically superior in this respect to Conan. Let's zoom back a bit and look at the ecosystem of package managers.

Adoption of package managers

C and C++ are hard languages for which to do package management. Developers are justifiably gun-shy - many build tools have claimed to tackle package management, but failed horribly on key details. Some build-tool-specific managers exist (cpm, hunter).

It's hard to encourage devs to invest in new tooling, given obvious limitations (or suspicion of hidden ones). Before Conan, the core developers created Biicode. As a company, Biicode was unable to monetize C/C++ package management, and eventually went bankrupt. They took the lessons learned from commercial C++ package management and designed Conan, while helping Biicode users transition as best they could, and funding their work with weekend consulting. The community grew, and they're now part of JFrog/Artifactory, but I've not seen that much dedication to a problem before.

The C/C++ ecosystem consensus seems to be converging towards CMake as a cross-platform build tool. This makes my life easier. But adoption of package management is a slower task, despite requiring a much smaller effort. The pain of system package managers has spawned a "copy the source in-tree" religion that is very hard to eradicate.

Maintainers of common packages are just beginning to add conanfile.py, and set up Travis and AppVeyor for cross-platform package deployment. We badly need this trend to continue if we want low-friction cross-platform development to become ubiquitous.

Microsoft has a recorded history of displacing existing or nascent open-source projects. Often the replacement was designed without a study of prior art, and lacking in technical merit.

The most effective approach to ensuring a solution succeeds based on advertising or brand, rather than merit, is to either

(a) pretend you've never heard of them, and be silent in places where users trust you to credit prior art, or
(b) use a disinformation strategy to ensure the competition's role and merit is misunderstood.

Given that a massive contingent of the software development community looks to the Microsoft brand for leadership in tool and technique selection (evangelism does its job), I would argue that a majority of Windows C# and C++ developers entering the open-source space are doing so (at least in part) because Microsoft has embraced open-source. They still favor tools Microsoft signals interest in.

Herd mentality overrides almost all else, and software developers participate in this intentionally, perhaps hoping that usage of a thing will delay its inevitable, eventual abandonment. Microsoft may no longer be the bellwether of the software industry, but it is still the bellwether for an extremely large contingent, and that is quite enough to start a herd.

There's some irony that download and Github issue count may have no bearing on software longevity - the Contributors tab is what matters - but in adoption, perception is 90%.

I like to think that there is a hangover from closed-source culture that takes a long time to shed, and that this hangover (rather than any particular intention) is responsible for the more careless events that have so strained Microsoft's relationship with the open-source community in the last decade.

Microsoft employees who have been active for a very long time in the larger open-source community seem more careful to consider the impact of their publications, and more effective at reaching their goals through collaboration and coordination. More wood behind fewer arrows.

Many people argue that 1 package manager per language is too many, and unsustainable given the domain difficulty and contributor pool. They're right. It doesn't matter, though, because software developers need package managers for rapid development, and they need language-specific features. We can (and have been) unifying the science of package management, and this helps, but there are simply not enough people for the workload. So we take what we can get. We'll even sacrifice build reproducibility for years to get that productivity (see NuGet).

I do not see any evidence that there was legitimate prior-art analysis before this project's promotion. Much of the other rationale in the documentation is troubling.

The misrepresentation of Conan (and the wider context) in the FAQ is very troubling. We have worked hard to make open source a kinder and more sincere place.

There's a nice tradition of mentioning close prior art at the top of the README, and providing a differentiation (or a link to one) above the fold. This differentiation should stick to objective and carefully researched facts, along with each project's official declaration of scope and goals. It is very considerate to provide the reader with all of the facts, and let them consider which solution has the best tradeoffs for their situation.

This sets the stage for a spirit of friendly collaboration and sincere work towards our common goal of making software developers happier (or something adjacent to it, I hope).


All 4 comments

Update: My Github account has been reinstated, and my repositories are back online. Given the apparent risk of interacting with Microsoft repositories (particularly when one's repo uptime matters for bundle install and CI), I'll be keeping my side of the conversation elsewhere.

First of all let me say that it is a shame that your account got closed over this. I'd really like to know who or what triggered that process.

Now assuming that your arguments won't convince the maintainers to just drop the project or completely change the way vcpkg works: is there a list of actionable items you'd like to see resolved (like remove incorrect statement X, add functionality Y)? You named a few candidates, but IMHO they get lost in the larger discussion.

I'd really like to know who or what triggered that process.

See https://github.com/Microsoft/vcpkg/issues/478#issuecomment-269916351

Now assuming that your arguments won't convince the maintainers to just drop the project or completely change the way vcpkg works.

I didn't advocate dropping vcpkg. I did advocate for honesty in its labeling, and I would suggest some form of on-boarding for Microsoft employees new to open-source.

My understanding of the term 'preview' is that both implementation and interface are subject to heavy change. The initial commit for this project is dated Sept 18, 2016. I would not expect that a project with 0 releases is already prioritizing backwards compatibility over architecture.

FWIW, vcpkg could present the same UI, but be a small wrapper around Conan, so that users are not locked in to a single platform and toolset. vcpkg appears to have around 5,500 lines of C++ at this point. We wouldn't be turning the Titanic.

Is there a list of actionable items you'd like to be resolved (like remove incorrect statement X, add functionality Y)? You named a few candidates, but they imho get lost in the larger discussion.

Sure. Mostly pasted from above.

The most effective approach to ensuring a solution succeeds based on advertising or brand, rather than merit, is to either

(a) pretend you've never heard of them, and be silent in places where users trust you to credit prior art, or
(b) use a disinformation strategy to ensure the competition's role and merit is misunderstood.

The most effective way to avoid criticism about this is to follow the "alternatives" pattern in README.md.

It's pretty simple:

  1. Add a "Related Projects" or "Alternatives" section to the README.
  2. Which contains links to functioning projects with overlapping goals.
  3. And differentiates your project from the alternatives you link to. This differentiation should stick to objective and carefully researched facts, along with each project's official declaration of scope and goals. There's usually some form of communication to ensure this is accurate - GitHub makes notifications on PRs easy.

This simultaneously demonstrates confidence

  • that your project is objectively superior for its specified scope
  • that you've done good prior art research
  • that you are acting in the best interests of your corner of the open-source ecosystem
  • in your users, that they are capable of determining which solution has the best tradeoffs for _their_ situation.

It is important that the project lead(s) author this section. Many projects discover a better scope or focus when taking a second pass at this research.

Hi @nathanaeljones, happy holidays and thank you for taking the time to write out this critique; we do want to encourage discussion around any design points of Vcpkg to ensure we're building the best tool possible for C++ developers.

First and most importantly, I'm deeply sorry that GitHub's spam algorithm flagged your post! GitHub has been an excellent site for many projects and if their spam detection algorithm has a chilling effect on useful discussion, we should do something about that. I have no evidence that Microsoft's repositories receive special treatment (please share some regarding your comment!), but let us know if we can help you take this further to GitHub support.

Curation

Glad to hear you find homebrew as useful as we do! As you can tell, we do believe the curation models of Homebrew, Debian, Ubuntu, Gentoo, BSD, MacPorts, Pacman, and others have been profoundly shaped by the way native code is written and distributed. We believe their success is in part due to their strong curation models which enable a standard interoperability model between different developers to benefit users. All of these have grown various mechanisms to support the community adding additional packages outside the curated region and we intend to follow a similar path -- a standard, reliable, curated core with the ability for users to override anything they want, including replacing the core packages.

Today, for example, anyone can supply a CONTROL+portfile from their open source project that users can simply drop into their local vcpkg and we will treat it the exact same as any official package. In a more regulated scenario such as inside a company, they can easily fork the Vcpkg repository and change whichever ports they'd like (most commonly: freezing the versions). It can even be merged directly into a project's main source tree (though we'd recommend a submodule), giving them something extremely similar to the "check in third party sources" approach if that's what they determine is most appropriate in their scenario. We do not rely on the Microsoft/vcpkg repository anywhere in the tool, save in the various help documentation.
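
For reference, such a drop-in port pairs a portfile.cmake (the build steps) with a CONTROL file about this small (a sketch; the library itself is hypothetical):

Source: mylib
Version: 1.0.0
Description: An example library, dropped into ports\mylib
Build-Depends: zlib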

Improving this experience is still a very important area for us. We definitely see the use of something like homebrew taps and we want to implement something similar which doesn't cause dependency graph fragmentation.

Per-Library vs Application-Wide Versioning

I believe we've had a miscommunication about what we mean by library: in the bullet point we meant _user-authored libraries_. Especially if the user wants to avoid excessive lock-in (and therefore doesn't want to be forced to republish all their outputs as packages within the system), it becomes difficult to synchronize separate builds and to ensure they use the exact same dependency versions. At the end of the day, we all want one version of each dependency to go into the OS process. Sometimes games can be played depending on the specific packaging and deployment choices to enable multiple deployment of the same library, however our experience (with large scale software development) is that this is a very bad idea. Therefore, there needs to be a single definition of which version will be used in that final process. The obvious entity to choose this is the final application (or suite of applications if they share implementation libraries). We support this model by placing the ports\ directory under the moral control of the application developer, which is the single unambiguous source of truth for all external source code that goes into your product.

As a side note, we observed that in most of the above package managers, inter-package versioning is most commonly used for binary packages, not source packages [1]. Source packages are generally referred to simply by name, perhaps with a "dependency must be greater than this _very_ old version". In this respect, we do model and respect a very strict versioning scheme between our internal binary packages.

Glad to hear you like the idea of platform versions. We generally think these are an excellent example of falling into the pit of success, so for Vcpkg they are opt-out -- you automatically get the latest libraries that work together, but you can easily switch to any of the older sets (git checkout an older commit) or overwrite any particular port directory to change that specific package.

User-wide

I'm happy to note that we definitely support more than one set! However, as you've clearly pointed out, our documentation in this area is insufficient.

We believe that for an enthusiast or hobbyist developer, there's a serious productivity win to be gained from having a blessed set of libraries that you get "by default" for your projects, but as you scale up, the need to control your specific dependencies (transitively) requires the ability to tie the above platform version set to a particular commit of your application code.

To handle this, we support as many Vcpkg instances on the same machine as you'd like! With the CMake integration, you always specify which particular instance you're using by how the toolchain is specified. With MSBuild, you need to set a single property across your solution to point at the particular Vcpkg instance you want to use.
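
Concretely, a sketch of two build trees selecting different instances (the paths are hypothetical; scripts/buildsystems/vcpkg.cmake is the toolchain file Vcpkg ships):

# Project A builds against one instance...
cmake .. -DCMAKE_TOOLCHAIN_FILE=C:/dev/vcpkg-stable/scripts/buildsystems/vcpkg.cmake
# ...while Project B pins a different one, with a different set of library versions.
cmake .. -DCMAKE_TOOLCHAIN_FILE=C:/dev/vcpkg-experiments/scripts/buildsystems/vcpkg.cmake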

Shared dependencies between applications

You're absolutely right that the current trend is to avoid system libraries for binary deployment due to the DLL hell it creates. I'm happy to say that this is _exactly_ what Vcpkg is designed to do -- to help you build fully contained applications that are extremely easy to deploy and do not risk DLL hell or system updates breaking your application.

Vcpkg is designed to look like a system package manager for your build environment (there is a single definition of where and what "boost" is), but at the end of the day you get a containerized application out with local copies of every runtime dependency you used. From there, distribution is as simple as zipping the .exe and .dlls in your output folder and copying them to any machine you like (no Vcpkg required on the customer machine!) [2].

Note that on Windows (for various reasons) the standard way to avoid DLL hell has been to install them inside your application folder, alongside the executable (unless the vendor has a distributable MSI). This is what happens with Vcpkg as well: we copy the exact set of dependency DLLs alongside the output executable, so updating Vcpkg will not break any of your installed programs.

Misrepresentation of Conan

We certainly have no intention of spreading misinformation about Conan.io! I was fortunate enough to have the opportunity to talk to Diego in person at CppCon, where we were able to discuss the differences between Vcpkg and Conan (goals, design choices, etc). We also have extended an open offer to correct any issues in our compare-contrast FAQ section, which has yet to be taken up. Because of this, I currently believe we are not misrepresenting Conan and the differences between the projects arise from focusing on different parts of the problem space (a large one, as you've mentioned!).

The fine people over at Conan have even gone so far as to write a bridge between Vcpkg and Conan [3], which is great to see! We have some reservations about the specific details [4], but the overall direction is a great one that we'd like to continue in the spirit of friendly collaboration.

Conclusion

Thanks again for taking the (immense) effort to write out your critique above. You've really helped to highlight areas of documentation that need improvement and some technical points we should keep an eye on as well.

Further, I'm sincerely sorry for the trouble that GitHub's filters caused you -- I can imagine how brutal it would be to lose access!

[1] https://github.com/Homebrew/homebrew-core/blob/master/Formula/cpprestsdk.rb#L16
[2] You may need to also install the VC++ 2015 Redistributable, same as always
[3] http://blog.conan.io/2016/10/17/Using-Vcpkg-ports-as-Conan-packages.html
[4] https://github.com/conan-io/conan-io.github.io/issues/17
