snl-atdm-issue
One major portability problem with Spack is that it tries to build every low-level library from scratch, and those builds often fail. Examples include #13189 and #12143. In my limited experience with Spack (less than 100 hours of work), this appears to be its biggest portability problem.
In my 20+ years of building CSE and HPC software, I had never even heard of many of these low-level libraries, because they are ubiquitous, pre-installed on almost every system I have ever worked with, and found automatically by default by the build systems I have used.
The current approach to addressing this is to have people generate their own `packages.yaml` file from scratch that points to pre-installed libraries on the system. But to do that, you have to manually concretize the packages you want to install, recognize which of their dependencies are standard ubiquitous libraries, and then write the entries for those packages yourself. This is a very tedious process.
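For reference, that manual workaround looks something like the sketch below: a hand-written `packages.yaml` that registers pre-installed system libraries as Spack externals. The package names, versions, and prefixes here are placeholders; on a real system you have to discover each one yourself (e.g. by querying the system package manager or inspecting `/usr`):

```yaml
# Hypothetical packages.yaml entries registering system libraries as externals.
# Versions and prefixes are illustrative; they must match what is actually installed.
packages:
  zlib:
    externals:
    - spec: zlib@1.2.11
      prefix: /usr
    buildable: false      # never build this package from source
  openssl:
    externals:
    - spec: openssl@1.1.1
      prefix: /usr
    buildable: false
```

Setting `buildable: false` forces the concretizer to use the external rather than silently falling back to a source build.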
Every portable build system I have ever seen for a piece of software just finds libraries like these by default. To address this problem, I propose that a new standard variant, called something like `find-on-system`, be added to all of these ubiquitous low-level Spack packages. When that variant is set, the `package.py` file will try to find the library installed on the system and will fail if it can't find it. I could then just set the global variant `find-on-system` in my `spack.yaml` file and give it a go. If some of these packages could not be found on the system, I could add `~find-on-system` for those packages and let Spack try to build them, or I could use `yum` to install them. I suspect this would significantly improve the portability of Spack builds. Otherwise, Spack is throwing away all of the hard work that goes into creating distributions.
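If such a variant existed, the usage I have in mind would look roughly like this `spack.yaml` sketch. Note that `find-on-system` is the variant proposed above, not an existing Spack feature; the `packages: all: variants:` mechanism for setting a variant globally does exist:

```yaml
# Hypothetical: 'find-on-system' is the proposed variant, not a real Spack variant.
spack:
  specs:
  - trilinos                       # example root spec; substitute your own
  packages:
    all:
      variants: +find-on-system    # ask every package to locate itself on the system
    zlib:
      variants: ~find-on-system    # opt one package back out, letting Spack build it
```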
@bartlettroscoe: something to solve this is on the agenda for the coming FY, but it is not the solution you suggest. Specifically:
> I propose that a new standard variant called something like `find-on-system` be added to all of these ubiquitous low-level Spack packages. When that variant is set, then the `package.py` file will try to find it installed on the system and will fail if it can't find it.
We already have the externals mechanism for this. See this milestone: https://jira.exascaleproject.org/browse/STNS02-14, where we would be extending the existing compiler detection to find things like build dependencies and common libraries and add them as externals. We might add detection logic to `package.py` files, but we wouldn't add a variant in this case, because Spack can already represent this stuff in its metadata model. The idea is that you might run `spack external find` the same way you currently run `spack compiler find`, and the various external packages would be automatically added to the database.
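By analogy with compiler detection, such a command might append detected externals to the user's `packages.yaml` automatically. A sketch of what the recorded result could look like; the command and its output format were not yet implemented when this was written, so the specs and prefixes below are purely illustrative:

```yaml
# Illustrative entries that a hypothetical 'spack external find' might record;
# actual specs and prefixes would depend on what the detection logic finds.
packages:
  cmake:
    externals:
    - spec: cmake@3.15.4
      prefix: /usr
  m4:
    externals:
    - spec: m4@1.4.18
      prefix: /usr
```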
> We already have the externals mechanism for this. See this milestone: https://jira.exascaleproject.org/browse/STNS02-14
@tgamblin, that says:
> We will start by implementing support for auto-detection of build dependencies for packages (as we do for compilers) and then evaluate what it would take to implement external link dependency detection.
That sounds like Spack may not actually auto-find standard library dependencies in that deliverable (not due until 4/30/2020).
For each system we need this to work on, I will have to manually create a `preinstalledPackages.yaml` file to get the job done in the meantime.
@bartlettroscoe: we may be able to do this sooner rather than later -- but I'll have to see how current tasks go.