Sorry, this is a weird one :slightly_smiling_face:
Let's say you have 3 packages: a, b, and c. Both a and b depend on c, and a and b live in separate Stack projects. Each project lists c in its packages, which makes c a shared package between a and b.
If you build a, it will of course build c. Then make a change to c and build b, which re-builds c. Finally, re-build a. I would expect that to re-build (or at least re-link against) the new c, but instead it does nothing. This means that b has the new c while a still has the old c.
Here's exactly how to reproduce:
mkdir c
echo '{ name: c, library: { dependencies: base, exposed-modules: C } }' > c/package.yaml
echo 'module C where c = 1' > c/C.hs
mkdir a
echo '{ resolver: lts-8.11, packages: [., ../c] }' > a/stack.yaml
echo '{ name: a, executables: { a: { dependencies: [base, c], main: a.hs } } }' > a/package.yaml
echo 'import C; main = print c' > a/a.hs
mkdir b
echo '{ resolver: lts-8.11, packages: [., ../c] }' > b/stack.yaml
echo '{ name: b, executables: { b: { dependencies: [base, c], main: b.hs } } }' > b/package.yaml
echo 'import C; main = print c' > b/b.hs
cd a
stack build
stack exec a
# => 1
cd ..
echo 'module C where c = 2' > c/C.hs
cd b
stack build
stack exec b
# => 2
cd ..
cd a
stack build
stack exec a
# => 1
# Should be `2`!
Thanks for the report! This is a similar issue to https://github.com/commercialhaskell/stack/issues/2904. The solution there was to unregister downstream packages when rebuilding c. However, that doesn't help in this case, where c gets rebuilt from b's stack.yaml and a's project is never touched.
The only straightforward solution that comes to mind is putting the path to the project's root directory (the location of stack.yaml) into the ConfigCache. This would force a rebuild whenever the location of the project configuration changes. Does that seem like reasonable behavior?
I think it's safe to use the project dir: for configurations that share the same root directory, if they also share the same dist path, then they will also share the same package DB, and c would get unregistered. If they don't share the same dist path, then this issue won't occur.
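To make the proposal concrete, here is a toy model in plain shell (illustrative only; the cache file, key scheme, and paths are made up and have nothing to do with Stack's real ConfigCache): mixing the project root into the per-package cache key means a build started from a different stack.yaml sees a cache miss and rebuilds the shared package.

```shell
#!/bin/sh
# Toy model: a fake per-package "config cache" whose key combines the
# package source with the project root, so switching stack.yaml locations
# invalidates the cache and forces a rebuild of the shared package.
set -e
workdir=$(mktemp -d)
cache="$workdir/config-cache"

# needs_rebuild SOURCE PROJECT_ROOT: prints "rebuild" or "up to date"
needs_rebuild() {
  key=$(printf '%s\n%s\n' "$1" "$2" | cksum | cut -d' ' -f1)
  if [ -e "$cache" ] && [ "$(cat "$cache")" = "$key" ]; then
    echo "up to date"
  else
    echo "rebuild"
    printf '%s' "$key" > "$cache"
  fi
}

needs_rebuild 'c = 2' /path/to/a   # first build from a: rebuild
needs_rebuild 'c = 2' /path/to/a   # same source, same project: up to date
needs_rebuild 'c = 2' /path/to/b   # same source, other project: rebuild
```

Without the project root in the key, the third call would report "up to date", which is exactly the bug in the original report.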
> The only straightforward solution that comes to mind is putting the path to the project's root directory (the location of stack.yaml) into the ConfigCache. This would force a rebuild whenever the location of the project configuration changes. Does that seem like reasonable behavior?
I don't know enough about Stack's internals, but that sounds good to me.
This issue has been a development pain point for a while, and today I experienced a production issue because one of the downstream dependencies wasn't built properly due to not picking up a shared change.
Is there any way I can help accelerate this? Thanks!
It would be nice to track this bug down in the codebase and see what causes it.
I think @mgsloan described above what needs to happen. If a Stack developer gave me an outline of what to do, I could take a stab at it?
Yes, he did, I see it now, but that was a year ago and the code has changed (probably not by much).
AFAIK the code has not changed much in this respect. However, the solution I described earlier seems like a hack: what is so special about a change in stack.yaml location?
Instead, I think there needs to be a mechanism to record which package DBs a local package has been installed to. The issue here might be that the build of the package is correct and matches what both configurations want, but it's only installed to one DB; the other still has the older install.
Sorry for the vagueness, but it has been a while since I've looked at this, and generally my in-brain cache of info about stack is not quite as populated as it once was. I hope someone with a more up-to-date cache can comment and provide guidance.
Speaking as someone totally unfamiliar with Stack internals: could the downstream dependencies simply check the hash of the package they have installed against the latest-built hash of the local package? (Assuming that Stack uses hashes to keep track of build products...)
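Both suggestions point in the same direction. A toy sketch in plain shell (hypothetical file layout, nothing to do with Stack's actual package DB format): record, per package DB, the hash of the source that was last registered there, and re-register whenever the recorded hash is stale.

```shell
#!/bin/sh
# Toy model: two directories stand in for each project's local package DB.
# Each DB records the source hash that was last "installed" into it, so a
# stale install in one DB is detected even after a build from the other.
set -e
root=$(mktemp -d)
echo 'c = 1' > "$root/C.hs"
mkdir -p "$root/a-db" "$root/b-db"

# sync_db NAME: (re)register the shared package into NAME-db iff its hash is stale
sync_db() {
  db="$root/$1-db"
  h=$(cksum < "$root/C.hs" | cut -d' ' -f1)
  if [ "$(cat "$db/c.hash" 2>/dev/null || true)" = "$h" ]; then
    echo "$1: c is up to date"
  else
    echo "$1: registering new c"
    printf '%s' "$h" > "$db/c.hash"
  fi
}

sync_db a                       # a: registering new c
echo 'c = 2' > "$root/C.hs"     # change the shared package
sync_db b                       # b: registering new c
sync_db a                       # a: registering new c (stale install caught)
```

The last call is the case from the original report: c was rebuilt from b's project, but a's DB still holds the old hash, so a per-DB record would catch it.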
@mgsloan thanks for your help. One more question, do you know who would be best to ping about this? :)
@thomasjm No problem! Seems like @mihaimaruseac and @snoyberg are good folks to ping about this. Indeed, I agree that this is a pretty gnarly correctness flaw and I hope it gets resolved cleanly.
I'll need to learn more about that part of the build plan code but will try to help if there are questions.
I'm reviewing issues now, but I'd be happy to advise on this @thomasjm if you're still interested.
I would like to see this fixed. If you can point me in the right direction I'd be happy to take a look @snoyberg ?
I'd start by adding the appropriate field for project root to the ConfigCache datatype (in Stack.Types.Build) and following the compiler errors. I just opened a similar PR for adding the PATH env var: #4740
I have not been bitten by this bug recently, but I was combing through old issues and figured I'd see if this was still an issue with Stack 2.1.3. Unfortunately it still is :cry:
This issue seems to have changed slightly in 2.1.3. Instead of silently getting a stale package, I'm now seeing builds fail completely with `ghc-pkg: cannot find package [the shared package]`. I still haven't found time to dig into this unfortunately.