"Upper bounds are information about what you know your package works with."
Not really; the upper bounds generally are speculative. They assert that the package will not work with particular versions, even though those versions typically have not yet been released. Often, when those versions are released, they work fine, which prompts a useless flurry of dependency bumping.
"If I and all my dependencies had followed the PVP and specified correct upper bounds everywhere, I would still be able to build my app today."
I now distribute my packages with the output of the ghc-pkg command so you can see the exact versions the library builds with. That solves your problem without using speculative upper bounds.
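For illustration, one real invocation that captures this kind of information is `ghc-pkg list --simple-output`; the exact command and the version numbers shown below are an assumption, not necessarily what the author ships:

```
$ ghc-pkg list --simple-output > PKG_VERSIONS
$ cat PKG_VERSIONS
base-4.5.1.0 bytestring-0.9.2.1 containers-0.4.2.1 text-0.11.2.3
```

A consumer can then compare that manifest against their own package database instead of relying on bounds in the .cabal file.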
I wouldn't expect it to work for "years to come" in any event, though: Haskell changes too much. You probably won't be able to get today's package to work with the compiler and base package that will exist years hence. It's hard to even get an old GHC to build: today's GHC won't build it because of... dependency problems.
> Not really; the upper bounds generally are speculative.
Yes, an upper bound is still useful information: it says that a package is known to work with a certain range of dependency versions. This is why I think cabal should support something like < for speculative upper bounds and <! for known-failure upper bounds. Then we could have a flag for optionally ignoring speculative upper bounds when desired.
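As a sketch of what that distinction could look like in a .cabal file (the <! operator is hypothetical; cabal does not support it today):

```
-- Hypothetical syntax: '<' marks a speculative bound,
-- '<!' marks a version known to fail.
build-depends:
    base >= 4.5  && <  4.7,   -- speculative: 4.7 not yet tested
    text >= 0.11 && <! 0.12   -- known failure: breaks against 0.12
```

With that split, a solver flag could relax the speculative `<` bounds while still honoring the `<!` ones.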
> I wouldn't expect it to work for "years to come" in any event, though
I can still download a binary of GHC 6.12.3. Hackage also keeps the entire package history, so there's no reason things shouldn't build for years to come. I'm all for the fast-moving nature of the Haskell ecosystem, but we also need things to be able to build over the long term, and there's no reason we shouldn't be able to do that.
> Which won't even work on current versions of Debian GNU/Linux because the soname for libgmp changed.
Again, my argument assumes that you're not upgrading to current versions of things. Enterprise production-grade deployments often fix their OS version because of this. There's a reason RHEL is so far behind the most recent versions. If you are upgrading, then you'll be fixing these problems incrementally as you go along and it won't get out of hand.
> No it doesn't. I had specified a dependency on a particular version of text
You're making my point for me. It looks like you were probably depending on 0.11.1.4, which doesn't exist now. Your problem would not have happened if you had used an upper bound of "< 0.12" like the PVP says instead of locking down to one specific set of versions. If you had done that, cabal would have picked up subsequent 0.11 point releases which should have worked fine. Also, I can still download text-0.1 which is more than 5 years old.
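For example, under the PVP the dependency would be written as a range rather than an exact pin (this is standard cabal syntax; the version numbers are the ones discussed above):

```
-- Exact pin: breaks as soon as 0.11.1.4 disappears from Hackage.
-- build-depends: text == 0.11.1.4

-- PVP-style bound: any subsequent 0.11.x point release satisfies it.
build-depends: text >= 0.11.1.4 && < 0.12
```

The range lets cabal substitute a compatible point release when the pinned version vanishes.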