How is the GPS receiver a consequence of GR (honest question)? I know the satellites' internal clocks have to compensate for time dilation, but to be honest I don't see why that couldn't have been 'fixed' with an empirical correction devoid of theoretical basis. After the first launch, of course.
I think the empirical solution would not be as straightforward as you suggest. Lacking any clear understanding of the science, any number of theories might crop up to explain the perceived anomaly, hindering efforts to find a workable solution. Some of those emergent theories might even lead to declaring outright that the concept is flawed and unsolvable.
On the other hand, having a clear conceptual framework allows these people to pinpoint what areas might be causing problems. It lets effort be focused and justifies particular fixes, especially in a costly scenario such as this one, where there might not have been a second launch after the failure of the first.
However, my limited knowledge of the problem keeps me from judging whether a simple 24-hour re-sync would address the underlying problem.
From what I do understand, the accuracy/precision of the GPS system would suffer. I'm also inclined to believe that even with a regular re-sync the overall usefulness of the system would be reduced, since the re-sync itself wouldn't be exempt from the underlying problem.
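To put a rough number on how fast the clocks would drift between re-syncs, here's a back-of-envelope sketch using standard textbook values (Earth's gravitational parameter, the ~26,561 km GPS orbital radius); the two relativistic terms are the commonly quoted ones, and the specific constants are my own fill-ins, not from this thread:

```python
import math

# Standard constants (textbook / WGS-84-style values)
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
C = 2.99792458e8         # speed of light, m/s
R_EARTH = 6.371e6        # mean Earth radius, m
R_SAT = 2.6561e7         # GPS orbital radius (~26,561 km), m
DAY = 86400.0            # seconds per day

# Special relativity: the satellite's orbital speed makes its clock run slow.
v = math.sqrt(GM / R_SAT)            # circular-orbit speed, ~3.9 km/s
sr_rate = -v**2 / (2 * C**2)         # fractional rate offset (negative)

# General relativity: weaker gravity at altitude makes its clock run fast.
gr_rate = (GM / C**2) * (1 / R_EARTH - 1 / R_SAT)

net_drift_per_day = (sr_rate + gr_rate) * DAY    # clock error per day, s
range_error_per_day = net_drift_per_day * C      # equivalent range error, m

print(f"SR contribution: {sr_rate * DAY * 1e6:+.1f} us/day")
print(f"GR contribution: {gr_rate * DAY * 1e6:+.1f} us/day")
print(f"Net drift:       {net_drift_per_day * 1e6:+.1f} us/day")
print(f"Range error:     {range_error_per_day / 1000:.1f} km/day if uncorrected")
```

The net comes out to roughly +38 microseconds per day, which at the speed of light corresponds to a position error growing by about 11 km per day. So a once-a-day re-sync would leave errors far larger than GPS's design accuracy; the correction has to be built into the clock rate itself.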
What I do know is that given our better conceptual framework we're able to leverage very precise location information; this leads to a more useful and productive GPS system than we would have otherwise.
Apologies - it's actually SR that's key to the whole idea (though GR corrections are also very important to the accuracy). Without the observed speed of light being independent of the relative motion of the user and the satellites, the entire concept is pretty unworkable.