> The point of a compiler is not to show off that whoever implemented it knows more loopholes in the C standard than the user, but to help the programmer write a program that does what the programmer wants.
The author makes it sound like the people working on optimizing compilers are deliberately seeking out these weird corner cases and selecting some random surprising behavior for them out of a hat, gleefully imagining how confusing it will be for end users. That's not how it works. Optimizers can be extraordinarily complex and need to maximize this ill-defined thing called "performance" in a highly multi-dimensional solution space. They ping-pong around inside this space constrained only by the specific requirements of the standard, and it's not surprising that some of the techniques used would produce some counter-intuitive results if the programmer is breaking the rules and relying on undefined behavior. It's kind of like if you trained a neural network to classify cat and dog pictures, and then you showed it a picture of a fire truck and expected it to give you a useful result.
The idea of a new version of the C standard that defines some of the most surprising undefined behavior is an interesting one though, and I'd be interested to see how much that really impacts the ability of the optimizer to do its work.
I'd love it if the C standard just removed undefined behavior: replace each explicit instance of "the behavior is undefined" with "the behavior is implementation-defined", and add a blanket clause that any behavior not specified by the standard is implementation-defined. Keep the rest the same; just document the footguns. Implementation-defined is exactly as powerful as undefined, it just makes the compiler writer describe what will happen.