Why is it important to keep some particular quirk of floating-point conversion in JS? JS breaks stuff all the time. Nobody would notice if a floating-point conversion changed its lowest bit.
JS libraries break all the time. APIs and experimental language features break sometimes.
But the low-level core fundamentals of the language, such as the way math works, have never changed in its entire existence. Even the fairly minor changes in strict mode require explicitly opting in.