I know it's irrational, but I really dislike Yoda notation. Every time I encounter one while reading code I have to take a small pause to understand it, and I don't know why; my brain just doesn't like them. I don't think I'm the only one either, as I've seen a few coding styles in the wild that explicitly disallow them.
Furthermore, any decent modern compiler will warn about an assignment inside a condition and ask you to add an extra set of parens around it, so I don't really think Yoda notation is worth it anymore. And of course Yoda notation won't save you if you're comparing two variables (while the compiler warning will).
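To illustrate (gcc reports this as -Wparentheses under -Wall, and clang warns by default; the names here are just placeholders):

    int main(void) {
        int a = 0, b = 1;

        if (a = b) {       /* warning: suggest parentheses around assignment
                              used as truth value [-Wparentheses]             */
        }

        if ((a = b)) {     /* the extra parens tell the compiler the
                              assignment is intentional; no warning           */
        }

        return 0;
    }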
I don't think "Yoda notation" is good advice. How do you prevent mistakes like the following with Yoda notation?
if ( level = DEBUGLEVEL )
When both sides of the equals sign are variables, the assignment compiles and succeeds whichever order you write them in, so following Yoda notation provides a false sense of security in this case.
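A sketch of that failure mode (assuming DEBUGLEVEL is, despite its name, an ordinary variable):

    void check_level(void) {
        int level = 0;
        int DEBUGLEVEL = 3;    /* named like a constant, but just a variable */

        /* Yoda order doesn't help: both sides are lvalues, so this still
           compiles; it assigns level into DEBUGLEVEL instead of comparing. */
        if (DEBUGLEVEL = level) {
            /* never reached, since the assigned value is 0 */
        }
    }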
As an experienced programmer I have written if-statements so many times in my life that I never ever, even by mistake, type:
if (a = b)
I always type:
if (a == b)
by muscle memory. It has become second nature. Except, of course, where I really mean it, like:
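(for instance, the classic getchar loop:)

    #include <stdio.h>

    int main(void) {
        int c;
        /* intentional assignment: parenthesized and explicitly compared */
        while ((c = getchar()) != EOF)
            putchar(c);
        return 0;
    }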
One way to not write any bugs is to not write any code.
If you must write code, errors follow, and "defence in depth" applies: use an editor that serves you well, use compiler flags, use your linter, and consider Yoda notation, which catches some classes of errors, though yes, not every error.
These kinds of issues are excellent commercials for why the strictness of a language like F# (or OCaml, Haskell, etc.) is such a powerful tool for correctness:
1) Outside of initialization, assignment is done with the `<-` operator, so you can only be confused in the opposite direction (an intended assignment being parsed as a boolean comparison).
2) Return types and inputs are strictly checked, so an accidental assignment (which returns `void`/`unit`) would not compile where a bool is expected.
3) Immutability is the default, so even if #1 and #2 were somehow circumvented, the compiler would still halt and complain that you were trying to write to something unwritable.
Any of the above stops compilation outright, long before the mistake could hit code review or make it into production... Correctness from the ground up prevents whole categories of bugs, at the cost of enforcing some coding discipline :)
Note these are only constant pointers: your data is still mutable if the underlying data structure is mutable (e.g. a HashMap). I haven't used Java in a few years, but I made religious use of final, even on locals and params.
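It's roughly the same distinction as a const pointer in C: the pointer is frozen, the data behind it isn't.

    void example(void) {
        int data[3] = {1, 2, 3};
        int *const p = data;   /* like a final reference: p can't be reseated */

        p[0] = 42;             /* but the pointed-to data is still mutable    */
        /* p = data + 1;          this, by contrast, would not compile        */
    }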
I don't really see how unless you've never actually read imperative code before; either way you need to read both sides of the comparison to gauge what is being compared. I'm dyslexic and don't write my comparisons that way and still found it easy enough to read those examples at a glance.
But ultimately, even if you do find it harder to parse (for whatever reason(s)), that would only be a training thing. After a few days / weeks of writing your comparisons like that I'm sure you'll find it more jarring to read it the other way around. Like all arguments regarding coding styles, what makes the most difference is simply what you're used to reading and writing rather than the actual code layout. (I say this as someone who's programmed in well over a dozen different languages over something like 30 years - you just get used to reading different coding styles after a few weeks of using them.)
Often when I glance over code to understand what it is doing I don't really care about values. When scanning from left to right it is easier when the left side contains the variable names.
Also I just find it unnatural if I read it out loud. It is called Yoda for a reason.
But again, none of the problems you've described are unteachable. Source code itself doesn't read the way one would structure a paragraph for human consumption, but we programmers learn to parse it because we read and write it frequently enough, just like one might learn a human language by living and speaking in a country that speaks it.
If you've ever spent more than 5 minutes listening to arguments and counterarguments regarding Python whitespace vs C-style braces - or whether the C-style brace should follow your statement or sit on its own line - then you'd quickly see that all these arguments about coding styles are really just personal preference based on what that particular developer is most used to (or pure aesthetics about what looks prettiest, but that's just a different angle of the same debate). Ultimately you were trained to read
if (variable == value)
and thus equally you can train yourself to read
if (value == variable)
All the reasons in the world you can't or shouldn't are just excuses to avoid retraining yourself. That's not to say I think everyone should write Yoda-style code - that's purely a matter of personal preference. But my point is arguing your preference as some tangible issue about legibility is dishonest to yourself and every other programmer.
There are always assumptions being made, no matter what you do. But "uppercase -> constant" is such a generic, cross-language convention that it should always be followed. This code should never have passed code review for this glitch alone.
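And the convention is what gives the Yoda form its teeth in the first place: with a real constant on the left, the typo can't even compile. A quick sketch:

    #define DEBUGLEVEL 3           /* a real constant, per the convention */

    void check(int level) {
        if (DEBUGLEVEL == level) { /* fine */
        }
        /* if (DEBUGLEVEL = level)    would not compile: gcc says something
                                      like "lvalue required as left operand
                                      of assignment"                         */
    }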