(a && b) || (!a && !b)
(a && b) || !(a || b)
!(a ^^ b)
I personally prefer option 3 as it is the most succinct. Few people seem to know about exclusive-or, though, and so I was concerned I’d make the code less readable. So I wrote a straw poll email to my colleagues. I added a fourth option, a == b, but pointed out that since such a comparison was a complete programming no-no, it really was there just for completeness.
I was caught out by the results. Everyone responded with “why would you use any method other than a == b?” The answer to that was obvious to me, and it really surprised me that others didn’t know about this taboo. It stems from me spending a number of years writing FORTRAN and C code that had to talk to each other. The FORTRAN compiler used a byte for a boolean, with false being 0x00 and true being 0xFF. C (prior to C99) had no built-in boolean data type, so people invented their own using various numeric data types. Values vary: sometimes 0 might be used as true; sometimes 1; sometimes the value is defined as !0, which will have different values for different sized numeric data types. Because of these differences in values, a == b is a recipe for disaster when using C. It is vitally important that one therefore uses one of the three logical expressions I listed above when comparing booleans. I had this drummed into me twenty years ago and I had never given it another thought.
Languages have moved on since then, though. These days, the Boolean type is built into many languages, and so its value is predictable in those languages. It is therefore safe to compare boolean values directly. This is why everyone else was so caught out by my question. It is safe to do a == b, and it is the easiest to read. So why would anyone do anything else? Of course, life isn’t that easy. I’d stuck with the idea of never doing this for twenty years and had never questioned it. I had succumbed to that most frustrating of human traits: dogmatism. At first I rejected what everyone else was telling me. They had to be wrong. It was bad practice! Except it no longer was bad practice. It took a huge effort of will (and no small amount of desperately trying to come up with counter-arguments) for me to accept this, though.
This whole episode has highlighted the dangers of allowing good practice to become dogmatic practice. Good practices are put in place for a reason, but the basis of that reason might change over time. So it is important to question good practices. If they stand up to the questioning, then they are clearly still good practices worth keeping. If they don’t, then get rid of them before they become dogma.