Reason for precedence of instanceof/is

In both C# and Java, the operator precedence of is and instanceof, respectively, leads to some ugly but necessary parentheses. For example, instead of writing if (!bar instanceof Foo) you have to write if (!(bar instanceof Foo)).

So why did the language teams decide that ! has a higher operator precedence than is/instanceof? Admittedly, in C# you can overload operator!, which could lead to a different result in some situations, but those situations seem exceedingly rare (and unintuitive in any case), while checking whether something is not of a given type or subtype is far more common.


Conspiracy theory:

The C# designers don't want you to use the is operator. Use of this operator is often a smell of bad OOP design. If you find yourself using it often, it probably means your class hierarchy is wrong and you need to rely more heavily on virtual methods and design patterns. The Java designers went even further: they named the operator instanceof to make you cringe every time you use it.

This actually isn't unlikely. There are many cases where language and library designers deliberately make some features inconvenient to use. Some examples: character encodings in .NET (you should always use Unicode), goto in Pascal (you should avoid it), etc. Sometimes it is caused by bad design (like WPF in .NET), but sometimes it's intentional.

In Java, instanceof is one of the relational operators and has the same precedence as the other ones:

    RelationalExpression < ShiftExpression
    RelationalExpression > ShiftExpression
    RelationalExpression <= ShiftExpression
    RelationalExpression >= ShiftExpression
    RelationalExpression instanceof ReferenceType

From that perspective it makes sense that those two lines should follow the same structure:

    if (!(a instanceof b))
    if (!(a < b))
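To make the parallel concrete, here is a minimal Java sketch (the class and variable names are invented for illustration) showing that both negations require the inner parentheses, because unary ! binds more tightly than instanceof and the relational operators:

```java
public class PrecedenceDemo {
    public static void main(String[] args) {
        Object a = "hello";
        int x = 1, y = 2;

        // Both forms need the inner parentheses:
        boolean notAnInteger = !(a instanceof Integer); // true: a holds a String
        boolean notLess      = !(x < y);                // false: x < y holds
        System.out.println(notAnInteger); // true
        System.out.println(notLess);      // false

        // "!a instanceof Integer" would not compile: it parses as
        // "(!a) instanceof Integer", and ! cannot be applied to an Object.
    }
}
```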

Here are my thoughts on the matter with no authoritative source.

instanceof is a very long operator. Most operators are at most two characters. Additionally, instanceof must have whitespace between it and its operands. Because of these two unique properties, when you look at an expression like !bar instanceof Foo, the instanceof seems to naturally separate !bar from Foo, and many people would find it surprising if !bar were not a sub-expression.

Similar lines of thought can also be applied to is, with the additional argument of just following what Java already did.

I think it’s just historical. If I remember correctly, in the very first versions of Java you could not even write if (a instanceof Foo || a instanceof Bar) without parentheses; I think that changed around Java 2. I don’t know why they didn’t give it a higher precedence (e.g. higher than logical not). Maybe because that would have interfered with the typecast operator and thus broken compatibility?

C# then seems to have simply adopted the same precedence as Java.

I still think it is also a mistake that the bitwise and/or operators bind more loosely than the comparison operators. Having to write things like if ((x & (FLAG1 | FLAG2)) != 0) … is annoying.
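The annoyance above can be shown in a short Java sketch (the flag constants are hypothetical, invented for illustration):

```java
public class FlagCheck {
    // Hypothetical flag constants, just for illustration.
    static final int FLAG1 = 0x01;
    static final int FLAG2 = 0x02;

    public static void main(String[] args) {
        int x = FLAG2;

        // The parentheses around (x & (FLAG1 | FLAG2)) are mandatory:
        // != binds more tightly than &, so without them the expression
        // would parse as x & ((FLAG1 | FLAG2) != 0), which is a type
        // error in Java (you cannot & an int with a boolean).
        if ((x & (FLAG1 | FLAG2)) != 0) {
            System.out.println("at least one flag is set");
        }
    }
}
```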

By writing if (!bar instanceof Foo), the ! is applied to bar first and only then is instanceof checked, because unary ! binds more tightly than instanceof. In other words, it parses as ((!bar) instanceof Foo).

With if (!(bar instanceof Foo)), the instanceof check happens first, and then the whole result is negated.

If you really wanted to negate bar and then check its type, you would have to write ((!bar) instanceof Foo) explicitly (though in Java this doesn't even compile, since !bar yields a primitive boolean and instanceof needs a reference type).

  1. instanceof is a very long word compared to basic operators like + or ++. When you read a condition, you lose focus; at least I do.
  2. It's surrounded by spaces, which can improve readability, but on the other hand you can't visually join it to its operands the way you can with 5+6.

I believe the designers decided: OK, give it a lower priority, so everyone must provide parentheses to be sure what's going on.

Obviously, an expression like !b instanceof SomeType (read: "negate b, then check whether the resulting value is of type SomeType") doesn't make much sense in Java:

Logically, b would have to be some kind of boolean object (so that ! works), and even if you negated its value, it would still be a boolean value of the same type as before, so why bother negating it in the first place?

(Actually, you can't even do it: b can't be a primitive boolean, because instanceof requires a real Object on its left side, but then again, if b is a Boolean, !b still evaluates to a primitive boolean, so instanceof doesn't work.)
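A short sketch of that restriction (the names are invented; the rejected expression is left in a comment because it does not compile):

```java
public class NegationDemo {
    public static void main(String[] args) {
        Boolean b = Boolean.FALSE;

        // Applying ! auto-unboxes b to a primitive boolean:
        boolean negated = !b;
        System.out.println(negated); // true

        // b itself is a reference, so this compiles (and is true for non-null b):
        System.out.println(b instanceof Boolean); // true

        // But "(!b) instanceof Boolean" would not compile:
        // !b is a primitive boolean, and instanceof requires
        // a reference type on its left-hand side.
    }
}
```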

We can therefore say that !b instanceof SomeType has no semantic meaning in Java at all. So we could reassign its meaning to "check if b is not of type SomeType" - can't we?

Given that the semantics could have been changed this way but weren't, I conclude that this was not really intentional; rather, there was a more pragmatic reason to go with the lower precedence for instanceof:

Off the top of my head, I would suspect that parsing gets complicated if you give instanceof higher precedence than the unary operator !. You might want to check that.

On the other hand, if !b instanceof SomeType meant "check whether b is not of type SomeType", it could still trick novice programmers into thinking that ! operates on b when in fact it negates the result of instanceof, so it's less ambiguous to leave !b instanceof SomeType essentially undefined.

instanceof is a binary operator. ! is a unary operator.

It would be very confusing for instanceof to bind more tightly than !.
A classic example of the confusion is ** and - in Python, where we have:

    -1 ** 2 == -(1 ** 2)  # True

I don't know about you, but this just looks ridiculous to me in Python, so I'm glad they're not doing the same thing in Java.

Another Python example is:

    False is (not None)   # False
    False is not None     # True

which I think is equally confusing, this time because is and is not are different operators.

Because the C programming language got it wrong, and Java blindly followed C.

In C, ! and ~ have the same precedence. In practice it does not actually matter much in C, because one writes a < b rather than !(a >= b).

But there is no notinstanceof operator.

You might also ask why /* /* */ */ do not nest properly. Or why Java counts from 0. Or why there needs to be a void keyword. Or why Java uses the horrid {{}{}} notation instead of endif (or fi). It's all C legacy.

And perhaps for good reason. C programmers will argue that all these things are the correct approach, because it is what they are used to. And Java's first job was to become noticed and used, unlike many other little long forgotten programming languages.

Be grateful that Java does not have null terminated strings.

