Someone recently asked me about devirtualization optimizations: when do they happen?
When can we rely on devirtualization? Do different compilers devirtualize
differently? As usual, this led me down an experimental rabbit-hole. The answer
seems to be: Modern compilers devirtualize calls to final methods pretty reliably.
But there are many interesting corner cases — including some I haven’t thought of,
I’m sure! — and different compilers do catch different subsets of those corner cases.
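To make the headline case concrete, here's a minimal sketch of the kind of call modern compilers handle well. The names `Base`, `Derived`, and `use` are mine, purely for illustration: because `Derived::value` is marked `final`, the compiler can prove no further override exists and replace the virtual dispatch with a direct, inlinable call.

```cpp
#include <cstdio>

struct Base {
    virtual int value() const { return 1; }
};

struct Derived : Base {
    // `final` promises no class derived from Derived will override value().
    int value() const final { return 42; }
};

// d.value() looks like a virtual call, but since Derived::value is final,
// the dynamic type's override is known statically. The compiler can call
// Derived::value directly, and typically inlines it down to the constant 42.
int use(const Derived& d) {
    return d.value();
}

int main() {
    Derived d;
    std::printf("%d\n", use(d));
}
```

At `-O2`, both GCC and Clang typically compile `use` down to returning the constant, with no vtable load at all; it's easy to confirm on Compiler Explorer by looking for the missing indirect call.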